It is really amazing the lengths people will go to in order to confirm their victimhood identities. And of course, the CBC will highlight how awesome it is to use AI to ‘make the internet a safer place for Indigenous people’.
Good lord. If the bad internet is hurting you…turn it off. But rather than make an adult decision, let’s do this:
“A new tool aims to use artificial intelligence to help make the internet a safer place for Indigenous people.
The project was given the name wâsikan kisewâtisiwin, which translates to “kind energy” in Cree.
“We’re trying to make the internet a kinder place. We’re trying to change the trajectory of the internet towards discriminated people,” Shani Gwin told CBC’s Radio Active.”
On the internet you are (with certain measures) essentially anonymous, so what you say is, at least in theory, taken at face value rather than judged by who you are.
“Being developed in collaboration with the Alberta Machine Intelligence Institute (AMii), the tool is dual purpose, intended to help both Indigenous people and non-Indigenous Canadians reduce racism, hate speech, and online bias.
The first function of the program is to moderate online spaces like comment sections. While the internet has been a tool used by Indigenous people for advocacy, it also can frequently be an unsafe space for communities that are discriminated against, Gwin said.
Gwin said that all it takes is one comment for online spaces to fester.”
If people want to pillow-up a spot on the internet, they are more than welcome to do so. Usually though, this sort of anti-free speech mechanism escapes from its hug-box confines and is loosed into the wider ecosystem.
“The tool flags hateful comments, and then provides sample responses, while also documenting these instances for future reporting.
The second function of the tool is designed to serve as a writing plug-in for your computer — similar to Grammarly. Intended to help general Canadians understand their bias, it will flag any writing that may be biased against Indigenous people, provide an explanation, and a suggestion for how to reword the sentence.”
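For what it's worth, the quoted description amounts to a very ordinary classify-flag-suggest pipeline. Here is a minimal sketch of that shape; nothing about the actual implementation of wâsikan kisewâtisiwin has been published, so every name, check, and data structure below is my own assumption, with a keyword stand-in where the real tool would presumably call a trained model.

```python
# Illustrative sketch only: the real tool's design is not public, so the
# "classifier" here is a trivial keyword placeholder and all names are assumed.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


def classify(text: str) -> Optional[str]:
    """Placeholder classifier. The real tool presumably uses a model trained
    with community input; a keyword check just keeps this sketch runnable."""
    if "go back to where" in text.lower():
        return "targeted hostility"
    return None


@dataclass
class FlaggedComment:
    text: str
    reason: str
    suggested_response: str
    logged_at: str


@dataclass
class ModerationLog:
    """Function 1: flag hateful comments, offer a sample response, and keep a
    record of each instance for future reporting."""
    entries: list = field(default_factory=list)

    def review(self, comment: str) -> Optional[FlaggedComment]:
        reason = classify(comment)  # stand-in for a real model call
        if reason is None:
            return None
        flagged = FlaggedComment(
            text=comment,
            reason=reason,
            suggested_response="This comment leans on a stereotype; here is some context...",
            logged_at=datetime.now(timezone.utc).isoformat(),
        )
        self.entries.append(flagged)  # documented for later reporting
        return flagged


def plugin_check(sentence: str) -> Optional[dict]:
    """Function 2: Grammarly-style plug-in that flags possibly biased wording,
    explains why, and proposes a reworded version."""
    reason = classify(sentence)
    if reason is None:
        return None
    return {
        "flag": reason,
        "explanation": "Why the phrasing may be harmful (model-generated in the real tool).",
        "suggestion": "A reworded version of the sentence.",
    }


if __name__ == "__main__":
    log = ModerationLog()
    print(log.review("Why don't they just go back to where they came from?"))
    print(plugin_check("A perfectly neutral sentence."))
```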
Wow! It is like having your own personal Big Brother making sure that you are engaged in ‘right-thinking’ at all times, plus offering real-time suggestions on how to neuter your speech so as not to risk offending others.
“AI right now is designed through the lens of Canada’s dominant culture. And I would say that across the world that without input from racialized communities, including Indigenous people, AI cannot analyze and produce culturally safe and respectful content,” Gwin said.
“Every piece of infrastructure in Canada has been developed from the white patriarchal lens,” she said. “So more racialized people, more women need to get involved in the development of AI so that it doesn’t continue to be built in a way that’s going to harm us again.”
Whoops! Did you catch the turn into Marxist Critical Theory? I certainly did – that damn AI, developed through the lens of ‘dominant culture’. Beginning with a conclusion and then looking for evidence to fit your assumptions almost always leads to bullshit results.
Just no. AI was developed by a diverse body of people from across the world; let’s not shoehorn your ‘critical perspective’ into this.
“AI bias revealed itself in training, Qroon said, adding that at times when experimenting with the AI, it would try to minimize the tragedies that Indigenous people went through.
“And that’s why it was very important for us to integrate the Indigenous community into this process and get their perspective and get the instructions from them.”
The AI deciding not to follow a trauma-informed narrative? Huh. Well, that will need to be fixed ASAP.
“Gwin said that her hope for the project is that it helps take the emotional labour of education off Indigenous people — and free them up to do things besides moderating comment sections.
“I think there might be concerns that people think that this AI tool will take jobs away from Indigenous people, but it’s not, that’s not what it’s for. It’s there to do the work that we don’t want to do.”
Yes, censorship is such an emotional labour. Much better to let a machine – an entity with even less capacity for nuance – take the reins.
“But it also means changing the internet and Canadians’ hearts and minds about who Indigenous people are.”
You mean changing minds in a positive way, right? Because this just looks like social and emotional manipulation in service of maintaining an oppressed/oppressor narrative that benefits no one in Canada.