Monday, September 12, 2016

AIs, bots and algorithms - a threat to justice, truth and equality!

Apple, Google, Microsoft, MugMag (Fjäskbuck), even law enforcement systems and many more - all use algorithms to make decisions!
Broad generalisations, stereotypes and preconceptions - perpetuated by machines - are a threat to us all!

We must all be aware that algorithms are very blunt selection tools - even dumber than the flawed humans who wrote them!
In theory, I should be favoured by the AIs and algorithms, being perceived as everything society's stereotypes hold as the norm - a Caucasian, northern European, middle-aged white male - perceived as a cis man and economically stable - with a background in the political and judicial systems…

But we're all at risk!

No one is safe! Even a strongly biased system can flip over, and a "learning" algorithm can come to conclusions that are harmful!
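To see how a "learning" system ends up with harmful conclusions without anyone intending it, here's a minimal sketch with made-up numbers (the groups, decisions and counts are all hypothetical): a trivial "learner" that only counts past outcomes. If the historical records are skewed, the learned rule simply reproduces the skew.

```python
from collections import Counter

# Hypothetical, oversimplified example: historical decisions are skewed -
# group "A" was approved far more often than group "B".
history = [("A", "approve")] * 90 + [("A", "deny")] * 10 \
        + [("B", "approve")] * 30 + [("B", "deny")] * 70

def train(records):
    """Learn, per group, the most common past decision."""
    counts = {}
    for group, decision in records:
        counts.setdefault(group, Counter())[decision] += 1
    return {g: c.most_common(1)[0][0] for g, c in counts.items()}

model = train(history)
print(model)  # {'A': 'approve', 'B': 'deny'} - the old prejudice is now "policy"
```

The machine hasn't reasoned about anything - it has memorised a prejudice and will apply it, uniformly and tirelessly, to every new case.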

By using these flawed tools, the conditions of society, human rights, the legal system - and even the multinational corporations named at the top of this post - risk doing more harm than good!

 -  -  -  -  -  -  -

Here's an article from the Guardian!

A beauty contest was judged by AI and the robots didn't like dark skin

"When Microsoft released the “millennial” chatbot named Tay in March, it quickly began using racist language and promoting neo-Nazi views on Twitter. And after Facebook eliminated human editors who had curated “trending” news stories last month, the algorithm immediately promoted fake and vulgar stories on news feeds, including one article about a man masturbating with a chicken sandwich.

While the seemingly racist beauty pageant has prompted jokes and mockery, computer science experts and social justice advocates say that in other industries and arenas, the growing use of prejudiced AI systems is no laughing matter. In some cases, it can have devastating consequences for people of color."

"Civil liberty groups have recently raised concerns that computer-based law enforcement forecasting tools – which use data to predict where future crimes will occur – rely on flawed statistics and can exacerbate racially biased and harmful policing practices."
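The feedback loop the civil liberty groups warn about can be sketched in a few lines - all numbers here are invented for illustration. A forecasting tool sends patrols where past arrests were recorded; patrolling produces more recorded arrests there, which feed the next forecast, so the tool amplifies whatever skew was in the original data.

```python
# Hypothetical two-district simulation of a predictive-policing feedback loop.
districts = {"north": 55, "south": 45}   # made-up historical arrest counts

for year in range(5):
    # "forecast": patrol the district with the most recorded arrests
    target = max(districts, key=districts.get)
    # heavy patrolling means many arrests get recorded there (+20),
    # while crime in the other district goes largely unseen (+1)
    districts[target] += 20
    for d in districts:
        if d != target:
            districts[d] += 1
    print(year, districts)
```

After five rounds the "north" count has nearly tripled while "south" has barely moved - not because crime changed, but because the data chased its own tail.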

"A ProPublica investigation earlier this year found that software used to predict future criminals is biased against black people, which can lead to harsher sentencing.
“That’s truly a matter of somebody’s life is at stake,” said Sorelle Friedler, a professor of computer science at Haverford College."

"Prejudiced AI programs aren’t limited to the criminal justice system. One study determined that significantly fewer women than men were shown online ads for high-paying jobs. Last year, Google’s photo app was found to have labeled black people as gorillas."