Artificial intelligence could identify gang crimes -- and ignite an ethical firestorm
Retweeted by WhatFreshHellIsThisHat, from @Popehat (https://twitter.com/Popehat):
This is an unconscionably horrible answer to a crucial question about algorithm design for social systems. When you design algos with a high impact on lives, you have to take the consequences seriously.
Researchers build AI to identify gang members. When asked about potential misuses, presenter (a computer scientist at Harvard) says "I'm just an engineer." 🤦🏿‍♂️🤦🏿‍♂️🤦🏿‍♂️ http://www.sciencemag.org/news/2018/02/artificial-intelligence-could-identify-gang-crimes-and-ignite-ethical-firestorm
Artificial intelligence could identify gang crimes -- and ignite an ethical firestorm
By Matthew Hutson | Feb. 28, 2018, 8:00 AM
When someone roughs up a pedestrian, robs a store, or kills in cold blood, police want to know whether the perpetrator was a gang member: Do they need to send in a special enforcement team? Should they expect a crime in retaliation? Now, a new algorithm is trying to automate the process of identifying gang crimes. But some scientists warn that far from reducing gang violence, the program could do the opposite by eroding trust in communities, or it could brand innocent people as gang members.
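The story does not spell out how the algorithm works, but the task it describes is a standard supervised classification problem. The sketch below is purely illustrative: the features (weapon, number of suspects, neighborhood), the toy records, and the scikit-learn model are all assumptions for the sake of the example, not details of the researchers' system.

    # Purely illustrative sketch, not the researchers' system: a generic
    # classifier trained on coarse, invented incident features.
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction import DictVectorizer

    # Hypothetical incident records; 1 = labeled gang-related in police data.
    incidents = [
        {"weapon": "handgun", "n_suspects": 3, "neighborhood": "A"},
        {"weapon": "none",    "n_suspects": 1, "neighborhood": "B"},
        {"weapon": "knife",   "n_suspects": 2, "neighborhood": "A"},
        {"weapon": "handgun", "n_suspects": 4, "neighborhood": "C"},
    ]
    labels = [1, 0, 0, 1]

    # One-hot encode the categorical fields, then fit an off-the-shelf model.
    vec = DictVectorizer(sparse=False)
    X = vec.fit_transform(incidents)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

    # Estimated probability that a new incident is gang-related.
    new_incident = {"weapon": "handgun", "n_suspects": 2, "neighborhood": "B"}
    print(clf.predict_proba(vec.transform([new_incident]))[0, 1])

Whatever the real architecture, the sketch shares its key property: the model can only echo whatever patterns, and whatever biases, are baked into its training labels.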
That has created some tensions. At a presentation of the new program this month, one audience member grew so upset he stormed out of the talk, and some of the creators of the program have been tight-lipped about how it could be used. ... "This is almost certainly a well-intended piece of work," says Google software engineer Blake Lemoine, who is based in Mountain View, California, and has studied ways of reducing bias in artificial intelligence. "But have the researchers considered the possible unintended side effects?"
...
But researchers attending the AIES talk raised concerns during the Q&A afterward. How could the team be sure the training data were not biased to begin with? What happens when someone is mislabeled as a gang member? Lemoine asked rhetorically whether the researchers were also developing algorithms that would help heavily patrolled communities predict police raids. ... Hau Chan, a computer scientist now at Harvard University who was presenting the work, responded that he couldn't be sure how the new tool would be used. "I'm just an engineer," he said. Lemoine quoted a lyric from a song about the wartime rocket scientist Wernher von Braun, in a heavy German accent: "Once the rockets are up, who cares where they come down?" Then he angrily walked out.
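The audience's two questions are, concretely, questions about error rates. One minimal check, sketched below with invented numbers (this is not the paper's evaluation), is to compare the classifier's false-positive rate across neighborhoods on held-out incidents: if non-gang incidents in heavily policed areas are flagged far more often, the tool is systematically mislabeling exactly the people the audience was worried about.

    # Hypothetical audit with invented data, not the researchers' evaluation:
    # compare false-positive rates across neighborhoods.
    from collections import defaultdict

    # (neighborhood, true label, predicted label); 1 = gang-related.
    held_out = [
        ("A", 0, 1), ("A", 0, 0), ("A", 1, 1), ("A", 0, 1),
        ("B", 0, 0), ("B", 0, 0), ("B", 1, 1), ("B", 0, 0),
    ]

    flagged = defaultdict(int)    # non-gang incidents wrongly flagged
    negatives = defaultdict(int)  # all genuinely non-gang incidents

    for hood, truth, pred in held_out:
        if truth == 0:
            negatives[hood] += 1
            flagged[hood] += pred

    for hood in sorted(negatives):
        print(f"neighborhood {hood}: false-positive rate "
              f"{flagged[hood] / negatives[hood]:.0%}")

On these made-up numbers the rate is 67% in neighborhood A and 0% in neighborhood B; a gap like that would signal exactly the disparate mislabeling raised in the Q&A.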
Approached later for comment, Lemoine said he had talked to Chan to smooth things over. "I don't necessarily think that we shouldn't build tools for the police, or that we should," Lemoine said (commenting, he specified, as an individual, not as a Google representative). "I think that when you are building powerful things, you have some responsibility to at least consider how could this be used."