And It’s Biased Against Blacks”, written by Julia Angwin et al., a story is presented about a young African-American woman and a middle-aged white man, both convicted of theft. The 18-year-old African American took possession of a scooter that she saw on the side of the road, but decided to return it upon realizing it was too small for her; before she could do so without anyone noticing, however, a neighbor called the police on her. As for the middle-aged white man, Angwin et al. describe him as someone with a history of theft who had most recently stolen eighty dollars’ worth of merchandise from Home Depot, for which he was once again apprehended. Upon both individuals’ arrests, a computer algorithm was run on them that took into account many characteristics of their backgrounds and the severity of their crimes. This algorithm, as noted in the article, was meant to “predict the likelihood of each committing a future crime” (Angwin et al.). After the computer’s calculations were made, it determined that the 18-year-old African-American woman, previously innocent, was “more likely” to commit future crimes than the middle-aged white man with a history of theft. Two years later, the algorithm’s judgment proved incorrect: the white man had once again committed theft, while the young African-American woman had not been charged with any new offenses (Angwin et al.). How is it possible that a computer algorithm with the sole purpose of predicting future crimes made such a severe misjudgment of character? Ideas surrounding race seem to be the most practical answer to this