- Algorithms depend on math, but they're data-driven: sometimes the information being fed into one is incorrect, or doesn't represent the algorithm's actual goals.
- Cathy O'Neil, the author of *Weapons of Math Destruction*, cautions us against trusting the data being fed into our judicial systems:
In the video, she explains how mistakes made by algorithms make existing problems even worse:
And what ProPublica found was that the COMPAS model, which is one version of a recidivism model, made mistakes (the kind that keep people in prison longer) twice as often for African-American defendants as for white defendants, at least in Broward County, Florida.
- The problem with using algorithms in police work is that there's no such thing as crime data, only arrest data.
- Statistically, a Black person is four times more likely than a white person to be arrested for the same crime.
- The data used to power these algorithms — the ones that are supposed to predict who will commit a crime next, and which suspects are going to become repeat offenders — is flawed to begin with.
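The mechanism described above can be made concrete with a minimal simulation sketch (all numbers are hypothetical, chosen only to mirror the fourfold arrest disparity cited above): if offenses occur at the same rate in two groups but arrests do not, a model scoring "risk" from recorded arrests will reproduce the enforcement disparity, not the underlying behavior.

```python
import random

random.seed(0)

# Hypothetical illustration: two groups with IDENTICAL offense rates,
# but group A is arrested 4x as often per offense.
OFFENSE_RATE = 0.10                    # assumed true offense rate, same for both
ARREST_PROB = {"A": 0.40, "B": 0.10}   # assumed chance an offense leads to arrest

def apparent_risk(group: str, n: int = 100_000) -> float:
    """Fraction of people with a recorded arrest: offenses happen equally,
    but only arrests ever enter the data set."""
    arrests = 0
    for _ in range(n):
        offended = random.random() < OFFENSE_RATE
        arrested = offended and random.random() < ARREST_PROB[group]
        if arrested:
            arrests += 1
    return arrests / n

risk_a = apparent_risk("A")
risk_b = apparent_risk("B")

# A naive "risk score" trained on arrest records sees group A as roughly
# 4x riskier, even though both groups offend at exactly the same rate.
print(f"apparent risk A: {risk_a:.3f}")
print(f"apparent risk B: {risk_b:.3f}")
print(f"ratio: {risk_a / risk_b:.1f}x")
```

The simulation makes no claim about real-world rates; it only shows that a model trained on arrest counts cannot distinguish "commits more crime" from "is policed more heavily."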
A Harvard mathematician explains why algorithms used for sentencing and policing are biased in the same way as traditional methods.
Continue reading “Biased data teaches algorithms how to discriminate”