One of the goals of predictive policing was to end racial bias in police work. Essentially, a computer analyzes crime data, predicts where crime will happen and recommends where officers should go to prevent or respond to those crimes.
The idea is that police officers may racially profile a neighborhood or a group of people and patrol there disproportionately, overlooking crime in areas they hold no bias against. A computer supposedly can't be biased, so it wouldn't fall into the same trap.
The problem is that critics claim computerized predictive policing often just makes biases worse. If the computer itself holds no bias, how is that possible?
Biased data going in equals biased information coming out
The problem lies in the data. Yes, the computer program isn't biased and simply works with the information it is given, but where does that information come from? Often, it comes from reports by community members and police officers, and those individuals can certainly hold biases and prejudices, just like any human.
If they do, they may give the computer data that supports their views. They could focus only on specific neighborhoods with certain groups of people, for instance. This trains the computer to conclude that crime only happens in those areas, so it sends more officers there. Those extra patrols generate even more arrest records in the same neighborhoods, and that skewed data feeds back into the system. In the end, the bias still exists and may actually get worse as the computer keeps reinforcing itself with flawed data, as the simple sketch below illustrates.
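To make the feedback loop concrete, here is a purely illustrative sketch in Python. The neighborhood names, patrol counts and observation rates are all hypothetical and not drawn from any real predictive policing system; the point is only to show how skewed historical records can keep directing patrols to the same place even when the underlying crime rates are identical.

```python
# Hypothetical simulation of a predictive-policing feedback loop.
# All numbers and area names are invented for illustration only.

# Assume both areas have the same true amount of crime per period.
TRUE_CRIME_RATE = 10

# Historical records are skewed: Area A was patrolled more heavily,
# so more of its incidents were observed and logged.
recorded = {"Area A": 8, "Area B": 2}

for period in range(5):
    total = sum(recorded.values())

    # The "model" allocates 10 patrols in proportion to recorded incidents.
    patrols = {area: round(10 * count / total) for area, count in recorded.items()}

    # More patrols in an area means more of its (equal) crime gets observed
    # and added to the records, which shapes the next round of predictions.
    for area, n_patrols in patrols.items():
        observed = min(TRUE_CRIME_RATE, n_patrols * 2)
        recorded[area] += observed

    print(period, patrols, recorded)

# The output shows Area A receiving most of the patrols and accumulating
# far more records than Area B, even though both areas have identical
# underlying crime. The biased starting data perpetuates itself.
```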
Your rights after an arrest
In an ideal world, police work would always be fair and just. We know that doesn’t always happen. Make sure you know your rights. If you’ve been charged with any kind of crime, your best defense is a proactive approach to your case.