Predictive policing sounds like a good idea. Police feed arrest data into a computer, and the computer then tells them where crimes are likely to happen. Police go to those areas and wait. Then they make arrests when crimes occur.
So, why is this controversial? The problem is that many people feel the system is inherently biased.
Of course, the computer itself cannot be biased. The problem is that human biases, such as those based on race or ethnicity, can be baked into the arrest data the system learns from and then amplified.
For instance, imagine that a police officer has a bias against people of a certain race. The officer goes to neighborhoods where these individuals live to look for crime, already believing that they are more likely to break the law.
As the officer makes arrests, the computer begins to “learn” that those neighborhoods are high-crime areas. It does not know about crimes happening elsewhere, because officers patrol those other areas less heavily and therefore make fewer arrests there. It only sees the greater number of arrests in the areas that biased officers are already targeting.
As a result, the computer then begins to focus on these areas, predicting that crime will happen there. It sends more officers to those areas, more arrests are made, and the cycle continues. All it takes is a slight police bias at the beginning to throw the whole system off course.
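The feedback loop described above can be sketched in a toy simulation. All of the numbers here are hypothetical: two areas have exactly the same true crime rate, but patrols start with a slight bias toward area A. Because arrests can only happen where officers are present, area A always produces more arrests, and a system that chases last period's arrests keeps shifting officers there.

```python
# Toy model of the predictive-policing feedback loop (hypothetical numbers).
# Two areas, A and B, with IDENTICAL true crime rates.
TRUE_CRIME_RATE = 0.10          # same underlying crime everywhere
patrol = [55, 45]               # officers per area; slight initial bias toward A

for step in range(10):
    # Arrests are only observed where officers patrol, so the data
    # reflects patrol allocation, not the true crime rate.
    arrests = [officers * TRUE_CRIME_RATE for officers in patrol]

    # The system reassigns one officer toward the higher-arrest area,
    # "predicting" that crime is concentrated there.
    if arrests[0] > arrests[1]:
        patrol = [patrol[0] + 1, patrol[1] - 1]
    elif arrests[1] > arrests[0]:
        patrol = [patrol[0] - 1, patrol[1] + 1]

print(patrol)  # the initial 55/45 bias has widened to 65/35
```

Even though both areas have identical crime, the small starting bias compounds every round. A real deployment involves noise, many areas and richer models, but the core dynamic (predictions trained on patrol-dependent arrest data reinforcing themselves) is the same.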
Do you feel like police unfairly targeted you, perhaps accusing you of crimes based on your race? If you think that your rights were violated, you need to know all of the legal options you have moving forward.