Predictive policing and ghetto avoiding

Logan Koepke has written an interesting article at Slate about the nature of predictive policing.  Predictive policing uses computer algorithms to assign police coverage to regions on the basis of anticipated risk of crime.

Basically, the idea is to predict where crimes are most likely to occur so that police can be present to prevent or intervene.  Thus, available police resources will be deployed to their greatest advantage in dealing with crime.

As Koepke points out, a number of issues arise with this approach.  For one thing, police data on past crimes are an imperfect record of criminal activity: many crimes go unreported, for example.  Also, past police coverage may have been biased towards certain locations, such as minority neighborhoods.  Thus, sending police to past "hotspots" risks simply reinforcing existing bias rather than suppressing crime as such.
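A toy model can make this feedback concern concrete.  In the sketch below (my own illustrative assumption, not Koepke's), two areas have identical true crime rates, but recorded crime is proportional to patrol presence, and each period's patrol allocation follows the previous period's records:

```python
# Toy model of a patrol-allocation feedback loop.
# Illustrative assumptions: two areas with identical true crime rates,
# but area 0 starts with 80% of patrols.  Police record only the crimes
# they are present to observe, and the next period's allocation follows
# recorded -- not true -- crime.
TRUE_RATE = [10.0, 10.0]   # actual crimes per period in each area
patrols = [0.8, 0.2]       # initial (biased) share of patrols

for period in range(10):
    # Recorded crime scales with patrol presence, not with true risk.
    recorded = [rate * share for rate, share in zip(TRUE_RATE, patrols)]
    total = sum(recorded)
    # Allocate the next period's patrols in proportion to the records.
    patrols = [r / total for r in recorded]

print(patrols)  # the initial 80/20 bias persists unchanged
```

Because the data the algorithm sees are filtered through where police already patrol, the skewed allocation confirms itself indefinitely, even though both areas are equally risky.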

This case reminds me of a similar application of police crime data: so-called ghetto-avoider apps.  These apps compare each user's current position with the locations of past assaults as recorded in publicly available police crime records.  When a user comes within a set distance of a past assault, the app triggers an alert.
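As a rough sketch of the mechanism (the coordinates and the 200-metre radius below are hypothetical assumptions, not taken from any actual app), the check amounts to a distance comparison against each recorded incident:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_alert(user_pos, incidents, radius_m=200):
    """Return True if any recorded incident lies within radius_m of the user."""
    lat, lon = user_pos
    return any(haversine_m(lat, lon, ilat, ilon) <= radius_m
               for ilat, ilon in incidents)

# Hypothetical example data: one recorded assault location.
incidents = [(51.5074, -0.1278)]
print(should_alert((51.5080, -0.1280), incidents))  # ~70 m away: True
print(should_alert((51.5300, -0.1278), incidents))  # ~2.5 km away: False
```

A real app would index incidents spatially rather than scanning the whole list, but the underlying question per incident is the same proximity test.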

These apps perform a similar function to predictive policing.  However, instead of directing police towards risky areas, they steer civilians away from them.

As such, many of the issues that apply to predictive policing also apply to ghetto-avoiders.  The occurrence of past assaults at a location is not always a good predictor of future risk there, a situation that unfairly stigmatizes the location as a "ghetto".  By the same token, the absence of past assaults in police records at a given location does not always mean that people are at low risk there, a situation that unfairly places users at unwanted and unanticipated risk.

I wrote an article entitled "Fairness and regulation of violence in technological design" on similar matters a while ago.

As we become more reliant on algorithms to allocate our resources, social issues arising from their design, such as fairness, become all the more significant.

"Police car lights" by Scott Davidson/Flickr.com
