Much has been written by us at
The Big Disrupt about why Big Data and the Internet of Things will enable the
largest and most comprehensive data grab in the history of the species, which is
terrifying, but the development that scares us the most is that it will make
humans easier to predict. That's great for governments and corporations, but
not so much for Joe Public, who is already tracked, watched, and monitored at
every turn.
We could write a whole book on
how problematic predictive analytics can be, particularly when used to combat
crime (some concerns we'll cover later), but, as any good Kantian will tell
you, it's always wise to criticize things according to their limits. Predictive
analytics can tell us much about crime, from where it occurs most often to who is
most likely to fall victim to it, but to a certain degree police
departments everywhere and the public at large already know who is most likely to fall
victim to crime: the poor and vulnerable. Predictive analytics may help police
departments protect the people most at risk, but what it can't
address, or more to the point, what it's not designed to address, is why such
people are more likely to fall victim to crime in the first place.
Predictive analytics, like many
modern technologies, cannot address social problems directly; at best it can address
inefficiencies in the processes that tackle those problems. For example, predictive
analytics may tell police departments where a crime is likely to take place
and allow them to send units to potentially stop it, but this scenario is
likely to reveal department and officer biases, as units will most likely
flood poorer areas that already have a less than cordial relationship with
police departments and their officers.
This runs into another problem
with predictive analytics: what is actually being analyzed. Predictive analytics
is good at parsing large datasets but not at identifying the
attitudes and tactics a department uses toward certain neighborhoods. Because of
this, what predictive analytics is most likely to reveal is not only where and
when crime is likely to be committed but also the biases of departments and their
officers, the not so flattering socio-economic and historical makeup of a city,
and the inability of predictive analytics to interpret the effect both factors
have on the data it analyzes.
Another limitation of
using predictive analytics to fight crime is that it won't make
departments or officers any better at dealing with the public, especially innocent
members of the public who live in areas the software deems "hotspots".
This is a very important point: as the last few weeks and months have
shown, arming cops with data that will most likely buttress their already deep-set
biases toward certain groups and areas can and will have deadly
consequences.
In sum, the all too social dynamics
of crime are more complicated than the useful but limited answers predictive
analytics can provide, which is concerning, as predictive analytics is in some
respects already informing policing decisions that clearly neglect
complexities police officers are neither empowered nor equipped to deal with and that the technology itself cannot even recognize.
We've always subscribed to the
view that the most interesting thing about any technology is why it is being
implemented at a particular time and the implications that come with it, and predictive
analytics is no different. While companies like IBM and Microsoft (both invested
in predictive analytics) would point to the great returns in crime prevention
and other efficiencies, the truth is that while predictive analytics requires a
serious investment in software, hardware, hiring, and training, it can and
most likely will lead to a drastic cull of cops on the beat. Police
work will for the most part become technocratic but at the same time simpler, as
cops will likely follow crime hotspot maps that highlight where certain crimes
take place and simply wait for something to happen.
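At its simplest, the kind of hotspot map described above just ranks areas by historical incident counts. The sketch below is purely illustrative, with invented area names and data; real systems such as IBM SPSS or PredPol use far richer spatio-temporal models than a raw tally.

```python
from collections import Counter

# Invented historical incident data: one entry per recorded incident,
# tagged with the (hypothetical) area where it occurred.
past_incidents = [
    "Eastside", "Eastside", "Eastside", "Eastside",
    "Downtown", "Downtown", "Downtown",
    "Riverside",
]

def top_hotspots(incidents, n=2):
    """Flag the n areas with the most historical incidents as 'hotspots'."""
    return [area for area, _ in Counter(incidents).most_common(n)]

print(top_hotspots(past_incidents))  # ['Eastside', 'Downtown']
```

Note that a map like this can only ever reflect where incidents were recorded, which is exactly why it risks echoing existing patrol and reporting patterns rather than crime itself.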
This might not seem like much
of a problem, but imagine the sight of cops hanging around where you
live, waiting for a crime to take place, just because their predictive analytics
software deemed your area a "hotspot" for a certain type of crime or for crime in
general. While we've already cited the potential of predictive analytics
to serve as confirmation of departmental biases, we must also consider that decisions
by cops in the field can and will be influenced by it. For
example, cops are more likely to prepare for hostility in areas where, according
to the data, crime takes place more often than in areas that rarely see it.
This may seem like an obvious
observation, but given that cops have the power of arrest and are armed, information
that confirms or even creates new biases among officers before they even engage with members of the public can prove problematic. Cops in certain areas
are likely to ask otherwise innocent members of the public for information or
conduct stop and search procedures due to the perceived prolific nature of
crime in those areas and residents' mere proximity to it.
The scenario above may already
sound familiar to anybody who lives in an area with a reputation, and that's
because, to a certain extent, crime fighting is already a data-driven
enterprise. A key reason why predictive analytics is being used by law
enforcement in the first place is the vast amount of data departments collect
and need to interpret in order to combat crime. Another key reason why there
has been an almost universal embrace of predictive analytics across
police organizations in the US and elsewhere is that
departments are facing budget constraints and are thus forced to operate under
what has to be the most depressing creed of the modern age: doing more with
less.
Even in the widely cited
success stories of predictive analytics bringing crime down, the reasons why
local departments invested in it reveal an awful lot more
than its impressive results. The City of Lancaster in California, forced to "do
more with less" in light of sharp budget cuts and having to "deploy resources more
efficiently", made an investment in predictive analytics systems that helped yield
an excellent 35% reduction in "Part 1" crimes in 2010 and 40% in 2011[1].
These numbers are impressive and have served as an effective sales script for
IBM (the numbers above were cited from an IBM case study by Nucleus
Research), trumpeting the effectiveness of predictive analytics to police organizations
across the US and overseas.
James Slessor, managing
director of Accenture Police Services, made pretty much the same points
as IBM when he cited another California success story,
this time in Santa Cruz, where law enforcement "applied predictive
analytics to burglary data in order to identify the streets at greatest risk –
it resulted in a 19 per cent drop in property theft without the need for additional
officers"[2].
In both instances, Accenture
and IBM are in effect selling predictive analytics systems not only as an
effective crime-fighting tool, which they may well prove to be, but as an efficiency
measure to deal with cuts to budgets and resources. That is not a bad thing in
itself, but it is hardly the most noble motivation driving a revolution in how police
work is done in the 21st century.
So far we've largely focused on
predictive analytics being used by departments to predict where and when
certain crimes happen, but, in truth, the most concerning thing is not so much
how the technology is used to fight crime in cities but how it can be, and is
being, used against people. We've already mentioned that we are the most
watched, tracked, and monitored generation in the history of the species, and predictive
analytics will ensure we will also be the easiest to predict. Police organizations
are already using predictive analytics against criminals: Computerworld
reported back in October that the Metropolitan Police Service ran a project
with Accenture that "merged data from the Met's various crime reporting and intelligence
systems and applied predictive analytics, generating risk scores on the
likelihood of known individuals committing violent crimes"[3].
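In its most basic form, a per-individual "risk score" of the kind the Met pilot reportedly produced is a weighted combination of features pulled from merged records. The sketch below is a hypothetical illustration only: the field names, weights, and people are invented, and real systems derive their weights from trained statistical models rather than hand-set values.

```python
# Invented feature weights; a real system would learn these from data.
WEIGHTS = {
    "prior_violent_offences": 3.0,
    "gang_affiliation": 2.0,
    "recent_arrests": 1.0,
}

def risk_score(record):
    """Weighted sum over whichever features appear in a person's record."""
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)

# Hypothetical merged records for "known individuals"
known_individuals = {
    "person_a": {"prior_violent_offences": 2, "gang_affiliation": 1, "recent_arrests": 3},
    "person_b": {"recent_arrests": 1},
}

scores = {name: risk_score(rec) for name, rec in known_individuals.items()}
print(scores)  # {'person_a': 11.0, 'person_b': 1.0}
```

Even this toy version makes the worry concrete: every input is itself a product of past policing decisions, so the score inherits whatever biases shaped those records.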
While you might not lose sleep
over police units using predictive analytics against "known" criminals, how
easily would you sleep if a police commander came to your front door and warned
you that they've got their eye on you because, as The Verge's Matt Stroud reported,
your name cropped up on "an index of the roughly 400 people in the city of
Chicago supposedly most likely to be involved in violent crime", predictably termed
a "heat list"[4]?
I don't know your tolerance
for invasions of your privacy, but it surely sent shivers up Stroud's
spine: he gave his article a provocative title that speaks loudly to many of
the points made quietly in this piece.
In sum, predictive analytics can
and will play a major role in how crime is fought in cities in the 21st
century and beyond, but given concerns about its potential to confirm or create
biases, its compromising of individuals' right to privacy, and the motivations and
interests driving the push toward analytics, the need for pause must be met
with a sober debate about what predictive analytics means for the public as
well as law enforcement.
[1] Nucleus Research, 2012, ROI Case Study: IBM SPSS – City of Lancaster, http://public.dhe.ibm.com/common/ssi/ecm/yt/en/ytl03131usen/YTL03131USEN.PDF
[2] Accenture, 2012, Smarter Policing, http://www.accenture.com/SiteCollectionDocuments/PDF/Accenture-Smarter-Policing.pdf
[3] C. Jee, 2014, Met Police pilots analytics tool to fight gang crime, http://www.computerworlduk.com/news/public-sector/3582701/met-police-pilots-analytics-tool-to-fight-gang-crime/
[4] M. Stroud, 2014, The Minority Report: Chicago's New Police Computer Predicts Crime, But Is It Racist?, http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist