International Journal on Criminology Volume 4, Number 2, Winter 2016 | Page 80
Telling Tales with Inspector PredPol
There’s Nothing Less Neutral than an Algorithm…
Computers and computer science are not neutral. Nor are the tools that collect and
analyze data. Far from being the gold standard, algorithms reflect the biases of the
people who devise them: fallible human beings, in other words, who may infuse their
work with wishful thinking rather than pure science.
These algorithms, these tools for assembling data on the world around us that
the gullible envelop in quasi-theological adoration, may have been rigged by their
designers (for their own benefit), or by mischief-making or hired hackers. Does this
give us a glimpse of what happens to predictive policing when it is “worked over”
by hacktivists lying in wait on the dark web? Will police patrols be sent to locations
where nothing is happening while, on the other side of the same neighborhood,
burglars go about their business undisturbed? It would be cruel to belabor the
point.
More broadly, it is difficult to verify whether algorithms really fulfill their
mission: not just because they are capable of influencing and pre-formatting reality but
also due to their sheer volume and power. If the algorithms are used on a sufficiently
large scale, they generate their own validity and exert a “flocking” effect on the
material facts. This is something the media should know all about, since there have
been numerous recent cases of rip-off algorithms:
- In the 1970s, the Black–Scholes model was said to be able to calculate the fair
price of options on shares. But in 1998, in spite of the apparently awesome
algorithms, the Long-Term Capital Management hedge fund collapsed, leaving the
global credit market staring into the abyss.
- In 2001, a rigged model—based, as you will have guessed, on the most fascinating
algorithms—enabled Enron to assign an astronomical value to vanishing
assets. The company then buckled and its directors were put away for a
lengthy period.
- During the subprime crisis, it was discovered that rating agencies were
“adapting” their software (based—no surprises here—on esoteric algorithms)
to the desired outcome.
- After the aforementioned economic crisis, JP Morgan was obliged to
“apologize” for using “unsuitable” software, based on advanced algorithms,
that resulted in the banking giant losing $6 billion.
- In 2008, three major hedge funds suffered huge losses due to “unpredictable
market movements”—movements that the magic algorithms were supposed to predict.
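For readers curious about what the celebrated model in the first of these cases actually computes, a minimal sketch of the Black–Scholes formula for pricing a European call option fits in a few lines of Python. The function name and the sample figures below are illustrative, not drawn from the article; the point is that the model’s elegance rests on assumptions (constant volatility, “well-behaved” markets) of exactly the kind that failed in 1998.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S: float, K: float, r: float,
                       sigma: float, T: float) -> float:
    """Black–Scholes price of a European call option.

    S: current share price, K: strike price, r: risk-free rate,
    sigma: annualized volatility (assumed constant!), T: years to expiry.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative textbook case: at-the-money call, 5% rate,
# 20% volatility, one year to expiry -> roughly 10.45.
price = black_scholes_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
```

The formula is deterministic and internally flawless; the danger the article describes lies entirely in feeding it a `sigma` that the real market refuses to honor.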
Algorithms can also lead to downright juicy scams. Take the 2005 case of the Texan
businessman calling himself a “former military intelligence officer” and university
professor. This individual claimed to have invented an algorithm that could make
a fortune on the foreign currency markets. He swindled $33 million from his
unsuspecting clients before being put away for 20 years.