In the famous meme, a sci-fi author laments that he wrote a novel, Don't Invent the Torment Nexus, only for a tech company to miss the point and proudly invent a real-life Torment Nexus. It seems the United Kingdom has decided to make that concept a reality, as the British government has been forced to reveal it is building a no-kidding, Minority Report-style pre-crime division.
As far as we know, this isn’t a bunch of bald psychics hooked up to computers, but is described as a “murder prediction” programme which uses people’s personal data to identify those most likely to kill someone soon. According to The Guardian, the government is using “algorithms to analyse the information of thousands of people, including victims of crime” to figure out who’s likely to snap.
The British Ministry of Justice explains that this programme seeks to “review offender characteristics that increase the risk of committing homicide” and also “explore alternative and innovative data science techniques to risk assessment of homicide”. Officials underline that this is “for research purposes only” and that, for the moment at least, they won’t be arresting anyone for a murder they’re yet to commit.
As you’d expect, profiling people for crimes that an algorithm has concluded they’re likely to commit in the future hasn’t gone down particularly well. Sofia Lyall, a researcher for pressure group Statewatch, uncovered this programme’s existence through freedom of information requests and described the “crime prediction” system as “chilling and dystopian”:
“Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming.”
Perhaps we should take the government at its word and accept that this is just an experiment to see whether the algorithm can really predict murder. Then again, it’s very easy to imagine lazy cops leaning on this tech and deciding that if the algorithm (doubtless boosted by AI) flags you as a potential troublemaker (let alone a murderer), they should have the right to impose restrictions on you before you can break the law.
Let’s say you disagree with the government on some fundamental issue. Is it too crazy to imagine future versions of systems like this churning through your social media profile, tagging you as someone likely to join a street protest, and prompting an order, made in advance of the event, specifically forbidding you to attend? Either way, Philip K. Dick must be spinning in his grave right now.
Published: Apr 9, 2025 03:41 am