    UK MoJ crime prediction algorithms raise serious concerns

    By TechAiVerse | April 26, 2025

    Data-based profiling tools are being used by the UK Ministry of Justice (MoJ) to algorithmically “predict” people’s risk of committing criminal offences, but pressure group Statewatch says the use of historically biased data will further entrench structural discrimination.

    Documents obtained by Statewatch via a Freedom of Information (FoI) campaign reveal the MoJ is already using one flawed algorithm to “predict” people’s risk of reoffending, and is actively developing another system to “predict” who will commit murder.

    While authorities deploying predictive policing tools say they can be used to more efficiently direct resources, critics argue that, in practice, they are used to repeatedly target poor and racialised communities, as these groups have historically been “over-policed” and are therefore over-represented in police datasets.

    This then creates a negative feedback loop, where these “so-called predictions” lead to further over-policing of certain groups and areas, thereby reinforcing and exacerbating the pre-existing discrimination as increasing amounts of data are collected.
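    The dynamic critics describe can be sketched as a toy simulation. The Python snippet below is purely illustrative and has no connection to any MoJ or police system: all figures are invented, and it simply assumes two areas with identical underlying offending, allocates patrols in proportion to past recorded incidents, and only records offences where patrols are present.

```python
# Purely illustrative toy model of the feedback loop described above.
# It does not reproduce any real MoJ or police system; all numbers are invented.
import random

random.seed(0)

true_offence_rate = {"area_a": 0.05, "area_b": 0.05}  # identical underlying behaviour
recorded = {"area_a": 120, "area_b": 60}              # area_a starts out "over-policed"
total_patrols = 2_000

for year in range(10):
    total_records = sum(recorded.values())
    for area, rate in true_offence_rate.items():
        # Patrols are allocated in proportion to past recorded incidents...
        patrols = int(total_patrols * recorded[area] / total_records)
        # ...and only offences that occur where police are looking get recorded.
        recorded[area] += sum(random.random() < rate for _ in range(patrols))

# Both areas behave identically, yet area_a's recorded "risk" stays roughly twice
# area_b's, and the absolute gap between them keeps growing year on year.
print(recorded)
```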

    Tracing the historical proliferation of predictive policing systems in their 2018 book Police: A Field Guide, authors David Correia and Tyler Wall argue that such tools provide “seemingly objective data” for law enforcement authorities to continue engaging in discriminatory policing practices, “but in a manner that appears free from racial profiling”.

    They added that it therefore “shouldn’t be a surprise that predictive policing locates the violence of the future in the poor of the present”.

    Computer Weekly contacted the MoJ about how it is dealing with the propensity of predictive policing systems to further entrench structural discrimination, but received no response on this point.

    MoJ systems

    Known as the Offender Assessment System (OASys), the first crime prediction tool was initially developed by the Home Office over three pilot studies before being rolled out across the prison and probation system of England and Wales between 2001 and 2005.

    According to His Majesty’s Prison and Probation Service (HMPPS), OASys “identifies and classifies offending-related needs” and assesses “the risk of harm offenders pose to themselves and others”, using machine learning techniques so the system “learns” from the data inputs to adapt the way it functions.

    The risk scores generated by the algorithms are then used to make a wide range of decisions that can severely affect people’s lives. This includes decisions about their bail and sentencing, the type of prison they’ll be sent to, and whether they’ll be able to access education or rehabilitation programmes while incarcerated.

    The documents obtained by Statewatch show the OASys tool is being used to profile thousands of prisoners in England and Wales every week. In just one week, between 6 and 12 January 2025, for example, the tool was used to complete a total of 9,420 reoffending risk assessments – a rate of more than 1,300 per day.

    As of January 2025, the system’s database holds over seven million risk scores setting out people’s alleged risk of reoffending, including both completed assessments and those in progress.

    Commenting on OASys, Sobanan Narenthiran – a former prisoner and now co-CEO of Breakthrough Social Enterprise, an organisation that “supports people at risk or with experience of the criminal justice system to enter the world of technology” – told Statewatch that “structural racism and other forms of systemic bias may be coded into OASys risk scores – both directly and indirectly”.

    He further argued that information entered in OASys is likely to be “heavily influenced by systemic issues like biased policing and over-surveillance of certain communities”, noting, for example, that: “Black and other racialised individuals may be more frequently stopped, searched, arrested and charged due to structural inequalities in law enforcement. 

    “As a result, they may appear ‘higher risk’ in the system, not because of any greater actual risk, but because the data reflects these inequalities. This is a classic case of ‘garbage in, garbage out’.”

    Computer Weekly contacted the MoJ about how the department is ensuring accuracy in its decision-making, given the sheer volume of algorithmic assessments it is making every day, but received no direct response on this point.

    A spokesperson said that practitioners verify information and follow detailed scoring guidance for consistency.

    The second crime prediction tool is still in development, but the intention is to algorithmically identify those most at risk of committing murder by pulling together a wide variety of data about them from different sources, such as the probation service and the specific police forces involved in the project.

    Statewatch says the types of information processed could include names, dates of birth, gender and ethnicity, and a number that identifies people on the Police National Computer (PNC).

    Originally called the “homicide prediction project”, the initiative has since been renamed to “sharing data to improve risk assessment”, and could be used to profile convicted and non-convicted people alike.

    According to a data sharing agreement between the MoJ and Greater Manchester Police (GMP) obtained by Statewatch, for example, the types of data being shared can include the age at which a person first had contact with the police, and the age at which they were first the victim of a crime, including domestic violence.

    Listed under “special categories of personal data”, the agreement also envisages the sharing of “health markers which are expected to have significant predictive power”.

    This can include data related to mental health, addiction, suicide, vulnerability, self-harm and disability. Statewatch highlighted how data from people not convicted of any criminal offence will be used as part of the project.

    In both cases, Statewatch says using data from “institutionally racist” organisations like police forces and the MoJ will only work to “reinforce and magnify” the structural discrimination that underpins the UK’s criminal justice system.

    “The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Statewatch researcher Sofia Lyall.

    “Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming.”

    Lyall added: “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed.”

    Statewatch also noted that Black people in particular are significantly over-represented in the data held by the MoJ, as are people of all ethnicities from more deprived areas.

    Challenging inaccuracies

    According to a 2015 official evaluation of the risk scores produced by OASys, the system’s accuracy varies by gender, age and ethnicity, with the risk scores generated being less accurate for racialised people than for white people, and especially so for Black and mixed-race people.

    “Relative predictive validity was greater for female than male offenders, for White offenders than offenders of Asian, Black and Mixed ethnicity, and for older than younger offenders,” it said. “After controlling for differences in risk profiles, lower validity for all Black, Asian and Minority Ethnic (BME) groups (non-violent reoffending) and Black and Mixed ethnicity offenders (violent reoffending) was the greatest concern.”
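    Differential predictive validity of this kind is typically audited by computing a discrimination metric, such as the area under the ROC curve (AUC), separately for each demographic group and comparing the results. The sketch below shows that general approach only; the column names, file and data are hypothetical, and it is not the methodology or data of the 2015 evaluation.

```python
# Generic subgroup validity check; column names and file are hypothetical,
# not the 2015 evaluation's actual data or methodology.
import pandas as pd
from sklearn.metrics import roc_auc_score

# Expected columns: risk_score (tool output), reoffended (observed outcome, 0 or 1),
# ethnic_group (category used for the audit).
df = pd.read_csv("assessments.csv")  # placeholder file name

for group, subset in df.groupby("ethnic_group"):
    # AUC measures how well the score separates people who went on to reoffend
    # from those who did not, within this group alone; lower values mean the
    # scores discriminate outcomes less well for that group.
    auc = roc_auc_score(subset["reoffended"], subset["risk_score"])
    print(f"{group}: AUC = {auc:.3f} (n = {len(subset)})")
```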

    A number of prisoners affected by the OASys algorithm have also told Statewatch about the impacts of biased or inaccurate data. Several minoritised ethnic prisoners, for example, said their assessors entered a discriminatory and false “gangs” label in their OASys reports without evidence, a decision they say was based on racist assumptions.

    Speaking with a researcher from the University of Birmingham about the impact of inaccurate data in OASys, another man serving a life sentence likened it to “a small snowball running downhill”.

    The prisoner said: “Each turn it picks up more and more snow (inaccurate entries) until eventually you are left with this massive snowball which bears no semblance to the original small ball of snow. In other words, I no longer exist. I have become a construct of their imagination. It is the ultimate act of dehumanisation.”

    Narenthiran also described how, despite known issues with the system’s accuracy, it is difficult to challenge any incorrect data contained in OASys reports: “To do this, I needed to modify information recorded in an OASys assessment, and it’s a frustrating and often opaque process.

    “In many cases, individuals are either unaware of what’s been written about them or are not given meaningful opportunities to review and respond to the assessment before it’s finalised. Even when concerns are raised, they’re frequently dismissed or ignored unless there is strong legal advocacy involved.”

    MoJ responds

    While the murder prediction tool is still in development, Computer Weekly contacted the MoJ for further information about both systems – including what means of redress the department envisages people being able to use to challenge decisions made about them when, for example, information is inaccurate.

    A spokesperson for the department said that continuous improvement, research and validation ensure the integrity and quality of these tools, and that ethical implications such as fairness and potential data bias are considered whenever new tools or research projects are developed.

    They added that neither the murder prediction tool nor OASys uses ethnicity as a direct predictor, and that if individuals are not satisfied with the outcome of a formal complaint to HMPPS, they can write to the Prison and Probation Ombudsman.

    Regarding OASys, they added that the system is made up of five risk predictor tools, which are revalidated to ensure they effectively predict reoffending risk.

    Commenting on the murder prediction tool specifically, the MoJ said: “This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course.”

    It added the project aims to improve risk assessment of serious crime and keep the public safe through better analysis of existing crime and risk assessment data, and that while a specific predictive tool will not be developed for operational use, the findings of the project may inform future work on other tools.

    The MoJ also insisted that only data about people with at least one criminal conviction has been used so far.

    New digital tools

    Despite serious concerns around the system, the MoJ continues to use OASys assessments across the prison and probation services. In response to Statewatch’s FoI campaign, the MoJ confirmed that “the HMPPS Assess Risks, Needs and Strengths (ARNS) project is developing a new digital tool to replace the OASys tool”.

    An early prototype of the new system has been in the pilot phase since December 2024, “with a view to a national roll-out in 2026”. ARNS is “being built in-house by a team from [Ministry of] Justice Digital who are liaising with Capita, who currently provide technical support for OASys”.

    The government has also launched an “independent sentencing review” looking at how to “harness new technology to manage offenders outside prison”, including the use of “predictive” and profiling risk assessment tools, as well as electronic tagging.

    Statewatch has also called for a halt to the development of the crime prediction tool.

    “Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist ‘quick fixes’ will only further undermine people’s safety and well-being,” said Lyall.
