    MPs propose ban on predictive policing

By TechAiVerse · June 27, 2025

    MPs are attempting to amend the UK government’s forthcoming Crime and Policing Bill so that it prohibits the use of controversial predictive policing systems

By Sebastian Klovig Skelton, Data & ethics editor

Published: 27 Jun 2025 14:12

    Predictive policing technologies infringe human rights “at their heart” and should be prohibited in the UK, argues Green MP Siân Berry, after tabling an amendment to the government’s forthcoming Crime and Policing Bill.

    Speaking in the House of Commons during the report stage of the bill, Berry highlighted the dangers of using predictive policing technologies to assess the likelihood of individuals or groups committing criminal offences in the future.

    “Such technologies, however cleverly sold, will always need to be built on existing, flawed police data, or data from other flawed and biased public and private sources,” she said. “That means that communities that have historically been over-policed will be more likely to be identified as being ‘at risk’ of future criminal behaviour.”

    Berry’s amendment (NC30 in the amendment paper) – which has been sponsored by eight other MPs, including Zarah Sultana, Ellie Chowns, Richard Burgon and Clive Lewis – would specifically prohibit the use of automated decision-making (ADM), profiling and artificial intelligence (AI) for the purpose of making risk assessments about the likelihood of groups or people committing criminal offences.

    It would also prohibit the use of certain information by UK police to “predict” people’s behaviour: “Police forces in England and Wales shall be prohibited from… Predicting the occurrence or reoccurrence of an actual or potential criminal offence based on profiling of a natural person or on assessing personality traits and characteristics, including the person’s location, or past criminal behaviour of natural persons or groups of natural persons.”

    Speaking in the Commons, Berry further argued: “As I have always said in the context of facial recognition, questions of accuracy and bias are not the only reason to be against these technologies. At their heart, they infringe human rights, including the right to privacy and the right to be presumed innocent.”

    While authorities deploying predictive policing tools say they can be used to more efficiently direct resources, critics have long argued that, in practice, these systems are used to repeatedly target poor and racialised communities, as these groups have historically been “over-policed” and are therefore over-represented in police datasets.

    This then creates a negative feedback loop, where these so-called “predictions” lead to further over-policing of certain groups and areas, thereby reinforcing and exacerbating the pre-existing discrimination as increasing amounts of data are collected.
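The feedback loop described above can be illustrated with a toy simulation. This is a deliberately simplified sketch with invented numbers, not a model of any real force's data: two areas have the identical underlying offence rate, but one starts over-represented in the historical records, and patrols are allocated in proportion to past records.

```python
import random

# Toy simulation of the feedback loop: two areas with the SAME true
# offence rate, but area 0 starts with more recorded incidents because
# it has historically been patrolled more heavily.
# All numbers are illustrative assumptions, not real policing data.

random.seed(42)

true_rate = 0.05        # identical underlying offence rate in both areas
recorded = [200, 100]   # area 0 is over-represented in the historical data

for year in range(10):
    total = sum(recorded)
    # "Predictive" allocation: each year's 1,000 patrols follow past
    # records, so the over-recorded area gets proportionally more.
    patrols = [round(1000 * r / total) for r in recorded]
    for area in range(2):
        # Each patrol detects an offence with the same probability in
        # both areas -- more patrols simply means more recorded offences.
        recorded[area] += sum(
            1 for _ in range(patrols[area]) if random.random() < true_rate
        )

share = recorded[0] / sum(recorded)
print(f"Area 0's share of recorded offences after 10 years: {share:.0%}")
```

Because detections scale with patrols, and patrols scale with past records, the initial imbalance never washes out even though both areas offend at the same rate: the over-recorded area keeps roughly two-thirds of all recorded offences indefinitely.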

    Tracing the historical proliferation of predictive policing systems in their 2018 book Police: A field guide, authors David Correia and Tyler Wall argue that such tools provide “seemingly objective data” for law enforcement authorities to continue engaging in discriminatory policing practices, “but in a manner that appears free from racial profiling”.

    They added it therefore “shouldn’t be a surprise that predictive policing locates the violence of the future in the poor of the present”.

    As a result of such concerns, there have been numerous calls in recent months from civil society for the UK government to ban the use of predictive policing tools.

    In February 2025, for example, Amnesty International published a 120-page report on how predictive policing systems are “supercharging racism” in the UK by using historically biased data to further target poor and racialised communities.

    It found that across the UK, at least 33 police forces have deployed predictive policing tools, with 32 of these using geographic crime prediction systems compared to 11 that are using people-focused crime prediction tools.

    Amnesty added these tools are “in flagrant breach” of the UK’s national and international human rights obligations because they are being used to racially profile people, undermine the presumption of innocence by targeting people before they’ve even been involved in a crime, and fuel indiscriminate mass surveillance of entire areas and communities.

    More than 30 civil society organisations – including Big Brother Watch, Amnesty, Open Rights Group, Inquest, Public Law Project and Statewatch – also signed an open letter in March 2025 raising concerns about how the Data Use and Access Bill, which is now an Act, will remove safeguards against the use of automated decision-making by police.

    “Currently, sections 49 and 50 of the Data Protection Act 2018 prohibit solely automated decisions from being made in the law enforcement context unless the decision is required or authorised by law,” they wrote in the letter, adding that the new Clause 80 would reverse this safeguard by permitting solely automated decision-making in all scenarios where special category data isn’t being used.

    “In practice, this means that automated decisions about people could be made in the law enforcement context on the basis of their socioeconomic status, regional or postcode data, inferred emotions, or even regional accents. This greatly expands the possibilities for bias, discrimination, and lack of transparency.”

The groups added that non-special category data can be used as a “proxy” for protected characteristics, giving the example of how postcodes can potentially be used to infer someone’s race.
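The proxy effect the groups describe can be shown with a few lines of code on wholly synthetic data. The postcode areas, counts and group skew below are invented for illustration: the point is that a rule which never touches the protected attribute can still select almost exclusively by it.

```python
from collections import Counter

# Invented synthetic population: each tuple is (postcode_area, in_protected_group).
# The residential skew is an assumption made purely for illustration.
population = (
    [("AB1", True)] * 80 + [("AB1", False)] * 20   # group concentrated in AB1
    + [("CD2", True)] * 10 + [("CD2", False)] * 90
)

# A decision rule that "only" uses postcode -- no protected data in sight.
flagged = [person for person in population if person[0] == "AB1"]

by_group = Counter(in_group for _, in_group in flagged)
print(by_group)  # the postcode rule overwhelmingly selects the protected group
```

Here 80 of the 100 people flagged belong to the protected group, even though group membership was never an input to the rule, which is precisely why the signatories argue that exempting non-special category data from ADM safeguards does not prevent discriminatory outcomes.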

    They also highlighted how, according to the government’s own impact assessment for the law, “those with protected characteristics such as race, gender and age, are more likely to face discrimination from ADM due to historical biases in datasets”.

    The letter was also signed by a number of academics, including Brent Mittelstadt and Sandra Wachter from the Oxford Internet Institute, and social anthropologist Toyin Agbetu from University College London.

    A separate amendment (NC22) introduced by Berry attempts to alleviate these data issues by introducing new safeguards for automated decisions in a law enforcement context, which would include providing meaningful redress, greater transparency around police use of algorithms, and ensuring that people can request human involvement in any police decisions about them.

    In April 2025, Statewatch also separately called for the Ministry of Justice (MoJ) to halt its development of crime prediction tools, after obtaining documents via a Freedom of Information (FoI) campaign that revealed that the department is already using one flawed algorithm to “predict” people’s risk of reoffending, and is actively developing another system to “predict” who will commit murder.

    “The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Statewatch researcher Sofia Lyall.

    “Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming.”

    She added: “Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist ‘quick fixes’ will only further undermine people’s safety and well-being.”

    Prior to this, a coalition of civil society groups called on the then-incoming Labour government in July 2024 to place an outright ban on both predictive policing and biometric surveillance in the UK, on the basis they are disproportionately used to target racialised, working class and migrant communities.

    A March 2022 House of Lords inquiry into the use of advanced algorithmic technologies by UK police has also previously identified major concerns around the use of crime prediction systems, highlighting their tendency to produce a “vicious circle” and “entrench pre-existing patterns of discrimination” because they direct police patrols to low-income, already over-policed areas based on historic arrest data.

    Lords found that, generally, UK police are deploying algorithmic technologies – including AI and facial recognition – without a thorough examination of their efficacy or outcomes, and are essentially “making it up as they go along”.
