    World Economy

    Grandmother Falsely Imprisoned Thanks To AI Biometrics Fail

By Team_Prime US News | March 31, 2026


A grandmother, Angela Lipps, was arrested at gunpoint in her own home after facial recognition software flagged her as a suspect in a bank fraud case in North Dakota, a state she had never even visited. Authorities relied on AI-generated matches from surveillance footage and compared those results to her driver’s license and social media photos. That was enough to issue a warrant.

She was jailed for months, extradited over 1,000 miles, and held without meaningful review until her lawyer produced simple bank records proving she was in Tennessee at the time of the alleged crime.

The case collapsed almost immediately, but by then she had lost her home, her car, and even her dog. This is what happens when governments begin to trust machines more than basic investigation.

AI is not intelligence. It is pattern recognition. It compares images, identifies similarities, and produces probabilities. It does not understand context, intent, or truth. Yet those probabilities are now being treated as evidence. That is where the system breaks down. Once a machine flags someone, the burden shifts onto the individual to prove innocence rather than onto the state to prove guilt.
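To make that concrete: a typical face-matching pipeline reduces each image to a numeric embedding vector and compares vectors with a similarity score. What comes out is a number crossing an arbitrary threshold, not a finding of fact. The sketch below is purely illustrative; the embeddings and the threshold are invented, and no vendor's actual system is depicted.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: a blurry surveillance frame vs. a license photo.
surveillance = [0.12, 0.87, 0.45, 0.31]
license_photo = [0.10, 0.90, 0.40, 0.35]

MATCH_THRESHOLD = 0.9  # an arbitrary cutoff chosen by whoever deploys the system

score = cosine_similarity(surveillance, license_photo)
is_match = score >= MATCH_THRESHOLD  # "match" just means the score cleared a line
print(round(score, 3), is_match)     # → 0.998 True
```

Note that nothing in this computation knows who either person is, where they were, or whether a crime occurred; treating `is_match` as evidence is a policy choice, not a property of the math.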

We have seen this before. There have been a number of cases across the US where facial recognition systems misidentified individuals, leading to wrongful arrests. In each case, the same pattern emerges: the software produces a match, and investigators build a case around it instead of questioning it. Basic verification steps are skipped because the assumption is that the system is correct.

The problem is that people treat AI as the be-all and end-all of perfect knowledge. Every output is treated as fact. That is how you end up with someone sitting in jail for months for a crime they did not commit.

This ties directly into what we are seeing more broadly with artificial intelligence. Even inside the tech industry, there are growing concerns about how these systems are being deployed. The recent resignation of a senior figure at OpenAI raised alarms about the pace at which AI is advancing relative to the safeguards in place. Concerns were expressed about the risks of misuse, the lack of oversight, and the potential for these systems to be weaponized in ways that were never intended. When those closest to the technology begin warning about its misuse, that should not be ignored.

Governments are already expanding surveillance, monitoring financial transactions, and building digital identity frameworks. AI becomes the engine that ties all of this together. It allows systems to flag individuals automatically, at scale, without human judgment.

Once that infrastructure is in place, the implications are vast. You can be flagged, investigated, and even detained based on data patterns that may be wrong. And by the time the error is discovered, the damage is already done.
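The scale problem is simple arithmetic. Even a system with a very low false-positive rate, run automatically against an entire population, flags enormous numbers of innocent people. The figures below are invented for illustration, not drawn from any real deployment:

```python
# Hypothetical base-rate arithmetic: a matcher with a 0.1% false-positive
# rate scanned against a large identity database. All numbers are assumed.
population_scanned = 200_000_000  # illustrative database size
false_positive_rate = 0.001       # 0.1% of innocent people wrongly flagged
actual_suspects = 100             # genuine matches present, assumed

false_flags = int((population_scanned - actual_suspects) * false_positive_rate)
print(false_flags)  # → 199999 innocent people flagged alongside 100 suspects
```

Under these assumptions, roughly 2,000 innocent people are flagged for every genuine suspect, which is exactly why automated flags cannot substitute for investigation.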

What happened in Tennessee is a warning about what follows when accountability is removed from the process. It took minutes to prove she was innocent. It took months for the system to admit it was wrong. That is the risk of replacing judgment with algorithms.



