
    Michigan Residents Lose Savings to Deepfakes and Voice Cloning

By Mikolaj Laszkiewicz | May 12, 2026 | 2 min read

Image: hacker in a white mask sitting in front of a computer (Source: Unsplash | Boitumelo)

    Technology that until recently was the domain of advanced labs has now become a common weapon for ordinary criminals. The latest reports out of Michigan point to a massive wave of algorithm-generated extortion. Scammers are ruthlessly exploiting residents’ trust by masquerading as their friends or family members.

    Detroit journalists summed up the current crisis in the local market:

“Artificial intelligence is making it easier than ever before for scammers to deceive ordinary people — and Michigan residents are paying the price.”

Voice cloning has become the focal point of these attacks. Criminals no longer need hours of studio-quality recordings to pull off a convincing scam. A few seconds of audio scraped from a victim’s public social media profile – an Instagram story or a TikTok video – is all it takes. After feeding the model this sample, scammers obtain a digital copy of the victim’s voice that can be made to say anything while preserving the speaker’s tone and even inflection. The clone is not perfect, but over a phone call it can be shockingly realistic.

    The criminals then call relatives and act out highly emotional scenarios. They simulate crisis situations – a car crash, a sudden arrest, or an urgent need for bail money – and use time pressure to demand an immediate cash transfer or account login details.

    You can learn more about this technology in our dedicated article.

    The online threat isn’t limited to audio, however. Simultaneously, residents are being bombarded with videos created using deepfake technology. Criminals are churning out hyper-realistic clips where cloned likenesses of recognized authority figures, celebrities, or local officials encourage people to hand over cash or invest in bogus crypto schemes.

The rapid evolution of these methods is forcing citizens to change their habits. Cybersecurity experts advise against relying solely on the sound of a voice on the other end of the line, urging people to adopt verification methods instead. The most effective at-home defense against tech-driven fraud is agreeing on a “safe word” with loved ones – a specific word or verifying question that helps distinguish a real family member from a computer-generated clone.
