AI Voice Scam: Canadian Couple Loses $21,000 To Fake Son’s Legal Troubles

    In this article, we’ll explore the growing use of AI voice generators by scammers to impersonate loved ones and steal millions of dollars from unsuspecting victims. 

    Key Takeaways:

    • Scammers are increasingly using AI voice generation software to impersonate loved ones and steal millions of dollars from unsuspecting victims.
    • Imposter scams, in which fraudsters pose as someone the victim knows or trusts, have become the second most common type of fraud in America.
    • Elderly individuals are particularly vulnerable to these scams, which often involve the scammers posing as their children or grandchildren.
    • It is currently unclear whether companies can be held liable in court for harm caused by AI voice generators or other AI technologies.
    • The Federal Trade Commission (FTC) has established a new Office of Technology to examine how AI is being used and whether companies are addressing the risks their products may pose.
    • The tech industry is taking notice of the potential harm caused by AI voice generators and is working to mitigate the risks associated with them.
    • It is important for the government to regulate the use of such technology to protect consumers, particularly vulnerable individuals like the elderly.

    Exploring the Rise of AI Voice Generators Used by Scammers

    As technology continues to advance, so do the tactics of scammers. The latest method is using AI voice generators to imitate victims’ loved ones and trick them out of their money.

    The rise of AI-assisted voice impersonation is happening alongside a broader surge in imposter scams.

    These scams have become the second most common type of fraud in the US, with over 36,000 cases reported last year. 

    More than 5,000 victims lost money through phone scams, resulting in a total of $11 million in losses, according to the Federal Trade Commission (FTC).

    How AI Voice Generation Software is Being Used by Scammers

    AI voice generation software is a relatively new tool that scammers are using to swindle their victims. 

    Until recently, scammers needed several sentences of a person’s recorded speech to convincingly reproduce their voice and tone.

    The technology has now progressed to the point where a few seconds of audio is enough to impersonate someone, making it far easier to target victims this way.
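
    To give a sense of how little audio modern voice cloning requires, below is a minimal sketch using the open-source Coqui TTS library and its XTTS v2 voice-cloning model. The file paths and spoken text are placeholders, and this is just one of several publicly available tools, shown for illustration rather than as the specific software any scammer uses.

        # Minimal few-shot voice-cloning sketch using the open-source
        # Coqui TTS library (pip install TTS). File paths are placeholders.
        from TTS.api import TTS

        # Load the multilingual XTTS v2 voice-cloning model
        # (the model weights are downloaded on first use).
        tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

        # Clone the voice from a short reference recording and speak new
        # text with it; a few seconds of clean speech is enough for a
        # recognizable imitation.
        tts.tts_to_file(
            text="Hi, it's me. I'm in trouble and need your help.",
            speaker_wav="reference_clip.wav",  # short sample of the target voice
            language="en",
            file_path="cloned_output.wav",
        )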

    Recent Incidents of AI Voice Generator Scams

    The Washington Post recently reported on an incident where an elderly couple from Canada lost $21,000 to a scammer who claimed to be their son’s lawyer. 

    Using an AI-generated voice that sounded like their son, the scammer claimed the son was in jail for killing a diplomat in a car accident and needed the money for legal fees before his court date.

    The couple withdrew the cash and sent it to the scammer via Bitcoin, though they later admitted the phone call had sounded strange.

    They realized they had been scammed after their son called to check in later that evening.

    Similar stories of elderly individuals being scammed out of their life savings using AI voice generators are becoming increasingly common. 

    The elderly appear to be the primary targets of these attacks, as they are particularly vulnerable to financial scams.

    However, it remains unclear whether companies can be held responsible for any damages caused by AI voice generators or other AI technologies.

    Efforts to Combat AI Voice Generator Scams

    To ensure companies are limiting the risks their AI products can cause, the FTC has established an Office of Technology to scrutinize how businesses are using AI and the claims they make about it.

    The FTC is concerned that as deepfakes and other AI-generated synthetic media become easier to create and share, fraudsters will adopt these tools for their schemes.

    These AI tools can generate videos, photos, audio, and text that appear authentic, which could allow fraudsters to deceive a wider audience more quickly.

    The potential harm caused by AI voice generators has not gone unnoticed by the tech industry. 

    Recently, ElevenLabs, a company that develops voice cloning and speech synthesis tools, announced on Twitter that it was launching a new tool to help people check whether an audio sample was created with the company’s technology.

    The company also restricted its voice-cloning feature, VoiceLab, to paying users.

    ElevenLabs has acknowledged a growing number of voice cloning misuse cases, particularly on sites where users can remain anonymous, such as 4chan.

    On such sites, individuals have used ElevenLabs’ technology to clone the voices of celebrities and make them say offensive or inappropriate things.

    Conclusion

    The use of AI voice generators by scammers to impersonate loved ones and steal money is a growing concern. 

    The technology has the potential to cause significant harm to vulnerable individuals, particularly the elderly. 

    It is important that the tech industry take responsibility for mitigating the risks associated with AI voice generators, and that governments regulate the technology to protect consumers.

    The rise of this technology also heightens the need for public awareness and education, so that people can recognize these scams before falling victim to them.