AI Tools That Mimic Voices Of Loved Ones Being Used To Scam People
Most voice-generating AIs need only a small sample of someone's voice to produce entire sentences
Artificial intelligence models can be trained to perform a variety of tasks, including the ability to sound like your close friends and family. But bad actors are using voice-generating AI software to scam people.
Most voice-generating AIs need only a small sample of someone's voice to produce entire sentences, even emulating their emotional tone (watch Red Rose on Netflix to truly understand how dangerous this could become in the wrong hands). According to The Washington Post, scammers usually target the elderly with such scams. While some people might sense the inauthenticity of such calls, when a loved one claims to be in an emergency, the last thought to cross someone's mind is that an AI is behind it all.
One couple apparently sent $15,000 via a bitcoin terminal believing they had spoken to their son. What did the AI say, you wonder? Posing as their son, it claimed it needed money for legal fees after killing a US diplomat in a car accident.
The problem with AI imposter scams
Such scams can be operated from anywhere in the world, making them hard to track and to stop, the Post reported. When imposters operate from different countries, it's also hard to establish jurisdiction.
The best way to protect yourself from such scams is to double-check everything. Getting an unusual call asking for money? Contact the person the caller claims to be directly. That one extra step could save you from financial loss. In general, if anyone asks for money on a call, simply refuse - and don't click on suspicious links received in texts.
Such AI tools are a point of contention among researchers and analysts alike: they have real utility in creative fields and therapy, but can also be misused by scammers to target vulnerable people.
With AI bots like ChatGPT taking over the internet, our switch to AI tools seems inevitable. In fact, it's happening right now. But legislation is barely keeping up with the pace of development in the world of artificial intelligence. Regardless, tech giants are cashing in on the hype - be it Microsoft's Bing Chat, Google's Bard, or Meta's upcoming large language model.
Do you think tech companies care about ethics while developing AI tools? Let us know in the comments below.
For more in the world of technology and science, keep reading Indiatimes.com.