Voice Deepfakes Are Calling

After a hard day at work, you have just arrived home and are about to settle down for dinner when suddenly your phone begins to vibrate. On the other end is someone close to you, perhaps a parent, a child, or a childhood friend, pleading with you to send them money right away.

You ask questions, trying to understand what is going on. Their answers are evasive or out of character, and occasionally there is an odd pause, almost as if they were thinking a little too slowly. Yet you are certain it is your loved one calling: you can hear their voice, and their number appears on the caller ID. Attributing the strangeness to their panic, you dutifully transfer the money to the bank account they give you.

When you call them the next day to make sure everything is fine, your loved one has no idea what you are talking about. That is because they never called you; you were duped by technology, a voice deepfake. Scams like this defrauded thousands of people in 2022.

As experts in computer security, we can see how ongoing developments in deep-learning algorithms, audio engineering and editing, and synthetic speech creation have made it easier to accurately mimic a person's voice.

Even worse, chatbots like ChatGPT can now generate lifelike scripts with adaptive, real-time responses. Combine these technologies with speech generation, and a deepfake goes from being a static recording to a living, lifelike avatar that can convincingly carry on a phone conversation.

Cloning a voice

Crafting a convincing, high-quality deepfake, whether audio or video, is not easy. It requires a range of creative and technical skills, powerful hardware, and a fairly large sample of the target voice.

Some voice-deepfake services need a sample of only a minute or so to produce a clone that could be convincing enough to fool someone, and a growing number of them promise to create moderate- to high-quality voice clones for a fee. Convincing a loved one, however, for use in an impersonation scam, for example, would likely take a significantly larger sample.

Protecting against scams and disinformation

That said, the DeFake Project, a collaboration of the Rochester Institute of Technology, the University of Mississippi, Michigan State University, and other institutions, is working hard to develop ways to detect video and audio deepfakes and limit the harm they cause. There are also simple, everyday steps you can take to protect yourself.
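
At a high level, audio-deepfake detectors extract acoustic features from a recording and pass them to a classifier trained to tell real voices from synthetic ones. As a toy illustration of the feature-extraction step only (this is not the DeFake Project's actual method, just a common signal-processing measurement), here is a spectral-flatness computation. Unnaturally "clean" signals score near 0, while noise-like signals score near 1:

```python
import numpy as np

def spectral_flatness(signal: np.ndarray) -> float:
    """Ratio of the geometric to the arithmetic mean of the power spectrum.
    Values near 1.0 indicate noise-like audio; values near 0.0 indicate a
    signal whose energy is concentrated in a few frequencies."""
    power = np.abs(np.fft.rfft(signal)) ** 2 + 1e-12  # epsilon avoids log(0)
    geometric = np.exp(np.mean(np.log(power)))
    arithmetic = np.mean(power)
    return float(geometric / arithmetic)

# One second of audio at a 16 kHz sampling rate.
t = np.arange(16000) / 16000.0
tone = np.sin(2 * np.pi * 440.0 * t)                      # pure tone: low flatness
noise = np.random.default_rng(0).standard_normal(16000)   # white noise: high flatness

print(spectral_flatness(tone), spectral_flatness(noise))
```

Real detectors combine many such features, often learned rather than hand-designed, but the principle is the same: measure statistical properties of the audio that synthesis pipelines tend to get subtly wrong.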

To begin with, the most typical voice deepfakes you are likely to encounter in daily life, both at work and at home, are voice-phishing, or "vishing," scams like the one described above. In 2019, an energy company was conned out of US$243,000 when criminals simulated the voice of its parent company's boss to order an employee to transfer funds to a supplier. In 2022, people were swindled out of an estimated $11 million by faked voices, including those of close friends and family members.

Be wary of unexpected phone calls, even from people you know well. It isn't necessary to schedule every call, but it helps to at least email or text ahead of time. Don't rely on caller ID, either, because it can be spoofed. For example, if someone calls claiming to be from your bank, hang up and call the bank directly to verify the call's legitimacy. Use only the number you have written down, saved in your contacts list, or found on the bank's own website.
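
Caller ID is untrustworthy because the displayed number is derived from caller-supplied signaling. In internet telephony based on the SIP protocol, for example, the name and number a phone displays typically come from the From header of the INVITE message, a plain-text field that the originating software fills in itself. A minimal sketch (all names, numbers, and domains here are made up for illustration):

```python
# Toy illustration: the caller ID the recipient sees is taken from the
# From header of a SIP INVITE, which the caller's own software writes.
# By default, nothing in the message proves the claimed identity.

def build_invite(displayed_name: str, displayed_number: str, target: str) -> str:
    """Assemble a bare-bones SIP INVITE. Every identity field is caller-chosen."""
    return (
        f"INVITE sip:{target} SIP/2.0\r\n"
        f'From: "{displayed_name}" <sip:{displayed_number}@example.net>\r\n'
        f"To: <sip:{target}>\r\n"
        "Call-ID: demo-1234\r\n"
        "CSeq: 1 INVITE\r\n\r\n"
    )

# A scammer can claim any identity, such as a family member's number:
msg = build_invite("Mom", "15551230000", "15559870000@example.org")
print(msg.splitlines()[1])  # the From line carries the spoofed identity
```

Carrier networks can layer authentication frameworks such as STIR/SHAKEN on top of this signaling to flag unverified caller IDs, but coverage is incomplete, which is why hanging up and calling back on a known number remains the safest check.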

You should also be careful about disclosing personal information that could be used to identify you, such as your Social Security number, home address, birth date, phone number, middle name, and even the names of your children and pets. Scammers can use such details to impersonate you to banks, realtors, and other businesses, enriching themselves while draining your accounts or ruining your credit.

Another piece of advice: know yourself. Specifically, know your intellectual and emotional biases and vulnerabilities. This is good life advice in general, but it is key to protecting yourself from manipulation. Scammers typically seek out your financial anxieties, political attachments, or other inclinations, and then prey on them.

This vigilance is also a decent defense against disinformation campaigns that use voice deepfakes. Deepfakes can exploit your confirmation bias, or what you are already inclined to believe about someone.

If you hear an important person, whether from your community or the government, saying something that either seems wildly out of character or confirms your worst suspicions about them, you would be wise to be wary.
