If somebody called you from an unknown number, claiming to be from your bank, and asked you to "confirm" your sensitive account information, you would probably recognize the scam right away and hang up. But what if the voice on the other end of the line belonged to a loved one? You recognize the voice as a family member's. They sound upset as they describe an emergency and beg for your help. Who could be so cold-hearted as to refuse their request for money?
Unfortunately, this is exactly the setup for a new kind of scam that uses the power of Artificial Intelligence (AI) to fool tens of thousands of people into sending millions of dollars to online thieves. The Washington Post tells the story of a Canadian couple in their 70s who received a call from someone who sounded exactly like their grandson Brandon. He said he was in jail, with no wallet or cellphone, and needed cash for bail. “We were sucked in,” his grandmother said. “We were convinced that we were talking to Brandon.”
The couple dashed down to their bank and withdrew the daily maximum ($2,207 in U.S. currency). Then they hurried to a second branch for more money. But an alert bank manager pulled them aside and told them how another patron had gotten a similar call and learned that the eerily accurate voice had been faked. That’s when they realized they’d been duped.1
The Post reports that technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, that their loved ones are in distress. "In 2022, impostor scams were the second most popular racket in America, with over 36,000 reports of people being swindled by those pretending to be friends and family, according to data from the Federal Trade Commission. Over 5,100 of those incidents happened over the phone, accounting for over $11 million in losses, FTC officials said."
Advancements in AI technology now allow bad actors to replicate a voice from an audio sample of just a few sentences. Easily available online tools can turn that sample into a convincing clone of the voice, allowing a scammer to make it “speak” whatever they type.
It's difficult for law enforcement to find and prosecute these thieves. And Vice reports that "the courts have not yet decided when or if companies will be held liable for harms caused by deepfake voice technology—or any of the other increasingly popular AI technology, like ChatGPT—where defamation and misinformation risks seem to be rising."2
One way to short-circuit this type of scam is to ask the "loved one" about something only he or she would know. But the surest way to confirm that the need is genuine is to contact other family members and verify that the loved one is actually in the location and situation they claim. With scams like this on the rise, treat any unsolicited request for money or personal information with suspicion. We are happy to share resources to help you protect yourself from online thieves.