How to protect yourself from AI scams this holiday season

Scammers are using generative artificial intelligence tools to create more convincing fake text and voices to commit fraud, according to a recent FBI warning to the public.

Olivier Morin/AFP via Getty Images

Don’t be duped by a scam made with artificial intelligence tools this holiday season. The FBI issued a public service announcement earlier this month warning that criminals are exploiting AI to run bigger frauds in more believable ways.

While AI tools can be helpful in our personal and professional lives, they can also be used against us, said Shaila Rana, a professor at Purdue Global who teaches cybersecurity. “[AI tools are] becoming cheaper [and] easier to use. It’s lowering the barrier of entry for attackers so scammers can create really highly convincing scams.”

There are some best practices for protecting yourself against scams in general, but with the rise of generative AI, here are five specific tips to consider.

Beware of sophisticated phishing attacks

The most common AI-enabled scams are phishing attacks, according to Eman El-Sheikh, associate vice president of the Center for Cybersecurity at the University of West Florida. Phishing is when bad actors attempt to obtain sensitive information to commit crimes or fraud. “[Scammers are using] generative AI to create content that looks or seems authentic but in fact is not,” said El-Sheikh.

“Before we would tell people, ‘look for grammatical errors, look for misspellings, look for something that just doesn’t sound right.’ But now with the use of AI … it can be extremely convincing,” Rana told NPR.

However, you should still check for subtle tells that an email or text message could be fraudulent. Check for misspellings in the domain name of email addresses and look for variations in the logo of the company. “It’s very important to pay attention to those details,” said El-Sheikh.

Create a code word with your loved ones

AI-cloned voice scams are on the rise, Rana told NPR. “Scammers just need a few seconds of your voice from social media to create a clone,” she said. Combined with personal details found online, scammers can convince targets that they are their loved ones.

Family emergency scams or “grandparent scams” involve calling a target, creating an extreme sense of urgency by pretending to be a loved one in distress, and asking for money to get them out of a bad situation. One common scheme is telling the target their loved one is in jail and needs bail money.

Rana recommends coming up with a secret code word to use with your family. “So if someone calls claiming to be in trouble or they’re unsafe, ask for the code word and then [hang up and] call their real number [back] to verify,” she said.

You can also buffer yourself against these types of scams by screening your calls. “If someone’s calling you from a number that you don’t recognize that is not in your contacts, you can go ahead and automatically send it to voicemail,” says Michael Bruemmer, head of the global data breach resolution group at the credit reporting company Experian.

Lock down your social media accounts

“Social media accounts can be copied or screen scraped,” warned Bruemmer. To prevent impersonation, reduce your digital footprint. “Set social media accounts to private, remove phone numbers from public profiles. And just be careful and limit what personal information you share publicly,” said Rana. Leaving your social media profiles public “makes it easier for scammers to get a better picture of who you are, [and] they can use [that] against you,” she said.

Sophisticated scammers will glean information from social media accounts to craft more personalized messages to their intended victims.

Clement Mahoudeau/AFP via Getty Images

Carefully check the web address before inputting any sensitive information

Scammers can use AI to make fake websites that seem legitimate. The FBI notes AI can be used to generate content for fraudulent websites promoting cryptocurrency scams and other investment schemes. Scammers have also reportedly embedded AI-powered chatbots in these websites to prompt people to click on malicious links.

“You should always check your browser window … and make sure that [you’re on] an encrypted site. It [will start] with https://,” said Bruemmer. He also said to make sure the website domain is spelled correctly: “[fraudulent websites] can have a URL that is just one letter or character off.”
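Bruemmer’s two checks, an https:// address and a correctly spelled domain, can be sketched in a few lines of Python. The expected hostname here is an illustrative assumption you would supply yourself:

```python
# Sketch of the two URL checks described above: the page must be served
# over https, and the hostname must match the expected domain exactly.
# "www.amazon.com" is just an illustrative expected value.
from urllib.parse import urlparse

def url_looks_right(url: str, expected_host: str) -> bool:
    parts = urlparse(url)
    return parts.scheme == "https" and parts.hostname == expected_host.lower()

print(url_looks_right("https://www.amazon.com/deals", "www.amazon.com"))   # True
print(url_looks_right("http://www.amazon.com/deals", "www.amazon.com"))    # False: not encrypted
print(url_looks_right("https://www.arnazon.com/deals", "www.amazon.com"))  # False: "rn" imitates "m"
```

Note the third example: “rn” can pass for “m” at a glance, which is exactly the kind of one-character-off trick Bruemmer describes.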

If you’re still on the fence about whether the website you’re using is legit, you can look up the age of the site in a WhoIs domain lookup database. Rana said to be extremely wary of websites that were only recently created. Amazon, for example, was founded in 1994. If the WhoIs database says the “Amazon” site you’re looking up was created this millennium, you know you’re in the wrong place.
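The age test Rana describes comes down to simple date arithmetic once a WhoIs lookup gives you the creation date. A minimal sketch, where the one-year threshold and the dates are arbitrary assumptions for illustration:

```python
# Sketch: flag domains whose WhoIs creation date is suspiciously recent.
# The creation date would come from a WhoIs lookup tool; the 365-day
# threshold and the fixed "today" are arbitrary illustrative choices.
from datetime import date

def looks_too_new(created: date, today: date, min_age_days: int = 365) -> bool:
    return (today - created).days < min_age_days

today = date(2024, 12, 20)  # hypothetical "today"
print(looks_too_new(date(1994, 11, 1), today))  # False: a 1990s-era domain
print(looks_too_new(date(2024, 11, 5), today))  # True: registered weeks ago
```

A long-established brand with a weeks-old domain registration is a strong sign you are on an impostor site.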

Be wary of photos and videos prompting you to send money

The FBI warns generative AI tools have been used to create images of natural disasters and global conflict in an attempt to secure donations for fraudulent charities. They have also been used to create deepfake images or videos of famous people promoting investment schemes and non-existent or counterfeit products.

When you come across a photo or video prompting you to spend money, use caution before engaging. Look for common telltale signs that a piece of media could be a deepfake. As Shannon Bond reported for NPR in 2023, when it comes to creating photos, AI generators “can struggle with creating realistic hands, teeth and accessories like glasses and jewelry.” AI-generated videos often have tells of their own, “like slight mismatches between sound and motion and distorted mouths. They often lack facial expressions or subtle body movements that real people make,” Bond wrote.

“It’s very important for all of us to be responsible in a digital AI-enabled world and do that on a daily basis … especially now around the holidays when there’s an uptick in such crimes and scams,” said El-Sheikh.
