Directory of AI Scams Targeting Consumers

Published on 2024-11-27 by Silver Keskkula


We're launching a directory of AI scams specifically targeting consumers.

Since the invention of generative adversarial networks (GANs), scammers have been using AI systematically to improve the effectiveness of their attacks. Technologies that create deepfake clones of victims' voices and likenesses have become commonplace.

Here we present a directory of the different types of AI scams typically used to target consumers, with a specific focus on those targeting seniors and the elderly. We'll start with high-level overviews of the different types of scams and also shed light on how AI has changed the game.

Family Emergency Scam with AI

This is one of the most common types of scams. It involves calling a parent (the victim) from an unknown number and presenting a situation in which their offspring has gotten into some sort of trouble and requires help. The fake scenarios range from an innocent "I lost my phone and need money for a bus ticket" all the way to a sophisticated legal mess, in which a supposed cop and lawyer have caught the offspring committing indecent exposure on public transport and blackmail the target into resolving the alleged situation by making a payment.

What has changed since the origin of such "ore ore" scams (Japanese for "it's me, it's me") is that AI makes it easy to clone a family member's voice and thus lend credibility to the postulated emergency. AI also makes it possible to automate a large part of the flow and scale such scams to large volumes.

Romance Scam with AI

Scams that take advantage of loneliness and play on people's desire to find partners have also been around for a very long time. They typically involve a seemingly above-average (often too-good-to-be-true) prospective partner who makes contact with the victim through an online dating platform or, more recently, directly on social media, and proceeds to groom the target with promises of a beautiful future together. These scams can go on for a substantial period of time (even years) to earn the victim's trust and set them up for a larger (or repeating) transaction. The actual fraudulent transaction typically involves helping out the alleged lover, who has found himself in a pickle and needs help. These requests range from needing support to buy a plane ticket to allegedly come see the victim, all the way to convincing the victim to help the "romantic partner" buy a house for their future together.

What has changed with the invention of large language models is that these scams, which typically required a substantial time commitment and dedication from the limited number of people the scammer might have at their command, can now be run by a fully automated and scalable army of AI bots carrying on conversations in unprecedented volumes. A single scammer who might have been able to groom 10 subjects can now handle hundreds, if not thousands, of conversations in parallel.

Job Scams with AI

Job scams typically involve fraudsters posting fake employment opportunities on job boards with promises of easy money. These jobs tend not to require the advanced degrees or professional backgrounds that would otherwise be associated with such high pay. The actual fraudulent payment is extracted by disguising it as an application processing fee or a fee for materials required at the newly attained job. Here, too, the elderly are potentially more lucrative targets: seniors may have low incomes and trouble finding jobs, making them more eager to find opportunities for additional income.

Here AI has made all of those interactions automatable, again enabling a single fraudster to scale the attack far beyond their prior reach, while at the same time making the fraud harder to spot given the natural and potentially long interactions that lead up to exploiting the target.

Fake Legal Notice Scams with AI

Fake legal notice scams try to exploit the target's fear by sending fraudulent paperwork claiming to be official legal notices. These scams often involve emails, letters, or calls claiming the recipient owes money or has violated the law. Victims are pushed into action by threatening fines, lawsuits, or even arrest. The scammers typically impersonate authorities, like lawyers, debt collectors, or even court officials, to make the requests seem more legitimate. Their goal is to get the victims to pay fees or share sensitive information. It's the anxiety and the willingness to comply with official-sounding demands (especially true for seniors) that the fraudsters rely on to make the scam effective.

With generative AI, it has become easier to create official-looking documents, or even fake companies, while adding a personalized touch and varying the content, making it harder for simple pattern-matching systems to catch this type of fraud. The conversation leading up to the payment can also be automated, again enabling both the scale and the natural flow that make this scam more effective with AI.

How to defend your family from these types of AI scams?

We'll get into how to guard yourself against these types of AI scams in one of our future posts, but for now we're obviously taking the opportunity to introduce you to our own approach.

Here at ScamGuardian.AI we're all about helping defend families against AI scams. You can start with our first product: a quick wake-up call to your family members that uses your own voice and simulates a scam (granted, one that only tries to extract some basic information, not money). With two minutes of your time, we'll give your family members a memorable experience and the education they need to stay vigilant and know how to respond when the real scammers show up. Read more about it and give it a try at

link to scamguardian.ai

In the next post we'll explore the following AI scams:

You can help spread awareness of the looming AI-scam threat by sharing some of our posts: