SCARS Institute's Encyclopedia of Scams™ RomanceScamsNOW.com Published Continuously for 25 Years


Criminals Use Generative Artificial Intelligence to Facilitate Financial Fraud

FBI Warning About How Criminals Exploit Generative Artificial Intelligence (AI) to Escalate Fraud, Impersonation, and Extortion Schemes

FBI WARNING – Presented by the SCARS Institute

Authors:
•  United States Federal Bureau of Investigation (FBI)
•  SCARS Institute Encyclopedia of Scams Editorial Team – Society of Citizens Against Relationship Scams Inc.

Article Abstract

The FBI has issued a warning about the increasing use of generative artificial intelligence (AI) by criminals to execute fraud on a broader and more convincing scale. Generative AI allows these fraudsters to produce realistic text, images, audio, and videos, thereby enhancing the believability of their schemes. This technology is exploited in various fraudulent activities including social engineering, spear phishing, romance, and investment scams, where AI can generate content that appears legitimate, reducing telltale signs of deceit like grammatical errors. AI-generated materials are used to create fake social media profiles, counterfeit documents, and even to impersonate individuals through voice cloning or video manipulation to deceive victims into sending money or divulging sensitive information.

The FBI advises the public to be vigilant, suggesting measures like establishing secret verification phrases with family, scrutinizing media for anomalies, and being cautious with personal information shared online. Anyone who suspects they have been scammed should report it to the FBI’s Internet Crime Complaint Center, providing detailed information about the interaction and the transactions involved.


Introduction

In an era where technology continuously evolves, the Federal Bureau of Investigation (FBI) has raised a significant alarm regarding the misuse of generative artificial intelligence (AI) by criminals to perpetrate sophisticated fraud schemes. Generative AI, capable of creating highly convincing and tailored content, is being exploited to increase both the scale and the believability of fraudulent activities. By synthesizing new content based on learned patterns from user inputs, these AI tools not only streamline the process of deception but also eliminate the common human errors that might otherwise alert potential victims. While the creation of synthetic content is not illegal per se, its application in fraud, extortion, and various scams poses a new challenge for law enforcement and the public. The FBI has outlined specific instances where AI is being used, from crafting convincing phishing emails to generating realistic fake identities and even cloning voices and videos for impersonation. The sections below describe how these AI-driven tactics are reshaping the landscape of crime, making it more important than ever for individuals to understand and guard against these evolving threats.

FBI WARNING

The FBI is warning the public that criminals exploit generative artificial intelligence (AI) to commit fraud on a larger scale and to increase the believability of their schemes. Generative AI reduces the time and effort criminals must expend to deceive their targets. Generative AI takes what it has learned from examples input by a user and synthesizes something entirely new based on that information. These tools assist with content creation and can correct for human errors that might otherwise serve as warning signs of fraud. The creation or distribution of synthetic content is not inherently illegal; however, synthetic content can be used to facilitate crimes, such as fraud and extortion.[1] Since it can be difficult to identify when content is AI-generated, the FBI is providing the following examples of how criminals may use generative AI in their fraud schemes to increase public recognition and scrutiny.
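To illustrate how little effort this now takes, here is a minimal sketch of the kind of tooling involved, assuming the open-source Hugging Face transformers library and the small public gpt2 model. The FBI advisory does not name any specific tool; these choices are purely illustrative.

```python
# Illustrative sketch only: the library (transformers) and model (gpt2) are
# example choices, not anything referenced in the FBI advisory.
from transformers import pipeline

# Download a small public text-generation model and wrap it in a pipeline.
generator = pipeline("text-generation", model="gpt2")

# A single prompt produces fluent, grammatical text in seconds, which is why
# spelling and grammar mistakes are no longer reliable warning signs of fraud.
prompt = "Dear friend, I was so glad to read your last message because"
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])
```

The point is not the specific code but the scale: once such a pipeline exists, a criminal can produce thousands of polished, personalized messages with essentially no additional effort.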

AI-Generated Text

Criminals use AI-generated text to appear believable to a reader in furtherance of social engineering,[2] spear phishing,[3] and financial fraud schemes such as romance, investment, and other confidence schemes, or to overcome common indicators of fraud schemes.

  • Criminals use generative AI to create voluminous fictitious social media profiles used to trick victims into sending money.
  • Criminals create messages to send to victims faster, allowing them to reach a wider audience with believable content.
  • Criminals use generative AI tools to assist with language translations to limit grammatical or spelling errors for foreign criminal actors targeting US victims.
  • Criminals generate content for fraudulent websites for cryptocurrency investment fraud and other investment schemes.
  • Criminals embed AI-powered chatbots in fraudulent websites to prompt victims to click on malicious links.

AI-Generated Images

Criminals use AI-generated images to create believable social media profile photos, identification documents, and other images in support of their fraud schemes.

  • Criminals create realistic images for fictitious social media profiles in social engineering, spear phishing, romance schemes, confidence fraud, and investment fraud.
  • Criminals generate fraudulent identification documents, such as fake driver’s licenses or credentials (law enforcement, government, or banking) for identity fraud and impersonation schemes.
  • Criminals use generative AI to produce photos to share with victims in private communications to convince victims they are speaking to a real person.
  • Criminals use generative AI tools to create images of celebrities or social media personas promoting counterfeit products or non-delivery schemes.[4]
  • Criminals use generative AI tools to create images of natural disasters or global conflicts to elicit donations to fraudulent charities.
  • Criminals use generative AI tools to create images used in market manipulation schemes.
  • Criminals use generative AI tools to create pornographic photos of a victim to demand payment in sextortion schemes.

AI-Generated Audio, aka Vocal Cloning

Criminals can use AI-generated audio to impersonate well-known public figures or personal relations to elicit payments.

  • Criminals generate short audio clips containing a loved one’s voice to impersonate a close relative in a crisis situation, asking for immediate financial assistance or demanding a ransom.
  • Criminals use AI-generated audio clips of an individual’s voice to impersonate that person and gain access to their bank accounts.

AI-Generated Videos

Criminals use AI-generated videos to create believable depictions of public figures to bolster their fraud schemes.

  • Criminals generate videos for real-time video chats with alleged company executives, law enforcement, or other authority figures.
  • Criminals create videos for private communications to “prove” the online contact is a “real person.”
  • Criminals use generative AI tools to create videos for fictitious or misleading promotional materials for investment fraud schemes.

Tips to Protect Yourself

  • Create a secret word or phrase with your family to verify their identity.
  • Look for subtle imperfections in images and videos, such as distorted hands or feet, unrealistic teeth or eyes, indistinct or irregular faces, unrealistic accessories such as glasses or jewelry, inaccurate shadows, watermarks, lag time, audio that does not match lip movements, and unrealistic movements (one additional, code-based check of an image file is sketched after this list).
  • Listen closely to the tone and word choice to distinguish between a legitimate phone call from a loved one and an AI-generated voice clone.
  • If possible, limit the amount of your image or voice content available online, make social media accounts private, and limit followers to people you know to minimize fraudsters’ ability to use generative AI software to create fraudulent identities for social engineering.
  • Verify the identity of a person calling you by hanging up the phone, researching the contact information of the bank or organization purporting to call you, and calling that phone number directly.
  • Never share sensitive information with people you have met only online or over the phone.
  • Do not send money, gift cards, cryptocurrency, or other assets to people you do not know or have met only online or over the phone.
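Relating to the tip above about scrutinizing images, the following is a minimal sketch, assuming the Python Pillow imaging library and a hypothetical file name, of one weak clue you can check alongside the visual signs: whether an image file carries any camera metadata at all. Many AI-generated images carry none, but real photos can also have their metadata stripped, and metadata can be forged, so treat this as one clue among many, never as proof.

```python
# A weak, illustrative heuristic only. Pillow is an example library choice,
# and "profile_photo.jpg" is a hypothetical file name.
from PIL import Image, ExifTags

def describe_exif(path: str) -> None:
    """Print any EXIF (camera) metadata found in an image file."""
    exif = Image.open(path).getexif()
    if not exif:
        print("No EXIF metadata found; common for AI-generated or re-saved images.")
        return
    for tag_id, value in exif.items():
        # Translate numeric EXIF tag IDs into readable names where known.
        name = ExifTags.TAGS.get(tag_id, tag_id)
        print(f"{name}: {value}")

describe_exif("profile_photo.jpg")
```

Run it against a saved copy of a suspicious profile photo and weigh what it reports together with the visual checks above; neither the absence nor the presence of metadata settles the question on its own.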
If you believe you have been a victim of a financial fraud scheme, please file a report with the FBI’s Internet Crime Complaint Center at www.ic3.gov. If possible, include the following:

  • Identifying information about the individuals, including name, phone number, address, and email address.
  • Financial transaction information such as the date, type of payment, amount, account numbers involved, the name and address of the receiving financial institution, and receiving cryptocurrency addresses.
  • Describe your interaction with the individual, including how contact was initiated, such as the type of communication, the purpose of the request for money, how you were told or instructed to make payment, what information you provided to the scammer, and any other details pertinent to your complaint.

Definitions

[1] Synthetic content refers to the class of media generated or manipulated by machine-learning-based techniques.

[2] Social engineering is the act of obtaining or attempting to obtain otherwise secure data by conning an individual into revealing secure information.

[3] Spear phishing is a directed attempt to trick a specific user or group of users into clicking on a malicious link or opening an attachment in an email that incorporates information intended to increase the chance of success.

[4] Non-delivery scams occur when payment is sent, and goods or services are never received, or are of lesser quality.

-30-

Important Information for New Scam Victims

If you are looking for local trauma counselors, please visit counseling.AgainstScams.org or join SCARS for our counseling/therapy benefit: membership.AgainstScams.org

If you need to speak with someone now, you can dial 988 or find phone numbers for crisis hotlines all around the world here: www.opencounseling.com/suicide-hotlines

A Note About Labeling!

We often use the term ‘scam victim’ in our articles, but this is a convenience to help those searching for information in search engines like Google; it has no deeper meaning. If you have come through such an experience, YOU are a Survivor! It was not your fault. You are not alone! Axios!

A Question of Trust

At the SCARS Institute, we invite you to do your own research on the topics we speak about and publish. Our team investigates each subject we discuss, especially when it comes to understanding the scam victim-survivor experience. You can do Google searches, but in many cases you will have to wade through scientific papers and studies. Remember that biases and perspectives matter and influence the outcome. Regardless, we encourage you to explore these topics as thoroughly as you can for your own awareness.

Statement About Victim Blaming

Some of our articles discuss various aspects of victims. This is about better understanding victims (the science of victimology), their behaviors, and their psychology. It helps us educate victims/survivors about why these crimes happened so they do not blame themselves, develop better recovery programs, and help victims avoid scams in the future. At times this may sound like blaming the victim, but it does not; we are simply explaining the hows and whys of the experience victims have.

These articles, about the Psychology of Scams or Victim Psychology (meaning that all humans have psychological or cognitive characteristics in common that can either be exploited or work against us), help us all to understand the unique challenges victims face before, during, and after scams, fraud, or cybercrimes. They sometimes describe some of the vulnerabilities the scammers exploit. Victims rarely have control of these vulnerabilities, or are even aware of them, until something like a scam happens; only then can they learn how their mind works and how to overcome these mechanisms.

Articles like these help victims and others understand these processes, avoid being exploited again, and recover more easily by understanding their post-scam behaviors. Learn more about the Psychology of Scams at www.ScamPsychology.org

Psychology Disclaimer:

All articles about psychology and the human brain on this website are for information & education only

The information provided in this article is intended for educational and self-help purposes only and should not be construed as a substitute for professional therapy or counseling.

While any self-help techniques outlined herein may be beneficial for scam victims seeking to recover from their experience and move towards recovery, it is important to consult with a qualified mental health professional before initiating any course of action. Each individual’s experience and needs are unique, and what works for one person may not be suitable for another.

Additionally, any approach may not be appropriate for individuals with certain pre-existing mental health conditions or trauma histories. It is advisable to seek guidance from a licensed therapist or counselor who can provide personalized support, guidance, and treatment tailored to your specific needs.

If you are experiencing significant distress or emotional difficulties related to a scam or other traumatic event, please consult your doctor or mental health provider for appropriate care and support.

Also read our SCARS Institute Statement about Professional Care for Scam Victims on our ScamsNOW.com website.

If you are in crisis, feeling desperate, or in despair please call 988 or your local crisis hotline.