SCARS|RSN™ Guide: Deepfake
The History & Definition Of Deepfake
What Is Deepfake?
Deepfake, a portmanteau of “deep learning” and “fake”, is an artificial intelligence-based human image synthesis technique. It combines and superimposes existing images and videos onto source images or videos using a machine learning technique called a “generative adversarial network” (GAN). The result is a fake video that shows a person or persons performing an action at an event that never occurred in reality.
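At its core, a GAN pits two models against each other: a generator that produces fakes and a discriminator that tries to tell fakes from real data, with each improving by training against the other. The toy sketch below illustrates that adversarial loop in plain Python. It is only a minimal sketch under simplifying assumptions: a one-parameter generator and a linear discriminator operating on plain numbers rather than images, which is nothing like the scale of real deepfake software.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def real_sample():
    # "Real" data: numbers clustered around 3.0 (a stand-in for real images)
    return random.gauss(3.0, 0.1)

g_shift = 0.0          # generator parameter: fake = g_shift + noise
d_w, d_b = 1.0, 0.0    # discriminator: D(x) = sigmoid(d_w * x + d_b)

lr = 0.05
for _ in range(3000):
    real = real_sample()
    fake = g_shift + random.gauss(0.0, 0.1)

    d_real = sigmoid(d_w * real + d_b)
    d_fake = sigmoid(d_w * fake + d_b)

    # Discriminator step: ascend log D(real) + log(1 - D(fake)),
    # i.e. push real samples toward label 1 and fakes toward label 0
    d_w += lr * ((1 - d_real) * real - d_fake * fake)
    d_b += lr * ((1 - d_real) - d_fake)

    # Generator step: ascend log D(fake), i.e. make fakes look real to D
    g_shift += lr * (1 - d_fake) * d_w

# g_shift should have drifted from 0 toward the real mean (3.0)
print(g_shift)
```

The same tug-of-war, scaled up to deep convolutional networks and millions of face images, is what lets a GAN-based tool synthesize frames realistic enough to fool the discriminator and, eventually, human viewers.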
Such fake videos can, for example, show a person performing sex acts they never took part in, or alter a politician’s words or gestures to make it look like that person said something they never did. Because of these capabilities, deepfakes may be used to create fake celebrity pornographic videos or revenge porn. Deepfakes can also be used to create fake news and malicious hoaxes.
Deepfake pornography surfaced on the Internet in 2017, particularly on Reddit, and has since been banned by sites including Reddit, Twitter, and Pornhub. In autumn 2017, an anonymous Reddit user posting under the pseudonym “Deepfakes” uploaded several porn videos. The first to capture attention was the Daisy Ridley deepfake, which became one of the best-known deepfake videos and a prominent feature in several articles. Another was a deepfake simulation of Wonder Woman actress Gal Gadot having sex with her step-brother, while others depicted actresses such as Emma Watson, Katy Perry, Taylor Swift, and Scarlett Johansson. The scenes were not real: they had been created with artificial intelligence, and they were debunked a short time later.
As time went on, the Reddit community ironed out many of the flaws in the faked videos, making it increasingly difficult to distinguish fake content from real. Non-pornographic photographs and videos of the actresses, which are readily available online, were used as training data for the software. The deepfake phenomenon was first reported in December 2017 by Motherboard, the technology and science section of Vice, which led to widespread coverage in other media.
Scarlett Johansson, a frequent subject of deepfake porn, spoke publicly about the phenomenon to The Washington Post in December 2018. In a prepared statement, she expressed concern, describing the internet as a “vast wormhole of darkness that eats itself.” She also stated that she will not attempt to remove any of her deepfakes, believing that they do not affect her public image and that differing laws across countries and the nature of internet culture make any removal attempt “a lost cause.” While celebrities like herself are protected by their fame, she said, deepfakes pose a grave threat to women of lesser prominence, whose reputations could be damaged by depiction in involuntary pornography or revenge porn.
In the United Kingdom, producers of deepfake material can be prosecuted for harassment, but there are calls to make deepfakes a specific crime; in the United States, where charges as varied as identity theft, cyberstalking, and revenge porn have been pursued, the notion of a more comprehensive statute has also been explored.
Deepfake In Politics
Deepfakes have been used to misrepresent well-known politicians on video portals or chatrooms. For example, the face of the Argentine President Mauricio Macri was replaced by the face of Adolf Hitler, and Angela Merkel’s face was replaced with Donald Trump’s. In April 2018, Jordan Peele and Jonah Peretti created a deepfake using Barack Obama as a public service announcement about the danger of deepfakes. In January 2019, KCPQ aired a deepfake of Trump during his Oval Office address, mocking his appearance and skin color.
In January 2018, a desktop application called FakeApp was launched. The app lets users easily create and share videos with swapped faces. It uses an artificial neural network, the power of the graphics processor, and three to four gigabytes of storage space to generate the fake video. To produce convincing results, the program needs a large amount of visual material of the person to be inserted, so that its deep learning algorithm can learn, from the video sequences and images, which aspects of the image have to be exchanged.
The software is built on Google’s AI framework TensorFlow, which was previously used for, among other things, the program DeepDream. Celebrities are the main targets of such fake sex videos, but other people are affected as well. In August 2018, researchers at the University of California, Berkeley published a paper introducing a fake-dancing app that can use AI to superimpose an expert dancer’s moves onto anyone’s body, even a child’s.
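Roughly, the face-swap approach behind the original deepfakes code and tools derived from it trains one shared encoder together with a separate decoder per person; the swap falls out of routing a frame through the “wrong” decoder. The toy structural sketch below shows only that routing logic, with plain dictionaries standing in for networks and images (an illustrative assumption, not real image code):

```python
# Shared-encoder / twin-decoder sketch. All "networks" here are toy
# stand-ins; a real implementation would use convolutional autoencoders
# trained on thousands of face images.

def encode(face):
    # Shared encoder: compresses any face down to identity-free features
    # such as pose and expression.
    return {"pose": face["pose"], "expression": face["expression"]}

def make_decoder(identity):
    # Each decoder is trained on only one person, so it reconstructs that
    # person's appearance from whatever features it is handed.
    def decode(features):
        return {"identity": identity, **features}
    return decode

decode_a = make_decoder("person_A")
decode_b = make_decoder("person_B")

# A video frame of person A...
frame_a = {"identity": "person_A", "pose": "left", "expression": "smile"}

# ...encoded with the shared encoder but decoded with person B's decoder:
swapped = decode_b(encode(frame_a))
print(swapped)  # person B's face wearing person A's pose and expression
```

Because the encoder never stores identity, decoding person A’s pose and expression with person B’s decoder yields person B’s face performing person A’s actions, which is exactly the swap the finished video shows.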
Deepfakes In Fiction
The phenomena of deepfakes and fake celebrity pornography are explored in the novel Epiphany Jones by Michael Grothaus. The novel, a literary thriller, social satire, and dark comedy about America’s obsession with sex, celebrity, and the internet, follows a pornography addict’s unwilling relationship with a woman who thinks she can speak to God, and their entanglement with sex traffickers who cater to the Hollywood elite. The protagonist, Jerry Dresden, suffers from an addiction to fake celebrity pornography. Fakes of specific celebrities mentioned in the novel include Natalie Portman, Scarlett Johansson, Halle Berry, Angelina Jolie, Kate Beckinsale, and the Pussycat Dolls, as well as deceased celebrities such as Judy Garland and Audrey Hepburn.
Criticisms Of Deepfake
Abuses and Potential Abuses
The Aargauer Zeitung says that the manipulation of images and videos using artificial intelligence could become a dangerous mass phenomenon. The falsification of images and videos, however, is older than video and image editing software; what is new here is the degree of realism.
Deepfakes can also be used for targeted hoaxes and revenge porn.
Effects On Credibility And Authenticity
Another effect of deepfakes is that viewers can no longer distinguish deliberately manipulated content (e.g. satire) from genuine content. AI researcher Alex Champandard has said that everyone should know how quickly things can be corrupted with this technology, and that the problem is not a technical one but rather one to be solved by trust in information and journalism. The primary pitfall is that humanity could fall into an age in which it can no longer be determined whether depicted media corresponds to the truth.
Internet Reaction To Deepfake
Some websites, such as Twitter and Gfycat, announced that they would delete deepfake content and block its publishers. Previously, the chat platform Discord had blocked a chat channel containing fake celebrity porn videos. The pornography website Pornhub also plans to block such content; however, it has been reported that the site has not been enforcing its ban. On Reddit, the situation initially remained unclear until the subreddit was suspended on February 7, 2018, for violating the site’s policy against “involuntary pornography.” In September 2018, Google added “involuntary synthetic pornographic imagery” to its ban list, allowing anyone to request the blocking of search results showing their real or fake nudes.
The Effect Of Deepfake On Romance Scams
The impact of using deepfake videos in the deception associated with romance scams and other types of socially engineered cyberfraud is expected to be massive. These videos will not only enable a whole new level of stolen identity and impersonation, but will also allow scammers to convince victims of virtually any false story.
They are expected to allow for recorded messages, prerecorded chats, and a whole range of uses in creating fake social media identities.
Because of the apparent authenticity of deepfake video content, almost no consumer will be able to tell the difference between a deepfake and a real video.
Deepfake videos will likely increase the number of new victims because those who would normally not be susceptible to scams will now be.
- “This new technology could send American politics into a tailspin” »
- “Terrifying high-tech porn: Creepy ‘deepfake’ videos are on the rise” »
- “You thought fake news was bad? Deep fakes are where truth goes to die” »
- “What Are Deepfakes & Why the Future of Porn is Terrifying” »
- “Experts fear face swapping tech could start an international showdown” »
What are your thoughts on deepfakes? Please leave your comments below.
A SCARS Division
Miami Florida U.S.A.
TAGS: Deepfake, Deep fake, AI Created Video, Face Swapping Tech, Impersonation Videos, Identity Stealing Videos, AI Produced Videos, artificial intelligence-based human image synthesis technique, generative adversarial network, videos using artificial intelligence,
– – – – – – – – – – – – – – END – – – – – – – – – – – – – –
– – – – – – – – – – – – – – – – – – – – –
Tell us about your experiences with Romance Scammers in our Scams Discussion Forum on Facebook »
– – – – – – – – – – – – – – – – – – – – –
FAQ: How Do You Properly Report Scammers?
It is essential that law enforcement knows about scams & scammers, even though there is nothing (in most cases) that they can do.
Always report scams involving money lost or where you received money to:
- Local Police – ask them to take an “informational” police report – say you need it for your insurance
- Your National Police or FBI (www.IC3.gov »)
- The SCARS|CDN™ Cybercriminal Data Network – Worldwide Reporting Network HERE » or on www.Anyscam.com »
This helps your government understand the problem and allows law enforcement to add scammers to watch lists worldwide.
– – – – – – – – – – – – – – – – – – – – –
Visit our NEW Main SCARS|RSN Facebook page for much more information about scams and online crime: www.facebook.com/RSN.Main.News.And.Inromation.Home.Page »
To learn more about SCARS visit www.AgainstScams.org
Please be sure to report all scammers HERE » or on www.Anyscam.com »
All original content is Copyright © 1991 – 2019 SCARS All Rights Reserved Worldwide & Webwide – RSN/Romance Scams Now & SCARS/Society of Citizens Against Relationship Scams are all trademarks of Society of Citizens Against Relationship Scams Incorporated (formerly the Society of Citizens Against Romance Scams)
Third-party copyrights acknowledged.
SCARS|RSN, RSN, Romance Scams Now, SCARS|WORLDWIDE, SCARS|GLOBAL, SCARS, Society of Citizens Against Relationship Scams, Society of Citizens Against Romance Scams, SCARS|ANYSCAM, Project Anyscam, Anyscam, SCARS|GOFCH, GOFCH, SCARS|CHINA, SCARS|CDN, SCARS|UK, SCARS Cybercriminal Data Network, Cobalt Alert, Scam Victims Support Group, are all trademarks of Society of Citizens Against Relationship Scams Incorporated.
Contact the law firm for the Society of Citizens Against Relationship Scams Incorporated by email at legal@AgainstScams.org