AI Deepfakes Becoming Tool of Choice in Cyber Extortion, Says FBI – How is Crypto Being Used in the Process?


The Federal Bureau of Investigation has issued a stark warning about the growing threat of “deepfakes” being used in cyber extortion. 

In a recent report, the FBI said that malicious actors are using deepfakes to manipulate photographs or videos, often obtained from social media accounts or the open internet, and create sexually-themed images that appear authentic.

They then circulate this content on social media or pornographic websites for the purpose of sextortion schemes or to harass the victim.

The FBI mentioned that the improvements in the quality, customizability, and accessibility of artificial intelligence-enabled image generators have further contributed to the growth of deepfakes. 

The bureau said it has received reports from victims, including minors, whose photos or videos were altered to create explicit content that was then publicly circulated.

Many victims were unaware their images had been copied, manipulated, and circulated until someone else brought the content to their attention or they stumbled across it online.

Once the manipulated content is circulated, victims face significant challenges in preventing its continual sharing or removal from the internet.

“Malicious actors have used manipulated photos or videos with the purpose of extorting victims for ransom or to gain compliance for other demands (e.g., sending nude photos),” the FBI said. 

The federal agency recommended that people exercise caution when posting or direct messaging personal photos, videos, and identifying information on social media, dating apps, and other online sites. 

Moreover, people should use discretion when posting images, videos, and personal content online, particularly material that includes children or their information, as it can be captured, manipulated, and distributed by malicious actors without their knowledge or consent.

The FBI also recommends applying privacy settings on social media accounts, running frequent online searches for personal information, using reverse image search engines, exercising caution when accepting friend requests or communicating with unfamiliar individuals, and securing online accounts with complex passwords and multi-factor authentication.

Deepfakes Used to Target Crypto Users

Recently, there have also been instances where deepfakes were used to target unsuspecting crypto users.

For instance, in May, a deepfake of Tesla and Twitter CEO Elon Musk was created to promote a crypto scam. The video contained footage of Musk from past interviews, manipulated to fit the fraudulent scheme.

Scam promoters have long resorted to deepfakes to drum up demand among potential crypto investors. 

Scammers impersonate anyone from influencers and high-profile crypto figures to ordinary people in order to gain victims' trust.

Last year, Miranda, an e-commerce worker in Melbourne who asked not to be identified by her real name because her company had not authorized her to speak publicly, was targeted by such an attack: imposters released a deepfake video of her promoting a crypto scam and published it on her Instagram account.
