
FBI warns that criminals can use AI to create fake sexual images for blackmail


The FBI has warned that artificial intelligence (AI) technology has enabled “malicious actors” to take non-consenting people’s personal photos and videos and insert their likenesses into pornographic “deepfake” images in order to harass or “sextort” them.

Children and adults alike have had their images taken from video chats, social media accounts, or the open internet and then incorporated into sexually explicit content with the aid of AI, the FBI wrote in a Monday announcement.

These images are then posted on social media or pornographic websites, or extortionists threaten to send them directly to victims' families or social media contacts unless the victims pay a ransom or agree to other demands, such as supplying personal information or real sexually explicit images of themselves.

Some victims may not realize they appear in these digitally manipulated images until someone else spots them online. Even if victims comply with an extortionist's demands, completely removing the images from the internet can be difficult. The images may be re-shared for years to come, embarrassing and re-victimizing people again and again, which can result in post-traumatic stress disorder (PTSD) and continued financial exploitation.

The FBI suggests that web users, particularly children, exercise caution when posting or direct-messaging personal photos, videos, and identifying information on social media, dating apps, or other online sites, especially when engaging with strangers. Web users may also want to regularly change their passwords, enable multi-factor authentication, and adjust their social media privacy settings to limit other people's access to their personal images and videos.

Those who suspect they have been victimized by these tactics should preserve the images, gather as much information as possible about where the original and deepfaked images came from, and contact the FBI's Internet Crime Complaint Center, their local FBI field office, or the National Center for Missing and Exploited Children (NCMEC).

The NCMEC also offers a free "Take It Down" service that can help victims under the age of 18 who possess the image or video files remove the content or stop its online sharing.
