WASHINGTON (Reuters) – The Federal Bureau of Investigation has warned Americans that criminals are increasingly using artificial intelligence to create sexually explicit images to intimidate and blackmail victims.
In an alert circulated this week, the bureau said it had recently observed an uptick in extortion victims reporting that they were targeted using doctored copies of innocent photos taken from online posts, private messages or video chats.
“The images are then sent directly to the victims by malicious actors for sexual extortion or harassment,” the warning read. “Once circulated, victims can face significant challenges in preventing the continual sharing of the manipulated content or having it removed from the internet.”
The bureau said the images appeared realistic and that in some cases children were targeted.
The FBI did not go into detail about the software or services used to create the sexual images, but noted that technological advances “continually improve the quality, personalization, and accessibility of artificial intelligence (AI)-enabled content creation.”
The bureau did not respond on Wednesday to a follow-up message seeking details on the phenomenon.
The manipulation of innocent photos into explicit sexual images is as old as photography itself, but the release of open-source AI tools has made the process easier than ever. The results are often indistinguishable from real photographs, and several websites and social media channels specializing in the creation and exchange of AI-generated sexual imagery have sprung up in recent years.
Reporting by Raphael Satter; editing by David Gregorio