FBI: artificial intelligence being used for ‘sextortion’ and harassment
WASHINGTON – The Federal Bureau of Investigation has warned Americans that criminals are increasingly using artificial intelligence to create sexually explicit images to intimidate and extort victims.
In an alert circulated this week, the bureau said it had recently observed an uptick in extortion victims saying they had been targeted using doctored versions of innocent images taken from online posts, private messages or video chats.
“The photos are then sent directly to the victims by malicious actors for sextortion or harassment,” the alert said. “Once circulated, victims can face significant challenges in preventing the continual sharing of the manipulated content or removal from the internet.”
The bureau said the images appeared “true-to-life” and that, in some cases, children had been targeted.
The FBI did not go into detail about the program or programs being used to generate the sexual imagery but did note that technological advancements were “continuously improving the quality, customizability, and accessibility of artificial intelligence (AI)-enabled content creation.”
The bureau did not respond to a follow-up message Wednesday seeking details on the phenomenon.
The manipulation of innocent pictures into sexually explicit images is almost as old as photography itself, but the release of open-source AI tools has made the process easier than ever. The results are often indistinguishable from real-life photographs, and several websites and social media channels that specialize in creating and exchanging AI-generated sexual imagery have sprung up in recent years.