Those explicit AI-generated Taylor Swift images, unsurprisingly, came from 4Chan.
Bloomberg reports that users on the forum have been competing to bypass the keyword filters that AI image-generation tools use to block sexually explicit material. The images spread from 4chan to Telegram to X, where they went viral.
Research from social network analysis company Graphika found that Swift isn't even the community's most frequently targeted victim; its targets range from global celebrities to schoolchildren.