Sexually explicit images of Taylor Swift generated by artificial intelligence emerged from a Telegram group before being shared millions of times around the world, analysts believe.

Swift, 34, was said to be deeply distressed by the images, and members of Congress have renewed their calls to criminalize the sharing of pornographic, nonconsensual deepfakes.

The images were first spotted on Wednesday and spread rapidly, receiving 45 million views and 24,000 reposts on X before they were removed 19 hours later, The Verge reported.

On Thursday, tech website 404 Media traced the images to a Telegram group dedicated to making nonconsensual AI-generated sexual images of women.

Members of the group were annoyed at the attention the Swift images were drawing to their work, 404 Media reported.

‘I don’t know if I should feel flattered or upset that some of these twitter stolen pics are my gen,’ one user in the Telegram group said, according to the site.

Another complained: ‘Which one of you mfs is grabbing s*** here and throwing it on Twitter?’

A third replied: ‘Well if there was any way to get this s*** shut down and raided it’s idiots like that.’

The images were not classic ‘deepfakes’, in which Swift’s face would have been superimposed onto someone else’s body, 404 Media reported.

Instead, they were created entirely by AI, with members of the group recommending Microsoft’s AI image generator, Designer.

Microsoft does not permit users to generate an image of a real person by entering a prompt such as ‘Taylor Swift’.