Deepfake explicit images of Taylor Swift spread on social media. Her fans are fighting back
A scourge of pornographic deepfake images generated by artificial intelligence, sexualizing people without their consent, has hit its most famous victim: singer Taylor Swift.

The deepfake-detecting group Reality Defender says it tracked a deluge of nonconsensual pornographic material depicting Swift this week, particularly on X. Some images also made their way to Meta-owned Facebook and other social media platforms.

When reached for comment, X directed the AP to a post from its safety account saying the company strictly prohibits the sharing of non-consensual nude images on its platform. Meta says it strongly condemns the content and has worked to remove it.