Taylor Swift deepfake pornography sparks renewed calls for US legislation

Deepfake pornographic images of Taylor Swift have triggered renewed calls for criminalization, with US politicians advocating federal law changes. 

Explicit deepfake images of Swift, seen by millions on social media, prompted Democratic congresswoman Yvette D Clarke to highlight how pervasively women are targeted by such imagery without their consent. 

While some US states have specific legislation, Democratic congressman Joseph Morelle proposed the Preventing Deepfakes of Intimate Images Act in May 2023 to criminalize the nonconsensual sharing of deepfake pornography at the federal level. 

Morelle emphasized the emotional, financial, and reputational harm caused by such content, describing it as "sexual exploitation." 

Republican congressman Tom Kean Jr joined the cause, co-sponsoring Morelle's bill and introducing the AI Labeling Act to mandate labeling for all AI-generated content. 

Swift has not publicly addressed the deepfake images, and her US publicist has not commented. 

Deepfake technology, which predominantly targets women, has grown markedly more sophisticated since 2019, with AI now able to generate highly convincing nonconsensual pornographic material. 

The UK government criminalized the sharing of nonconsensual deepfake pornography in December 2022, emphasizing the need to protect women and girls from such abuse. 
