Fake Explicit Taylor Swift Images Trigger Calls For New Legislation

There are currently no federal laws against the sharing or creation of deep-fake images.

After sexually explicit AI-fabricated images of Taylor Swift were viewed millions of times online, US lawmakers have called for new legislation to criminalize the creation of deepfake images.

The photographs were shared on several social media platforms, including X and Telegram.

US Representative Joe Morelle described the circulation of the photos as "appalling."

In a statement, X stated that it was "actively removing" the photographs and taking "appropriate actions" against the accounts responsible for disseminating them.

The statement continued: "We're closely monitoring the situation to ensure that any further violations are immediately addressed and the content is removed."

Many of the photographs appeared to have been removed by the time of publication, but one image of Swift had been viewed an estimated 47 million times before it was taken down.

The name "Taylor Swift" is no longer searchable on X, nor are keywords like "Taylor Swift AI" and "Taylor AI."

Deepfakes use artificial intelligence (AI) to create images and videos of people by modifying their faces or bodies. A 2023 study found a 550% increase in the creation of doctored images since 2019, driven by the growth of AI.

There are presently no federal laws prohibiting the creation or distribution of deepfake photographs, though some states have taken steps to address the issue.

In 2023, the UK's Online Safety Act made spreading fake pornography illegal.

Representative Morelle, a Democrat, has urged immediate action on the matter. Last year he sponsored the Preventing Deepfakes of Intimate Images Act, which would have made it illegal to disseminate deepfake pornography without consent.

He noted that such images and videos "can cause irrevocable emotional, financial, and reputational harm, and unfortunately, women are disproportionately impacted."

Pornography makes up the vast bulk of deepfakes uploaded online, with women accounting for 99% of those targeted in such content, according to the State of Deepfakes report published last year.

"What's happened to Taylor Swift is nothing new," Democratic Rep. Yvette D. Clarke wrote on X. She stated that women have been targeted by technology for "years" and that "advances in AI make creating deepfakes easier and cheaper."

Swift has not commented publicly on the photographs, but the Daily Mail reports that her team is "considering legal action" against the website that published the AI-generated images.

This article was originally published on the BBC.