In a surprising move, X, the Elon Musk-owned social network formerly known as Twitter, took steps to address a disturbing trend on its platform.
Effective last Saturday, users searching for "Taylor Swift" on X were met with an unusual error message that read, "Something went wrong. Try reloading."
This measure, as clarified by X's head of business operations, Joe Benarroch, is temporary and was taken out of an abundance of caution to prioritize safety following a surge of fake graphic images of Taylor Swift.
The error comes in the aftermath of sexually explicit AI-generated images of Swift circulating widely across X and various other internet platforms.
The photos, labeled as disturbing and deeply concerning by SAG-AFTRA, prompted calls for legislative action against the creation and dissemination of such explicit deepfakes without consent.
A dismayed Microsoft CEO, Satya Nadella, voiced his concern about the fake images, calling for action.
"First of all, absolutely this is alarming and terrible, and so therefore yes, we have to act, and quite frankly all of us in the tech platform, irrespective of what your standing on any particular issue is — I think we all benefit when the online world is a safe world," Nadella said.
Even the White House has expressed concern, with Press Secretary Karine Jean-Pierre acknowledging the need for legislation to address the circulation of non-consensual AI-generated explicit content.
The controversy began when sexually explicit deepfakes of Taylor Swift went viral on X on January 24, amassing over 27 million views within 19 hours.
The account responsible for initially posting the images was swiftly suspended, but the incident ignited a broader conversation about the ethical implications of AI-generated explicit content.
X's Response and Safety Measures
In response to the escalating situation, X's Safety team released a statement on January 25, assuring users that the company is actively removing all identified images of nonconsensual nudity.
"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them," the statement read.
The deepfakes were widely condemned as harmful and upsetting, with SAG-AFTRA calling for the criminalization of AI-generated explicit content made without consent.
As the controversy unfolded, X faced scrutiny over its role in allowing the dissemination of explicit deepfakes.
The platform's decision to block searches for "Taylor Swift" can be seen as a step towards addressing these concerns.
However, questions remain about the broader responsibility of online platforms in preventing the spread of harmful content, particularly AI-generated explicit material.
The Road Ahead
X's decision to temporarily block searches for "Taylor Swift" is a strategic move aimed at containing the negative impact of the recent explicit deepfake incident.
Searches for "Taylor Swift" were restored on Monday, after the X Safety Team spent the weekend addressing the issue and blocking accounts associated with the deepfakes.
Benarroch announced that the platform has re-enabled the ability to search for Taylor Swift, adding that X will remain vigilant against any further attempts to spread inappropriate content related to her and will remove such content immediately upon detection.