Bluesky deleted a viral, AI-generated protest video in which Donald Trump sucks on Elon Musk’s toes because its moderators said it was “non-consensual explicit material.” The video was broadcast on televisions inside the offices of the Department of Housing and Urban Development earlier this week, and quickly went viral on Bluesky and Twitter.
Independent journalist Marisa Kabas obtained the video from a government employee and posted it on Bluesky, where it went viral. On Tuesday night, Bluesky moderators deleted the video, saying it was “non-consensual explicit material.”
Other Bluesky users said that versions of the video they uploaded were also deleted, though it is still possible to find the video on the platform.
Technically speaking, the AI video of Trump sucking Musk’s toes, which had the words “LONG LIVE THE REAL KING” shown on top of it, is a nonconsensual AI-generated video, because Trump and Musk did not agree to it. But social media platforms’ content moderation policies have always had carveouts that allow for the criticism of powerful people, especially the world’s richest man and the literal president of the United States.
For example, we once obtained Facebook’s internal rules about sexual content for content moderators, which included broad carveouts to allow for sexual content that criticized public figures and politicians. The First Amendment, which does not apply to social media companies but is relevant considering that Bluesky told Kabas she could not use the platform to “break the law,” has essentially unlimited protection for criticizing public figures in the way this video does.
Content moderation has been one of Bluesky’s growing pains over the last few months. The platform has millions of users but only a few dozen employees, meaning that perfect content moderation is impossible, and a lot of it necessarily needs to be automated. This is going to lead to mistakes. But the video Kabas posted was one of the most popular posts on the platform earlier this week and resulted in a national conversation about the protest. Deleting it—whether accidentally or because its moderation rules are so strict as to not allow for this type of reporting on a protest against the President of the United States—is a problem.
Ah, the rewards of moderation: the best move is not to play. Fuck it is & has always been a better answer. Anarchy of the early internet was better than letting some paternalistic authority decide the right images & words to allow us to see, and decentralization isn’t a bad idea.
Yet the forward-thinking people of today know better and insist that with their brave, new moderation they’ll paternalize better without stopping to acknowledge how horribly broken, arbitrary, & fallible that entire approach is. Instead of learning what we already knew, social media keeps repeating the same dumb mistakes.
You do remember snuff and goatse and csam of the early internet, I hope.
Even with that, it was of course better, because that stuff still floats around, and small groups of enjoyers easily find ways to share it over mainstream platforms.
I’m not even talking about big groups of enjoyers: ISIS (sometimes rebranded), Turkey, Azerbaijan, Israel, Myanmar’s regime, cartels, and everyone else share what they want of the snuff genre, and it stays up long enough.
In text communication their points of view are also less likely to be banned or suppressed than mine.
So yes.
They don’t think so, just use the opportunity to do this stuff in area where immunity against it is not yet established.
There are very few stupid people in positions of power, competition is a bitch.
Sure. Unless you live in a place that has laws and law enforcement. In that case, it’s “fuck it and get burnt down”.
You need some kind of moderation for user generated content, even if it’s only to comply with takedowns related to law (and I’m not talking about DMCA).
I had to hack an ex’s account once to get the revenge porn they posted of me taken down.
There’s a balance at the end of the day.
Illegal content has always been unprotected & subject to removal by the law. Moderation policies wouldn’t necessarily remove porn presumed to be legal, either, so moderation is still a crapshoot.
Still, that sucks.