As Musk is learning, content moderation is a messy job

SeattlePI.com

Now that he’s back on Twitter, neo-Nazi Andrew Anglin wants somebody to explain the rules.

Anglin, the founder of an infamous neo-Nazi website, was reinstated Thursday, one of many previously banned users to benefit from an amnesty granted by Twitter's new owner Elon Musk. The next day, Musk banished Ye, the rapper formerly known as Kanye West, after he posted a swastika with a Star of David in it.

“That’s cool,” Anglin tweeted Friday. “I mean, whatever the rules are, people will follow them. We just need to know what the rules are.”

Ask Musk. Since the world’s richest man paid $44 billion for Twitter, the platform has struggled to define its rules for misinformation and hate speech, issued contradictory announcements, and failed to fully address what researchers say is a troubling rise in hate speech.

As the “chief twit” may be learning, running a global platform with nearly 240 million active daily users requires more than good algorithms and often demands imperfect solutions to messy situations — tough choices that must ultimately be made by a human and are sure to displease someone.

A self-described free speech absolutist, Musk has said he wants to make Twitter a global digital town square. But he also said he wouldn't make major decisions about content or about restoring banned accounts before setting up a “content moderation council” with diverse viewpoints.

He soon changed his mind after polling users on Twitter, and offered reinstatement to a long list of formerly banned users including former President Donald Trump, Ye, the satire site The Babylon Bee, the comedian Kathy Griffin and Anglin, the neo-Nazi.

And while Musk's own tweets suggested he would allow all legal content on the platform, Ye's banishment shows that's not entirely the case....
