Illustration by Alex Castro / The Verge
Facebook is an essentially unprecedented piece of technology. Every month, a single platform gives over 2.5 billion people a specially curated selection of advertisements, status updates from friends and family, and automated suggestions for making new connections. Critics often focus — fairly — on that combination of scale and automation, arguing that Facebook’s algorithms promote false news and extremist content.
But a feature story from The New Yorker paints a slightly different picture of Facebook’s problems — one that’s not simply rooted in its design or size. Building on extensive earlier reporting on Facebook’s moderation efforts, author Andrew Marantz talks to former employees about why hate speech and misinformation can spread on...
Go read about how Facebook bends its rules for world leaders
The Verge