Facebook reportedly knew its algorithms promoted extremist groups, but did nothing

The Next Web

Facebook has long struggled to control extremist content on its platform, from the 2016 US elections, when Russian operatives used polarizing ads to manipulate American voters, to the propaganda that spread through the social network and fueled violence in Myanmar. A new Wall Street Journal report by Jeff Horwitz and Deepa Seetharaman suggests that Facebook knew its algorithm was dividing people but did very little to address the problem. The report notes that an internal company presentation from 2018 illustrated how Facebook’s algorithm aggravated polarizing behavior in some cases. A slide from that presentation said if these algorithms are…

This story continues at The Next Web
