Facebook has long struggled with controlling extremist content on its platform.
A slide from an internal presentation warned that, left unchecked, these algorithms would feed users increasingly divisive content: "Our algorithms exploit the human brain's attraction to divisiveness. If left unchecked, Facebook would feed users more and more divisive content in an effort to gain user attention & increase time on the platform."
Earlier this month, Facebook named the members of its Oversight Board, a kind of Supreme Court for the platform that can overrule the social network's decisions on content moderation.
You can read WSJ’s full report on Facebook’s divisive algorithms and its internal studies here.