Facebook Staff Say Core Products Make Misinformation Worse

Fictional account of conservative woman devolved into a ‘quite troubling, polarizing state in an extremely short amount of time’

Photographer: Damien Meyer/AFP/Getty Images


For years, Facebook has fought back against allegations that its platforms play an outsized role in the spread of false information and harmful content that has fueled conspiracies, political divisions and distrust in science, including Covid-19 vaccines.

But research, analysis and commentary contained in a vast trove of internal documents indicate that the company’s own employees have studied and debated the issue of misinformation and harmful content at length, and many of them have reached the same conclusion: Facebook’s own products and policies make the problem worse.

In 2019, for instance, Facebook created a fake account for a fictional 41-year-old North Carolina mom named Carol, who follows Donald Trump and Fox News, to study misinformation and polarization risks in its recommendation systems. Within a day, the account was directed to “polarizing” content, and within a week, to conspiracies including QAnon.