The social network and the heap of sand

Why did Facebook ban QAnon now?

I.

Today let’s talk about Facebook’s post-election ban on election ads, its pre-election ban on QAnon accounts, and why platforms make the decision to get rid of things.

There’s an ancient problem in philosophy called the paradox of the heap. Imagine a big pile of sand — a million grains of it, let’s say. Then remove a grain or two from the pile, then a few hundred, and then a few thousand. Despite your work, most of us would say we are still looking at a heap of sand.

If you kept removing sand grains forever, at some point the heap would cease to be. But when, exactly? A single grain of sand isn’t a heap. Neither are 10 grains, and 100 probably doesn’t count either. So how do you define a “heap” of sand? The answer is you can’t: the paradox of the heap is one of the unsolved problems in philosophy.

Content moderation offers us a kind of paradox of the heap, but in reverse. One bad post on a big social network is simply a bad post, and can be removed without much thought or consequence. The same with 10 bad posts, or 100. Around 1,000, a social network might have to issue some guidance to its moderators.

But that’s assuming the social network recognized the posts as being connected in the first place — as forming a heap. In content moderation, heaps can often accumulate for years before the network takes notice. The paradox comes in determining when it should intervene at the level of policy — after the posts become a problem worthy of concern, but before they amount to a heap.

My favorite reverse paradox of the heap is the humble Tide Pod. I wrote this about the laundry detergent capsule in August, on the occasion of Facebook taking its first steps to ban QAnon:

Before the Tide Pods challenge was a public health crisis, it was a joke. The laundry detergent capsules, which were originally released in 2012, evolved over time to look strangely delicious: lush green and blue gels, swirled around one another attractively, all but daring you to eat them. This led to many jokes about Tide Pods maybe secretly being candy, and it might have stopped there — but then people actually started eating them. The “Tide Pods challenge” surged on social networks in 2018, and ultimately more than 10,000 children were reported to have been exposed to whatever extremely inedible substance is actually inside Tide Pods. Among teenagers who were affected, more than a quarter of cases were intentional, the Washington Post reported at the time.

Eventually platforms banned the Tide Pods challenge, and the mania around their consumption subsided. But the story posed questions to platforms like Facebook, YouTube, and Twitter that they have struggled to answer ever since. When do you start to take a joke seriously? When does dumb talk cross the line into something dangerous? When does a lunatic conspiracy theory cross the line from a shitpost into a potential incitement to violence?

This week, Facebook gave us two different ways of thinking about answers to those questions.

II.

On Thursday, Facebook announced several steps it is taking to promote the integrity of the US presidential election. It will prohibit political and issue-based advertising for an indefinite period after the polls close, which the company said would likely last for about a week. It will also place notifications at the top of the News Feed announcing that no winner has been decided until the race is called by mainstream news outlets. And Facebook will continue to label posts that discuss the legitimacy of the election or voting methods, such as mail-in ballots.

The official rationale for these moves is that tabulating ballots this year “may take longer than previous elections due to the pandemic and more people voting by mail.” The unstated rationale is that the president of the United States has made misinformation about mail-in voting a centerpiece of his campaign, has repeatedly declined to say he will accept a losing result, and will not agree to a peaceful transfer of power should Biden win. (During last night’s vice presidential debate, Mike Pence also declined to commit to a peaceful transfer of power.)

With most moderation problems, you’re working to fix something after it has already broken. In the event of the 2020 election, though, the relevant actors have given us advance notice of their intentions. We more or less know that, should Trump lose, he and his supporters will declare fraud, and insist that no matter what the vote count says, he is the true winner. We know that giant pools of dark money are standing by to flood available channels with advertising insisting that the losers won, and warning that a coup is underway. We know that the likely result of these actions will be chaos and violence.

This is terrible from the standpoint of democracy. But it’s helpful for people working on platform integrity. In a few weeks, the country will face one of the most difficult periods in its history. But our adversaries have already given us their battle plan. That has allowed Facebook to identify the most obvious routes of attack, and take action.

If it works, Facebook will have avoided the problem of the heap.

III.

But what about QAnon?

In August, Facebook took down hundreds of pages, groups, and ads related to the far-ranging conspiracy movement. Then on Tuesday, it went much, much further — banning not just QAnon accounts that discuss violence, but all accounts “representing” QAnon across its family of apps.

QAnon is known for its shapeshifting abilities, and Facebook has warned it will take weeks to root out both existing accounts and new ones that pop up in various modified forms.

But the removals are happening. And to me, the big questions are: why, and why now?

As to why, Facebook now considers QAnon a “militarized social movement.” The number of movement adherents linked to violence has increased significantly over the past few years. And in May 2019, the FBI warned that conspiracy movements including QAnon represent a domestic terrorism threat.

When it started as a series of cryptic messages on 4Chan, QAnon hardly seemed any more dangerous than those early Tide Pod jokes. But the movement metastasized quickly.

If QAnon has been a domestic terrorism threat for nearly a year and a half, though, why wait to ban it until now? “Our timing was really based on the work we do to understand and combat violent threats,” a Facebook spokeswoman told me today when I asked. “This is a standard piece of how we apply our Dangerous Individuals and Organizations policy to any entity.”

Standard, maybe, but unsatisfying. Journalists have traced violent Facebook posts about QAnon back to at least January 2018. At some point, the individual posts added up to a heap. At what point should Facebook have intervened?

I put that question to a bunch of folks, including Brandy Zadrozny, who along with her NBC News colleague Ben Collins has done some of the most important work in translating QAnon for non-believers. Zadrozny told me that the potential dangers of QAnon had been evident from the start: it grew directly out of Pizzagate, an earlier right-wing conspiracy theory that famously culminated in a shooting.

“I think a lot about what would have happened if Facebook had decided to do something about Pizzagate in 2016, after Edgar Welch went into Comet Ping Pong and shot up the place with an AR-15,” Zadrozny said. “If Facebook decided in 2016 that a conspiracy theory — one that puts politicians, public figures, and people eating in pizza shops in real danger — has no place on the platform? Well, then they would have seen QAnon as Pizzagate 2.0 and could have taken immediate action.”

She continued: “QAnon wouldn’t have become the thing it has. It wouldn’t have crept through wellness and health and anti-vaxx groups, wouldn’t have reached moms and bled into SaveTheChildren groups. But they didn’t. So here we are.”

One standard platforms could consider here is to identify moments of political violence with platform ties and then work backward — to understand where the movements behind them are percolating, how they are spreading, and whether more action is warranted.

But Evelyn Douek, who studies online speech regulations and lectures at Harvard Law School, told me that Facebook’s QAnon ban seemed somewhat ill-formed.

“I think it's worth stopping to appreciate what a major step it is to ban a conspiracy entirely, going beyond all its violent manifestations,” she said. “When does a non-violent crazy belief tip over into being worth censoring in all its forms? When will it in future? That's just not an easy question. Is there something they know about what changed lately that we don’t?”

One answer to this might be that once a group commits enough violent acts, any association with it represents a kind of endorsement of that violence, and should be removed. But Douek raises some important questions.

In the meantime, it seems clear to me that on QAnon, Facebook failed the heap test. The signs were there, but the policy was not, and while QAnon eventually popped up on every platform — it had to be banned from Etsy this week — it does seem like Facebook had been a major recruiting ground.

When does a problem become a heap worthy of policy-level attention? “Sooner than is now standard” is one answer I like. Or maybe, looking at how Facebook is preparing for a crisis around the election, the best answer is: proactively.

And even then, as Douek notes, we have to keep our expectations in check. Many problems require platform interventions, but most problems cannot be solved at the platform level.

“Ultimately I doubt this fixes either our QAnon problem or our problems with Facebook,” Douek said. “Bans are a blunt tool to deal with a mental health crisis wrapped in conspiracy theories, and Facebook's actions are still completely unaccountable. How platforms moderate matters.”


Your thoughts on email length?

Since moving to Substack, I’ve learned that an issue I write at my standard length of 2,500 to 3,000 words will get truncated by Gmail and other email clients. Which is to say, you’ll have to tap a button partway through reading to load the rest of the issue. Annoying!

One solution to this problem is to write shorter editions, which Substack seems to incentivize by showing me an angry red button labeled “Post too long for email” whenever I go over the limit. But I’ve always thought the value of this newsletter lies in being comprehensive.

So I wanted to put the question to you: would you rather I aim for shorter editions that fit within Gmail’s cap, or are you willing to tap through? If you wouldn’t mind sending me a quick note with your thoughts, I’d appreciate it. (Paid subscribers can simply use Substack’s commenting feature. Try it out! And if you’re not subscribed yet, you can do so by clicking the button below.)


The Ratio

Today in news that could change public perception of the big tech companies.

🔃 Trending sideways: The website for the (fake) Real Facebook Oversight Board was taken down after a vendor filed a takedown notice over trademark infringement. This led to much anguished shouting and dunking in my Twitter timeline today, but if you name a fake Facebook Oversight Board the “Real Facebook Oversight Board” and buy a domain name to that effect, what the hell else do you expect? (David Gilbert / Vice)


Governing

Following President Trump’s cries to “liberate Michigan” earlier this year, 13 men were charged in an alleged plot to kidnap Michigan Gov. Gretchen Whitmer. According to the (chilling) indictment, the men used multiple encrypted apps to communicate — and also organized in part on Facebook. And at least one of the defendants was active on YouTube. Here are Tom Winter, Michael Kosnar and David K. Li at NBC News:

In a YouTube video from May, Caserta claimed in a 30-minute diatribe that “the enemy is government.” He shot the video in front of an anarchist’s flag and a map of Michigan.

Caserta did not post on YouTube again until three weeks ago. In that video, Caserta does not speak, and simply loads and poses with a long gun off camera while wearing a shirt that says “F--- The Government.”

⭐ It was a big day for removing networks of bad actors on Facebook.

Separately, Twitter announced the takedown of state-linked operations in Cuba, Saudi Arabia, Thailand, Iran, and Russia. One Twitter account pretending to be a dissident member of the Qatari royal family gained more than 1 million followers. (Stanford Internet Observatory)

The Justice Department seized 92 websites it said were used by Iran to spread misinformation. “Four of the web domains — Newsstand7.com, usjournal.net, usjournal.us and twtoday.net — were disguised as genuine news outlets based in the U.S.” (Kartikay Mehrotra / Bloomberg)

A judge ordered Twitter to reveal the identity of a user who posed as an FBI agent and sparked the Seth Rich conspiracy theory. The user allegedly gave forged documents to Fox News. (Bobby Allyn / NPR)

Google’s face-off with Oracle in the Supreme Court appears not to have gone well for the search giant. Justices appeared skeptical of Google’s argument that APIs should not be copyrightable. (Timothy B. Lee / Ars Technica)

Netflix was indicted by a Texas grand jury over the depictions of children in the film Cuties. The film, which Netflix describes as “a social commentary against the sexualization of young children,” won a directing prize at the Sundance Film Festival. Feels like a ripple out of the ongoing QAnon / “save the children” moral panic. (Julia Alexander / The Verge)

Roughly one-fifth of American election jurisdictions have applied for the $250 million in grants that Mark Zuckerberg and Priscilla Chan donated to protect elections. In some jurisdictions, the grants have enabled election officials to increase their budgets by 30 to 40 percent. (Teddy Schleifer / Recode)

With other platforms retreating from political advertising, more candidates are turning to Facebook. Political ad spending accounted for about 3 percent of Facebook’s third-quarter revenue, this piece estimates. (Ari Levy, Salvador Rodriguez, and Megan Graham / CNBC)

A network of right-leaning publishers is regularly publishing links to articles by Russia Today, the Kremlin-backed state media outlet. Among the sites participating in the traffic exchange are the National Review, the Daily Caller, Newsmax, and RealClearPolitics. (Keach Hagey, Emily Glazer and Rob Barry / Wall Street Journal)

A Republican senator came out against democracy. Tweets from Mike Lee of Utah gave voice to a growing sentiment among Republicans that most have been reluctant to acknowledge. (Jonathan Chait / New York)


Industry

Short-form video app Triller is spending massively in an effort to woo TikTok stars to the platform. Taylor Lorenz reports at the New York Times:

Triller has also rented mansions in Los Angeles for top creators to live in. After the TikTok stars Bryce Hall and Blake Gray had their power turned off by the city in August for flouting local guidelines around in-person gatherings, they moved into a Triller house. Last week, nine creators, including Tayler Holder, 23, of the Hype House, moved into another property rented by Triller.

The company pays for housekeeping, weekly Instacart orders, ground transportation, high speed Wi-Fi and production equipment like ring lights. Whatever the talent needs to make content, Triller will get. For one recent video by a creator, the company secured a helicopter.

Microsoft issued a set of principles for app stores designed to make Apple and Google’s policies look draconian by comparison. Among other things, Microsoft allows rival app stores to run on Windows. (Ina Fried / Axios)

At least 60 people quit Coinbase after its CEO announced his intention to remove politics from the workplace. (Brian Armstrong / Medium)

Sandvine is an American company whose technology is used to censor the internet for authoritarian leaders around the world. The company says its products are designed to manage the flow of internet traffic, but current and former employees say it has been used to censor websites in more than a dozen countries. (Ryan Gallagher / Bloomberg)

LinkedIn has evolved into a thriving forum for Black expression. In the wake of George Floyd’s killing, more people have been using it to call out workplace discrimination — as well as sharing memes and having a good time. (Ashanti M. Martin / New York Times)

Trapped inside their homes due to the pandemic, people spent a record $28 billion on apps in the third quarter. That’s up 20 percent from the previous year. (Sarah Perez / TechCrunch)

A popular TikTok artist who goes by Ricky Desktop explains how sounds go viral there:

You need concrete, sonic elements that dancers can visually engage with on a person-by-person basis. I know that sounds super scientific, but that is how I think about it. If you’re trying to make a viral beat, it’s got to correspond with the viral dance.

In order to lock in on that, you need elements of the music to hit. So for example, I have this beat called “The Dice Beat.” I added a flute sound, which in my head was like, “Okay, people will pretend to play the flute.” And then there’s the dice sound, where they’ll roll the dice. It was super calculated. I would create the music with the dance in mind.

Love this. (Jake Kastrenakes / The Verge)


Those good tweets


Talk to me

Send me tips, comments, questions, and Tide Pods: casey@platformer.news.