“You Become Hostage to Their Worldview”: The Murky World of Moderation on Clubhouse, a Playground for the Elite

The invite-only app, which has drawn attention for anti-Semitic speech and harassment issues, says it’s focused on making users feel safe. But due to its very nature, it has become a haven for the powerful to flirt with misogyny and racism.
Paul Davison in San Francisco, 2012. By Peter DaSilva/The New York Times/Redux. 

In late October, freelance journalist Wanna Thompson created a chat room on Clubhouse, a buzzy invite-only app. Users—often celebrities, venture capitalists, and the occasional journalist—can create or enter audio-only chat rooms that can be wiped clean after a conversation ends, and choose who to follow. Thompson wanted to start a conversation about Clubhouse as a platform and the algorithms that make it work. She called her room “Clubhouse isn’t our friend, it’s an app.”

The discussion turned to the fact that the platform had admitted Tory Lanez, a Canadian rapper who has pleaded not guilty to charges that he shot Megan Thee Stallion in July. Lanez had been nominated to join Clubhouse by fellow rapper Tyga. In Thompson’s room, one user took the stage—meaning requested and was granted speaking time—to share Tyga’s explanation for adding Lanez, stressing that users should wait until the court case is resolved to pass judgment.

The shooting, and the public reaction to it, had become part of a larger dialogue about the mistreatment of Black women in America. Thompson knew there was a chance that members of her room, especially Black women, might be disturbed by the apparent defense of a Black man taking the side of another Black man accused of shooting a Black woman. She quickly removed the user from the stage and blocked him, despite his repeated requests to retake the virtual mic. “When I hold conversations, I always position Black women as the priority,” she said. “If there’s anyone in the audience I felt like may get triggered or had to listen to that ignorance and didn’t go into the room for that, I had to make sure I removed him and also blocked him from ever accessing any space that I curate online.”

As Clubhouse continues to grow in its beta stage, reportedly raising at least $10 million from Marc Andreessen’s venture capital firm, Andreessen Horowitz, and attracting a ballooning audience of the blue-check variety, the app has landed squarely in the middle of the debate around moderation—a topic that has become more urgent in light of renewed scrutiny on racism in the tech world, and one that’s even muddier in an audio-only format. In the bubble that is Clubhouse, pseudo-intellectual monologues from powerful users can go unchecked, leaving them free to promote racist ideas under the guise of posing legitimate questions or playing devil’s advocate. It’s the type of dialogue that wouldn’t necessarily be flagged on Twitter or Facebook either, but that seems especially common on an app with relatively less scrutiny and relatively more big-name users, who may feel comfortable airing views that would likely get ratioed elsewhere. One anonymous Clubhouse user who works in tech recalled listening in on discussions where music-industry insiders surmised that Megan Thee Stallion was lying about Tory Lanez shooting her, claiming that she was trying to ruin Lanez’s career, and alleging that Black women routinely try to derail the careers of Black men. “The issue that I see with Clubhouse is the people with the largest platforms outside of that app…tend to be the only voices in the conversation,” the user said. “And so you become hostage to their worldview.”

Another user, who asked to remain anonymous for fear of being harassed, shared recordings and screenshots of a discussion that segued into an examination of terrorism in Paris. At one point in the conversation, Pascal-Emmanuel Gobry, a Paris-based writer and fellow at the Ethics & Public Policy Center, said, “We’re not even sure how many Muslims there are in France…We’re not sure how many of those are quote-unquote ‘Islamist,’ meaning people who would support sharia law…And of course, being an Islamist doesn’t necessarily mean you support terrorism, there’s basically a spectrum, right? But the number of people who are problematic is quite large.” Gobry went on to ask how to deal with “potentially millions” of Muslim citizens in France who “fundamentally reject everything that the nation stands for and fundamentally want to either destroy it or replace it with a sharia law regime.” (When Vanity Fair contacted Gobry to ask about his participation in the conversation, he replied “no comment.”)

In response, a user who the source identified as a prominent Silicon Valley executive asked, “Is part of the problem that you have a religion that is co-traveling with a totalizing philosophy that has not been seen in other idioms at this extent and at this level?…We don’t have language for pulling apart these things. And what I find is that people have some way of making excuses, so they don’t have to become Sam Harris–like in trying to go after these things and then get pilloried for it.”

The exchange, part of which seemed to falsely suggest a concrete link between Islam and terrorism, struck at the heart of the debate over what sort of speech should be policed on tech platforms, and how to do it. And it showcased how the app allows elite users to speak in a kind of vacuum. For instance, last week, Tom Hanks’s son Chet Hanks joined a Clubhouse discussion in which he reportedly defended his choice to speak patois, comparing it to faking an English accent. (“English people were not oppressed,” one user is said to have pointed out.)

Unlike other tech platforms, which have begun to warn users about information that’s false or misleading, Clubhouse does not provide visible disclaimers to counter misinformation. There is no comment section, “like” button, or other reaction buttons for audience members to express dissent or share information within the app (though a Clubhouse spokesperson emphasized that “users can report violations [of the Terms of Service or Community Guidelines] in real time directly from the room, which triggers an investigation from the Clubhouse Trust and Safety team”). Instead, they often use Twitter to discuss what’s happening on Clubhouse. The anonymous user who shared the recording of the discussion about terrorism in Paris said the platform could become a tool for the spread of false information. “The things that they’re saying make no sense,” the user said. “So it’s just like misinformation, misinformation, misinformation, and then they have hundreds of people who believe them that are sitting there and listening in.”

Clubhouse cofounders Paul Davison and Rohan Seth declined to be interviewed, but sent the following statement via a spokesperson: “We believe in the unique power of voice to build empathy, and see Clubhouse as a place where people with different perspectives, backgrounds and lived experiences broaden their understanding and evolve their worldviews. The company unequivocally condemns all forms of racism, hate speech, and abuse, as noted in our Community Guidelines and Terms of Service, and has trust and safety procedures in place to investigate and address any violation of these rules.”

In theory, Clubhouse is aware of its shortcomings around moderation, which have drawn attention in the past. According to people I spoke with, its founders have made gestures toward regulating the app. Tracy Chou, CEO of Block Party, an app that blocks online harassment, said Seth contacted her earlier this year to discuss anti-harassment and moderation best practices for Clubhouse. Chou said she and Seth, both Stanford University alumni, spent lengthy phone conversations talking about the complexity of moderation. She said she explained that moderation goes beyond adding blocking and other technical features; it also requires setting community norms and thinking through the platform’s policies. Though Seth seemed enthused about users taking on moderation roles on their own, Chou said she stressed the need for paid moderators who know what they’re doing.

Chou said she came away from these conversations feeling that Seth had been really listening. Seth asked Chou to recommend additional people he could speak to, which she says she did. She also nudged him to pay others for their insights given Clubhouse’s fundraising success. Chou said she wanted a formal agreement in place to bring her on as an adviser to the platform. Initially, she says, Seth said he would send over the agreement, but the paperwork never came. “[I gave him] my time because I’m really hoping that [they] won’t screw this up,” Chou said. (Through a spokesperson, Clubhouse did not directly comment on Chou’s conversation with Seth. The spokesperson said the company “values the perspective of experts, operators and academics in the fields of trust, safety and moderation,” and has brought on such experts “as formal advisers, consultants, and investors.”)

Echoing Chou’s concerns, Ellen Pao, the former interim CEO of Reddit and cofounder of Project Include, which is geared toward making tech companies more inclusive, also said the platform should provide moderation training for users and implement cultural norms that allow people to have fruitful conversations. “You’ve gotta set up guardrails,” Pao said. “If you’re expecting unpaid users to moderate, you need to give them some tools. You need to make it easy for them. You need to let them be successful.”

A spokesperson for Clubhouse said users are given “tools and training” to moderate conversations and resources such as the Moderators’ Club, where they can discuss moderation best practices; rules within the Community Guidelines about best practices; and weekly new user onboarding sessions with the founders. The spokesperson said Clubhouse plans to prioritize and offer new moderation resources as the company grows.

The anonymous user who shared the recording of the discussion about terrorism in Paris also raised concerns about other powerful users spreading anti-Semitism and racism through explicit and implicit rhetoric. (In September, Clubhouse came under fire for a conversation that employed anti-Semitic stereotypes.) The user recalled an incident in which a Clubhouse member entered a room and yelled, “Fuck Jews.” According to the anonymous source, the same member entered another room that same night and said, “People think I’m anti-Semitic, but the Jews created the slave trade.” The user said that the site took action against him after others complained.

Because of the exclusive nature of Clubhouse, who’s invited to use it has also become a hot-button issue. It’s not only Tory Lanez’s presence that’s divisive; the anonymous user who sat in on the Megan Thee Stallion conversation also notified Clubhouse in October that Russell Simmons’s participation in chats might make users feel less safe. (Simmons has recently been accused of rape, sexual assault, or sexual harassment by more than a dozen women. He has vehemently denied the allegations. Drew Dixon, one of his accusers, is also a Clubhouse member.) The complicated question of who should have access to a social platform—and what message the presence of someone like Simmons sends about the app’s culture—is one that Clubhouse still seems to be trying to wrap its head around. The user shared the company’s response, which came a day after she reached out, with Vanity Fair. “Thank you for your message, and for sharing your concern,” it reads. “We’ve received feedback from several members of the community about this, and are actively working on next steps to keep the platform safe for everyone, including women and all others who were trouble [sic] by this experience. We welcome any specific suggestions you may have, and thank you again.”

The user said she was enraged by the response because it referenced multiple complaints from other users, and because it did not explain how Clubhouse might handle the situation. “The other thing I didn’t like about this is, ‘We welcome any specific suggestions you may have,’” she said. “I’m traumatized. Why do I then need to do the further labor of improving your platform?” (When asked what steps were taken to address concerns around Simmons joining the platform, a spokesperson said Clubhouse does not monitor who each person invites and that anyone can join with an invitation from a current user.)

After experiencing harassment elsewhere, especially on Twitter, Thompson remains pessimistic about whether social apps can put a system in place that takes the many nuances of moderation into account. For now, she said she’s exiting rooms that give off bad vibes and hopes that Clubhouse will offer moderation resources, but added that users should educate themselves on hosting discussions before opening a room. “As a Black woman, I don’t know if I’m confident that one day we will have an app where [people are] not allowed on it due to the harm they’ve caused,” Thompson said. “For now, I’m making sure that I am able to curate my space and who I follow.”
