You are probably spreading misinformation. Here’s how to stop.

Everyone knows you shouldn’t feed a troll. But more than ever, you should go out of your way not to retweet, share or follow one, either.

First came the pandemic. Now we’re facing an infodemic. Misinformation from so-called trolls, bots and other online agitators is spiking about the death of George Floyd and Black Lives Matter protests, following a tsunami of falsehoods about the coronavirus. And the people who care most intensely about those issues may be inadvertently spreading it further – a hard-learned lesson from social media meddling in the 2016 and 2018 elections.

To avoid being taken advantage of, we need to learn their ways – and learn some new techniques of our own to challenge what we see on Twitter, Facebook, Instagram, WhatsApp, YouTube, Reddit and Nextdoor. Whether you’re 16 or 60, spending a few seconds to do the things I list below can help keep you from becoming a tool in someone else’s information war.

Just in the last week, the hashtag #DCblackout was used to spread false claims that authorities had somehow blocked protesters from communicating on their smartphones. It started with an account that had just three followers. And Twitter took down an account pushing violent rhetoric that claimed to belong to a far-left-leaning national antifa organization but was actually linked to the white nationalist group Identity Evropa.


As misinformation about the novel coronavirus continues to spread, here are some important tips to keep in mind when consuming news about the outbreak. (Elyse Samuels/The Washington Post)


“We are acutely vulnerable in times like these, where there’s a fog of war situation,” says Kate Starbird, a professor at the University of Washington who studies the art of disinformation. It’s a perfect storm: Americans are looking online for information about protests and the coronavirus, even as the pandemic keeps many of us at home, isolated from other sources.


Starbird learned from studying Russia’s Internet Research Agency, a major manipulator of social media around the 2016 election, how these groups turn people’s desires to be part of a movement against them. It starts with appealing messages that earn shares and follows, slowly building an audience. “They are echoing the things that we might care about, at first, to impersonate someone who might be like us – to try to become part of our group,” says Starbird. Only later do they reveal their true objectives.

“No matter how intellectual you think you are, no matter how savvy you think you are – for tech or anything else – you have been victimized by disinformation at some point,” says Shireen Mitchell, the founder of Stop Online Violence Against Women.

In 2016 and 2017, Twitter CEO Jack Dorsey retweeted a Russian account posing as a civil rights activist at least 17 times. Even I’ve fallen for “fake news” on Facebook about a very tumultuous airplane landing.

What do we call the people misleading us? “Russian trolls” became a useful shorthand, but the truth is they’re just as likely to be domestic – and not necessarily even trolls, in the sense that some are looking to do more than irritate. You might associate this activity with bots (software that tries to emulate humans), but they’re an increasingly small part of the problem, researchers say. Academics like to call all these online manipulators “bad actors,” but that also just sounds like Nicolas Cage.

A better term is “disinformers,” suggests Nina Jankowicz, author of the forthcoming book “How to Lose the Information War.” The motivations of disinformers can be many. Sometimes they’re scam artists who want to drive advertising or malware. Sometimes they’re foreign governments trying to disrupt democracy. And other times they’re just jerks who enjoy seeing what they can get away with.


Russia used social media to try to influence the 2016 presidential election. Here’s what you need to know about how it modernized its propaganda tactics. (Meg Kelly, Elyse Samuels, Joy Sharon Yi, Sarah Cahlan/The Washington Post)


“Lately, it’s less about false specific information than it is about misleading narratives, or content that’s designed to raise levels of fear,” says Claire Wardle, the co-founder and director of First Draft, a nonprofit that battles misinformation.

Disinformers can also try to corrupt movements by making them less effective or, ironically, leaving them open to charges of fakery. “People from a lot of different perspectives are putting time and energy into trying to make the world a better place. And part of that effort has to be to make sure that they’re using and sharing the best information,” says Mike Caulfield, a digital literacy expert at Washington State University Vancouver, who turned the lessons of information wars into a curriculum. (The nonprofit News Literacy Project also offers free lessons targeting middle and high school students at Checkology.org.)

Part of the blame goes to tech companies that profit off the outrage disinformers share. But until these companies grow more of a conscience – or the laws change to make them more responsible – we have to take responsibility for shutting down disinformation ourselves.

After speaking with six of the leading disinformation researchers, my takeaway is that it’s no longer particularly helpful to say we should try to judge whether information looks plausible before sharing it. The truth is, very often it looks just fine.

Instead, we need to challenge sources and build the reflexes to keep from responding instantly to the emotions they stir up. Here are four steps that can help.

Step 1: Apply the brakes. People are too quick to share information they can’t personally vouch for. We need an internal speed bump.


Emotion is the main tool disinformers use to manipulate us. “It turns out we are not very skeptical when we are scared,” says Wardle. And there’s a lot of fear right now.

Especially if you have a strong reaction, use that as a reminder to step away. Stop looking at it, then come back in a few minutes and ask yourself: “Do I really know enough to share this?”

This applies in particular to views you agree with. “Our minds are wired to make shortcuts, to find information that we already think is true,” says Graham Brookie, the director of the Atlantic Council’s Digital Forensic Research Lab. “Being cognizant of it is half the battle.”

Disinformers may also weaponize your family. One tactic is to goad people with language such as “share this if you really care.” That’s the digital equivalent of a chain letter.

Even better: Focus on writing and filming your own firsthand experiences and ideas, as opposed to sharing and commenting on others’.

Step 2: Check the source. Trying to evaluate facts can be difficult and time-consuming. Instead, take a few seconds to evaluate the reputation of the information source.


Even people who grew up online can forget not all information is created equal. Just because it was re-shared by a friend, or a source has a legitimate-sounding name, does not mean you can trust it.

This matters because when you share or follow a source on social media, you’re actually endorsing it. That might not be your intent, but that’s how social media software functions — every extra follower gives a source more of a voice, and helps it rank higher in the algorithms that decide what we all get to see.

Make a rule for yourself that you won’t share until you’ve at least glanced at the source’s profile page.

Some rules of thumb for vetting sources:

• Sometimes the immediate source is a family member or friend. Then you need to check their source.

• Look how long the account has been around. (Twitter and Facebook both list a “joined” date on profile pages.) There’s been a surge of social media accounts with fewer than 200 followers created in the last month, a common sign of disinformation efforts. The fake antifa account that Twitter shut down was only a few days old.

• Does the person say who they are? If so, you could probably Google them.


• Glance at an account’s most recent posts – as well as ones from a few weeks ago. Is it consistent?

• Ask yourself what puts them in a position to know about this topic. Is the source even in the place that it claims to have information about?

• If it claims to be a news outlet, does it have a website? A way to contact it? A service called NewsGuard offers a web browser plug-in that rates more than 4,000 news websites based on their records of publishing accurate information.

And if you’re part of a movement, take time to figure out who really is a member of your community. You can avoid the traps of interlopers by only trusting information from verifiable accounts of leaders, as opposed to whoever is shouting loudest online.

Step 2.5: Don’t trust cute things. Memes, those images and slogans that spread like wildfire, can be fun. Just know that now they’re also weapons.

Case in point: In the lead-up to the 2016 election, Russian accounts shared many delightful images, such as one with a golden retriever waving an American flag and text reading “Like if you feel that’s gonna be a great week!” Its source, a Facebook page called Being Patriotic, channeled jingoism and had more than 200,000 followers.


Yes, the disinformers have appropriated puppies. They used Beyoncé memes, too.

A post doesn’t have to be false to be dangerous. The disinformers are more interested in hijacking the mechanism of sharing, by getting you to improve their standing.

And that takes us back to step two, checking sources. Who are you supporting when you share, like or retweet? “There’s no reason to be amplifying content from pure strangers,” says Jankowicz.

This applies even if the meme comes from a Facebook group, an increasingly common target for disinformers. Groups market themselves as tightknit communities, but they may just be hiding bad activity behind the closed walls of the group.

Step 3: Become a citizen investigator. Sometimes a quick source check comes up inconclusive, but you’re still really interested in the information it’s sharing. Then it’s time to perform what Caulfield calls “lateral reading.” Instead of digging deep into the information at hand, look across the Internet for other sources.

Questions to ask:

• Have any reputable fact-check organizations looked into the claim?

• Did anybody else report the same thing, perhaps from a different angle?


• Where and when was the image or quote created? Try a reverse-image search site, such as images.google.com. BuzzFeed recently debunked a post that had more than 15,000 retweets claiming a McDonald’s restaurant was burning during Minnesota protests. The photo used was actually taken in Pennsylvania in 2016.

Step 4: When you find misinformation, correct it — carefully. Concerned citizens can and should help others not fall for misinformation by leaving a trail of bread crumbs to the truth. Research shows people are less likely to share information when someone has commented with a fact check, says Caulfield.

But do so with caution. Re-sharing the original with a comment can sometimes amplify the original source. A better idea, used by some professional fact-checkers, is to take a screenshot of the image or video, draw a red X through it and share that.

If you’re commenting on someone else’s post, just remember most people do not like being corrected.

“Don’t make a fight out of being right,” says Brookie. “If you do, there’s a wide body of social science that would indicate that they’ll probably have a reaction that makes them double down on whatever they thought to begin with because you just made them feel stupid.”

How do you do it right? When Mitchell recently saw a family member share a video containing coronavirus misinformation, at first she was outraged. “I remember feeling, this is stupid,” she says. Then she watched the video all the way through to understand what resonated with her relative: a distrust of government rooted in the notorious Tuskegee syphilis study.

“The way disinformation works, there is a kernel of truth in there,” says Mitchell. “So when you’re dissecting it you have to find the truth and address the truth – and then say the rest is a lie.”
