For-profit disinformation networks are capitalizing on the thirst for conspiracy content on Facebook. Photograph: Artur Widak/NurPhoto/REX/Shutterstock

Disinformation for profit: scammers cash in on conspiracy theories


Some accounts claiming to support the Canada trucker protests are run by con artists abroad

When Facebook removed dozens of groups dedicated to Canada’s anti-government “Freedom Convoy” protests earlier this month, it didn’t do so because of extremism or conspiracies rife within the protests. It was because the groups were being run by scam artists.

Networks of spammers and profiteers, some based as far afield as Vietnam or Romania, had set up the groups using fake or hacked Facebook accounts in an attempt to make money off the political turmoil.

That foreign networks of social media scammers had seized on a divisive political issue may feel like something of a throwback. Before investigations into Russian troll factories’ operations during the US presidential election and culture war conflicts over content moderation, one of the biggest challenges facing social media platforms was profiteers pushing fake news articles and spam for easy money. Hundreds of websites mimicking US news outlets promoted their content on social media, reaping ad revenue from the traffic they generated.

Platforms like Facebook have cracked down on such “inauthentic activity” since 2016, but the global misinformation industry remains. In recent years, these for-profit disinformation networks have seized on the popularity of conspiracy movements and far-right groups online, creating content aimed at anti-vaccine protesters and QAnon followers.

“It can be an extremely lucrative industry for people in other parts of the world to very closely monitor US and Canadian political climates, then capitalize on moment-to-moment trends,” Emerson Brooking, a senior fellow at the Digital Forensic Research Lab of the Atlantic Council, told the Guardian. “If you’re out for money, and measure success not by sowing discord in a country but by maximizing ad revenue, there’s still a lot of benefit to these operations.”

Scammers use fake or compromised accounts to generate ad revenue by pushing anti-vaccine or QAnon content. Photograph: Pavlo Gonchar/SOPA Images/Rex/Shutterstock

Disinformation for profit

It is hard to know the exact scale of the for-profit misinformation industry, researchers say, since it functions as part of an underground economy and comes in various forms. In addition to content mills and ad revenue schemes, there are private firms across the globe that are hired to create fake engagement or promote political propaganda. In 2021 alone, Facebook said it removed 52 coordinated influence networks across 32 countries that attempted to direct or corrupt the public debate for strategic goals, according to a company report on inauthentic behavior.

In addition, small networks can have an outsized impact if they effectively use online groups to mass organize and fundraise. In the case of the Freedom Convoy accounts, many of the largest Facebook groups involved appeared to be run by fake accounts or content mills hailing from numerous countries. Facebook took down the groups this month, but not before supporters of the convoy raised over $7m in crowdfunding and generated mass mainstream attention. (GoFundMe later disabled the campaign.)

A Bangladeshi digital marketing firm ran two of Facebook’s largest anti-vaccine trucker groups, which had more than 170,000 members combined before the platform removed them, according to Grid News. The hacked Facebook account of a Missouri woman was used to set up a network of several other pro-demonstration groups, which collectively gained more than 340,000 members in weeks. Other groups promoting American spinoffs of the Canadian protests were run from Facebook accounts and networks based in Vietnam, Romania and other nations, Facebook officials told NBC News.

Recent research has shed light on how some of these for-profit misinformation operations work. A series of case studies from the Institute for Strategic Dialogue (ISD), a London-based think tank, detailed what it takes to run a money-making online news scam. One example was a cobbled-together website called The US Military News.

The headlines on The US Military News look much like those you might find on any number of far-right media outlets, with titles such as “Trump Wrecks Pence In Awesome Statement” and articles praising the Canadian trucker protests. A shop on the site markets Trump-related merchandise including free American flags and Trump 2024 “Revenge Tour” commemorative coins. There are repeated pleas for donations all over the front page and attached to every article.

But despite the name and wall-to-wall American branding, the site has no connection to the US military, or the United States for that matter. Its domain is registered in Vietnam, and it’s unclear if it employs any writers or if the products it advertises even exist. The articles themselves consist solely of stock footage videos, with an automated voice reading plagiarized content.

Police on horseback and an armored police vehicle are positioned in front of protesters during protests in Ottawa on Friday. Photograph: Justin Tang/AP

A number of the articles and headlines posted on sites linked to the network veer into outright QAnon conspiracy content, featuring falsehoods about military tribunals and Biden officials being sentenced to death. One site’s front page prominently features a range of anti-vaccine and pro-Trump conspiracy content, while also promoting an Amazon affiliate link to Trump’s Art of the Deal book.

The Guardian contacted the email address under which The US Military News is registered, but did not receive a reply. The US Military News is just one of a number of sites that appear to be linked to the same Vietnam-based network, according to ISD.

In another of ISD’s reports, researcher Elise Thomas found a network of dozens of Facebook groups and pages – which also appear to be linked to a small group of people in Vietnam – that shared plagiarized pro-Trump content aimed at conservative social media users. Taking articles from far-right conspiracy sites like the Gateway Pundit, the network created Facebook groups with names like “Conservative Voices” and built up large numbers of followers – sometimes in the tens of thousands of users.

Although for-profit misinformation networks often monetize their audiences by running ads on their websites, the network ISD found appeared to be building up membership in its Facebook groups in order to resell the groups themselves.

“This was the original threat that platforms were worried about,” Brooking said. “It wasn’t disinformation, you would characterize it as sort of ad fraud or ad farming.”

The original ‘fake news’

In many cases, including ISD’s case studies, there aren’t vast amounts of money being made from inauthentic Facebook groups and conspiracy sites. But for many of the operators based in countries with low per capita income relative to the US, making a few hundred dollars a month from pushing conspiratorial content represents a significant gain. One of the more lucrative Vietnam-linked sites ISD analyzed brought in around $1,800 each month through advertising alone – about 10 times the country’s monthly per capita income.

These scams have strong echoes of the surge in online commercial misinformation in 2016. Many of the people behind posts with false claims such as “Pope Francis Endorses Donald Trump” also came from outside the US, often from a single small town in North Macedonia called Veles, which was responsible for over 140 imitation news websites.

These original “fake news” websites capitalized on salacious headlines and on social media algorithms that promoted high-engagement posts regardless of their content. That led creators to pick contentious political issues involving race, religion and culture war flashpoints, which drove the most attention to their sites and social media accounts. Although the strategies for evading content moderators have evolved, the playbook of monetizing conspiracies and misinformation appears to have stayed largely the same.

“This is what the misinformation threat looked like before we were even talking about state actions,” Brooking said. “It’s interesting that this sort of older threat is now back in center stage.”

