Professional disinformation is a booming business, according to a new report from the Oxford Internet Institute (OII): researchers found “evidence of 81 countries using social media to spread computational propaganda and disinformation about politics,” up from 70 in 2019. Among them: both sides of the civil war in Libya, a pro-Trump group that tries to convince ageing boomers he’s popular with college students, and failed billionaire presidential candidate Michael Bloomberg.
The OII researchers also found evidence that private companies had been hired as “cyber troops” for such campaigns in at least 48 of those nations, nearly double the number from the previous year. The institute has identified over 65 such firms offering online propaganda services, which have racked up over $US60 ($78) million in combined income since 2009 — though the actual figure is almost certainly far higher, since secretive political operatives tend not to release details of their contracts.
The report defines cyber troops as “government or political party actors” tasked with surreptitiously “manipulating public opinion online.” Whether they are engaged in domestic operations or attempting to influence foreign politics, the researchers wrote, they are a “pervasive part of public life.”
Disingenuous propaganda campaigns waged by political parties included Facebook pages attempting to jack up tensions in Tunisia before elections, as well as Bloomberg’s Democratic primary campaign, which ran an elaborate astroturf network of Twitter bots and sock-puppet accounts. Bloomberg, with far more financial resources than actual grassroots support, bankrolled an army of over 500 “deputy digital organisers” to aggressively promote his message on social media in an effort that resembled undisclosed sponsored content. The results were unimpressive: Twitter later banned dozens of accounts it said were involved in “platform manipulation and spam,” and the scheme backfired on Bloomberg’s doomed campaign after it was widely mocked as out-of-touch and unconvincing.
Bloomberg’s use of the strategy may have been cringeworthy, but it threatened to undermine the legitimacy of the election by treating it as an auction open to the highest bidder. The report makes clear that similar strategies are also used by authoritarian governments to maintain their grip on power and as psychological operations in warzones. Examples included police forces in the Philippines that falsely identified activists as terrorists, and the extensive state media apparatus operated by the authoritarian government of Belarus. Per the report:
Between 2019-2020, recent examples of government-led activity include the Philippine Police who used Facebook to influence narratives about military activities against terrorism (Gleicher, 2020b), or ongoing cyber conflicts between the Government of National Accord and the Libyan National Army who have used social media to shape narratives about the ongoing civil war (Kassab & Carvin, 2019). An example of state-funded media includes the Belarussian media infrastructure, where the government controls more than six hundred news outlets, many of which show evidence of propaganda and manipulation (Bykovskyy, 2020; Freedom House, 2019).
One of the most common tactics was simply the creation of automated social media bots, a technique seen at scale in 57 countries — such as a network associated with public institutions in Honduras. The report also identified networks of propaganda accounts curated by humans in 79 countries. One specific example was the marketing firm Rally Forge, which was employed by pro-Trump group Turning Point USA; Facebook banned the firm after it had spent $US1.15 million on ads associated with an inauthentic network of “202 Facebook accounts, 54 Pages and 76 Instagram accounts.” (Turning Point bills itself as a youth organisation training college students to become the next generation of the Republican Party, but its Facebook ads overwhelmingly reach older adults, apparently in an effort to mislead them about how popular GOP ideology is with young people.)
Other techniques OII identified in the report included the creation of disinformation and manipulated media, data-driven advertising that targets misleading information at specific demographic groups, trolling, doxxing, and mass-reporting activists, dissidents, and journalists to social media moderators in the hope that sites will censor opposing news.
Countries the report identified as having “high cyber troop activity” — consisting of “large numbers of staff, and large budgetary expenditure on psychological operations or information warfare” — included China, Egypt, India, Iran, Iraq, Israel, Myanmar, Pakistan, the Philippines, Russia, Saudi Arabia, Ukraine, the United Arab Emirates, the United Kingdom, the United States, Venezuela, and Vietnam. All of these countries were listed as having permanent cyber troops rather than temporary ones organised on an ad hoc basis during elections, though they varied in their level of organisation. For example, propaganda networks in the U.S. and UK were largely decentralised, in contrast to state-employed Chinese propagandists or Venezuela’s government-organised cyber militias.
Propaganda isn’t new; social media may just be one of the cheaper and more efficient ways of converting dollars into persuasion. But the OII report is yet more evidence (as if we needed it) that despite claims from firms like Facebook and Twitter that they have taken effective action against propaganda operations run by states, political parties, and various third-party operations, they really haven’t. Even if they had, the rise of alternative platforms presents other venues for manipulation: just look at this month’s riot at the Capitol in DC, which was spurred by calls to action from conservative media and organised on apps like Telegram and Parler.
“For a long time our perception of propaganda and disinformation was that they come from governments rather than considering the fact that they are part of a commercial enterprise,” Sam Woolley, a professor specialising in propaganda research at the University of Texas at Austin, told the Financial Times. “What we’ve realised is that many of the firms that build online disinformation are based in democratic countries as well.”