Brain power

How cyber troops are influencing what you think and know

In a previous RealKM Magazine article, we looked at the operations of the British Joint Threat Research Intelligence Group (JTRIG). As leaked documents show, JTRIG’s strategies are based on psychological and human science research, and aim to deliberately deceive and manipulate public thought.

A new working paper from the Computational Propaganda Research Project at the University of Oxford finds that “cyber troops” such as those in JTRIG are a pervasive and global phenomenon. Cyber troops are defined as “government, military or political party teams committed to manipulating public opinion over social media.”

The working paper reports on cyber troop organizations across the 28 countries for which evidence of their activity exists, compiling an inventory according to the kinds of messages, valences, and communication strategies used. It finds that many different countries employ significant numbers of people and resources to manage and manipulate public opinion online, sometimes targeting domestic audiences and sometimes targeting foreign publics.

Valence is a term used to describe the attractiveness (goodness) or averseness (badness) of a message, event, or thing. Some teams use pro‐government, positive, or nationalistic language when engaging with the public online. Other teams will harass, troll, or threaten users who express dissenting positions.

Countries seen as authoritarian regimes (for example Russia, China, and Saudi Arabia) and those seen as democracies (for example the United States and the United Kingdom) both carry out social media campaigns aimed at managing and manipulating public opinion. However, there are significant differences in their target audiences:

Looking across the 28 countries, every authoritarian regime has social media campaigns targeting their own populations, while only a few of them target foreign publics. In contrast, almost every democracy in this sample has organized social media campaigns that target foreign publics, while political‐party‐supported campaigns target domestic voters.

Further, while countries such as Russia and China are often the focus of western articles about social media manipulation, the working paper cautions that:

Authoritarian regimes are not the only or even the best at organized social media manipulation. The earliest reports of government involvement in nudging public opinion involve democracies, and new innovations in political communication technologies often come from political parties and arise during high‐profile elections.

Over time, the primary mode for organizing cyber troops has shifted from military units that experiment with manipulating public opinion over social media networks to strategic communication firms that take contracts from governments for social media campaigns.

Inventory of the kinds of messages, valences, and communication strategies used

Strategies, tools and techniques for social media manipulation

Cyber troops use a variety of strategies, tools and techniques for social media manipulation.

  • Commenting on social media posts. This can involve positive messages that reinforce or support the government’s position or political ideology, negative interactions where there is verbal abuse, harassment and so‐called “trolling” against social media users who express criticism of the government, or neutral comments that are designed to distract or divert attention from the issue being discussed.
  • Individual targeting, which involves selecting an individual or group to influence on social media. For example, opinion leaders are carefully selected and targeted with messages in order to convince them that their followers hold certain beliefs and values, or alternatively the values, beliefs or identity of a user or a group of users are the target of harassment that generally involves verbal abuse, hate speech, discrimination and/or trolling.
  • Government-sponsored accounts, web pages or applications. Some countries run their own government‐sponsored accounts, websites and applications designed to spread political propaganda.
  • Fake accounts and computational propaganda. Many cyber troop teams also run fake accounts to mask their identity and interests. This phenomenon has sometimes been referred to as “astroturfing”, whereby the identity of a sponsor or organization is made to appear as grassroots activism. In many cases, these fake accounts are “bots”—or bits of code designed to interact with and mimic human users.
  • Content creation. Some cyber troop teams create substantive content to spread political messages. This content creation amounts to more than just a comment on a blog or social media feed, but instead includes the creation of content such as blog posts, YouTube videos, fake news stories, pictures or memes that help promote the government’s political agenda.

Organizational forms

Cyber troops are often made up of an assortment of different actors. In some cases, governments have their own in‐house teams that are employed as public servants. In other cases, talent is outsourced to private contractors or volunteers.

  • Government. Government‐based cyber troops are public servants tasked with influencing public opinion. These individuals are directly employed by the state as civil servants, and often form a small part of a larger government administration such as a ministry.
  • Politicians and parties. Political parties and candidates have used social media to manipulate public opinion during campaigns, either by purposefully spreading fake news or disinformation, trolling or targeting supporters of the opposition party, or using fake accounts to artificially inflate the number of followers, likes, shares or retweets a candidate receives, creating a false sense of popularity. This is different to traditional digital campaign strategies, which have generally focused on spreading information about the party or candidate’s platform or sending advertisements out to voters.
  • Private contractors. In some cases, cyber troops are private contractors hired by the government. Private contractors are usually temporary, and are assigned to help with a particular mission or cause.
  • Volunteers. Some cyber troops are volunteer groups that actively work to spread political messages on social media. They are not just people who believe in the message and share their ideals on social media. Rather, volunteers are individuals who actively collaborate with government partners to spread political ideology or pro‐government messages.
  • Paid citizens. Some cyber troops are citizens who are actively recruited by the government and are paid or remunerated in some way for their work. They are not official government employees working in public service, nor are they employees of a company contracted to work on a social media campaign, and they are also not volunteers because they are paid. Normally, these paid citizens are recruited because they hold a prominent position in society or online.

Organizational budget, behavior and capacity

Cyber troop teams differ considerably in their budgets, behaviors and capacity.

  • Budget information. Cyber troop budgets vary considerably across operations, although the amount of publicly available information on budgets and spending is relatively limited.
  • Organizational behavior. Several different organizational practices of cyber troop teams were identified: (1) a clear hierarchy and reporting structure; (2) content review by superiors; (3) strong coordination across agencies or teams; (4) weak coordination across agencies or teams; and (5) liminal teams. In some cases, teams are highly structured with clearly assigned duties and a reporting hierarchy, much like the management of a company or typical government bureaucracy. Tasks are often delegated on a daily basis.
  • Capacity building. Cyber troops will often engage in capacity‐building activities. These include: (1) training staff to improve skills and abilities associated with producing and disseminating propaganda; (2) providing rewards or incentives for high‐performing individuals; and (3) investing in research and development projects. When it comes to training staff, governments will offer classes, tutorials or even summer camps to help prepare cyber troops for engaging with users on social media.

Organizational density of cyber troops

In the following map, countries with a larger number of different kinds of cyber troop organizations (governments, political parties, civil society groups, organized citizens, or independent contractors) are shown in darker shades of red/orange. The United States has the largest number, with five different organizations.

Organizational density of cyber troops, 2017 (source: Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation)

Article source: Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation is licensed under CC BY-NC-SA 4.0.

Header image source: Adapted from GCHQ Building at Cheltenham, Gloucestershire by Defence Images, which is licensed under CC BY-SA 2.0. The Government Communications Headquarters (GCHQ) is identified as one of the British cyber troop organizations in the Global Inventory of Organized Social Media Manipulation.


Also published on Medium.

Bruce Boyes

Bruce Boyes (www.bruceboyes.info) is editor, lead writer, and a director of the award-winning RealKM Magazine (www.realkm.com) and currently also teaches in the University of NSW (UNSW) Foundation Studies program in China. He has expertise and experience in a wide range of areas including knowledge management (KM), environmental management, program and project management, writing and editing, stakeholder engagement, communications, and research. Bruce holds a Master of Environmental Management with Distinction and a Certificate of Technology (Electronics). With a demonstrated ability to identify and implement innovative solutions to social and ecological complexity, Bruce's many career highlights include establishing RealKM Magazine as an award-winning resource for knowledge managers, using agile and knowledge management approaches to oversee the implementation of an award-winning $77.4 million river recovery program in western Sydney on time and under budget, leading a knowledge strategy process for Australia's 56 natural resource management (NRM) regional organisations, pioneering collaborative learning and governance approaches to support communities to sustainably manage landscapes and catchments, and initiating and teaching two new knowledge management subjects at Shanxi University in China.
