This article is part 3 of a series reviewing selected papers from Altmetric’s list of the top 100 most-discussed journal articles of 2018.
A May 2018 paper¹ sought to understand how Twitter bots and trolls promote online health content. The researchers found that bots disseminated antivaccine messages, whereas Russian trolls were more political and divisive, promoting discord.
To arrive at their conclusion, the researchers compared the rates of vaccine-relevant messages by bots and average users, and conducted a content analysis of a Twitter hashtag associated with Russian troll activity. In regard to this hashtag, #VaccinateUS, the researchers state:
#VaccinateUS tweets were uniquely identified with Russian troll accounts linked to the Internet Research Agency—a company backed by the Russian government specializing in online influence operations. Thus, health communications have become “weaponized”: public health issues, such as vaccination, are included in attempts to spread misinformation and disinformation by foreign powers.
In a previous RealKM Magazine article, I looked at the US indictment charging thirteen Russian nationals and three Russian companies involved in the Internet Research Agency, and put forward options for how knowledge management can help the fight against online misinformation. In that article, I discussed how the actions of the Internet Research Agency can be described as information warfare. I also argued that because online misinformation is being used as a weapon, knowledge managers need to not just consider how knowledge is managed, but also become concerned with the truth or fiction of the knowledge that is being managed.
The two options I put forward for knowledge managers to consider are:
- Helping people to tell better stories online. This involves training people in combating fake news online through telling better stories on social media and YouTube. As knowledge management and business storytelling consultant Shawn Callahan advises, “To simply negate an untrue story only serves to reinforce the misinformation. A much more effective way of countering a misleading story is with another story.”
- Helping people to overcome the truth delusion. The “truth delusion” is the false belief that people generally sought to communicate the truth through the media until we suddenly entered a post-truth era. The evidence instead shows that, for a very long time, a proportion of people have manipulated knowledge for dishonest purposes, and another proportion have readily believed and spread that manipulated knowledge. Accepting this reality means accepting that people can’t always be trusted to do the right thing, and then reflecting this ever-present potential for dishonesty in platform and system management.
The abstract of the paper¹ summarizes the study:

> **Objectives.** To understand how Twitter bots and trolls (“bots”) promote online health content.
>
> **Methods.** We compared bots’ to average users’ rates of vaccine-relevant messages, which we collected online from July 2014 through September 2017. We estimated the likelihood that users were bots, comparing proportions of polarized and antivaccine tweets across user types. We conducted a content analysis of a Twitter hashtag associated with Russian troll activity.
>
> **Results.** Compared with average users, Russian trolls (χ²(1) = 102.0; P < .001), sophisticated bots (χ²(1) = 28.6; P < .001), and “content polluters” (χ²(1) = 7.0; P < .001) tweeted about vaccination at higher rates. Whereas content polluters posted more antivaccine content (χ²(1) = 11.18; P < .001), Russian trolls amplified both sides. Unidentifiable accounts were more polarized (χ²(1) = 12.1; P < .001) and antivaccine (χ²(1) = 35.9; P < .001). Analysis of the Russian troll hashtag showed that its messages were more political and divisive.
>
> **Conclusions.** Whereas bots that spread malware and unsolicited content disseminated antivaccine messages, Russian trolls promoted discord. Accounts masquerading as legitimate users create false equivalency, eroding public consensus on vaccination.
>
> **Public Health Implications.** Directly confronting vaccine skeptics enables bots to legitimize the vaccine debate. More research is needed to determine how best to combat bot-driven content.
- Broniatowski, D. A., Jamison, A. M., Qi, S., AlKulaib, L., Chen, T., Benton, A., … & Dredze, M. (2018). Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. *American Journal of Public Health*, 108(10), 1378–1384.