
Non-Genuine Actors

  • Susannah B. F. Paletz
  • Brooke E. Auxier
  • Ewa M. Golonka
Chapter
Part of the SpringerBriefs in Complexity book series (BRIEFSCOMPLEXITY)

Abstract

Non-genuine actors, including social bots and sockpuppets, are accounts that are automated or manually manipulated to post information and sentiment on various social media platforms. Though not all social bots and sockpuppets are nefarious, there is evidence that some of these non-genuine accounts, especially on Facebook and Twitter, are maliciously managed by hackers or state actors to spread misinformation and spam. Trolls may or may not be automated, but they are consciously or unconsciously divisive.
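The abstract stops short of saying how such accounts are identified. As a purely illustrative aside, and not the authors' method, the short Python sketch below shows the kind of crude heuristic signals (implausibly high posting rates, machine-generated-looking screen names) that automated-account detection often starts from; the Account record, thresholds, and example values are all hypothetical.

# Illustrative sketch only: a crude heuristic for flagging accounts that
# *might* be automated, loosely based on features commonly used in
# bot-detection work (posting rate, screen-name randomness).
# The Account structure and all thresholds here are hypothetical.

import math
from collections import Counter
from dataclasses import dataclass


@dataclass
class Account:
    screen_name: str
    posts_last_24h: int
    followers: int
    following: int


def name_entropy(name: str) -> float:
    """Shannon entropy of the characters in a screen name.

    Randomly generated handles (e.g. 'xk83jq0pzt') tend to have higher
    character entropy than human-chosen ones.
    """
    counts = Counter(name.lower())
    total = len(name)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def looks_automated(acct: Account,
                    max_daily_posts: int = 150,
                    entropy_threshold: float = 3.5) -> bool:
    """Very rough heuristic: flag accounts that post at implausibly high
    rates or whose screen names look machine-generated. Real systems rely
    on many more features and supervised models."""
    too_active = acct.posts_last_24h > max_daily_posts
    random_name = name_entropy(acct.screen_name) > entropy_threshold
    return too_active or random_name


if __name__ == "__main__":
    suspect = Account("xk83jq0pzt4w", posts_last_24h=400,
                      followers=12, following=3000)
    print(looks_automated(suspect))  # True under these illustrative thresholds

In practice, single-feature heuristics like this produce many false positives; they are shown here only to make concrete what "automated" posting behavior can look like.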

Keywords

Social media · Social media users · Social media sharing · Trolls · Bots · Sockpuppets · State actors · Political science · Sociology · Information science · Sociopolitical narratives

Copyright information

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Susannah B. F. Paletz (1)
  • Brooke E. Auxier (2)
  • Ewa M. Golonka (1)

  1. Center for Advanced Study of Language, University of Maryland, College Park, USA
  2. Philip Merrill College of Journalism, University of Maryland, College Park, USA
