The Invisible Thumb: Studying the Market for Social Media Manipulation

By Sebastian Bay

From the 2014 invasion of Ukraine to more recent attempts to interfere in democratic elections worldwide, antagonists seeking to influence their adversaries have turned to social media manipulation. At the heart of this practice is a flourishing market, operating online mostly from within the Russian Federation, where buyers—states and opportunists alike—seek out vendors selling illusory social media engagement in the form of comments, clicks, likes, and shares.

Social media manipulation often relies on fake accounts ranging from the simple (without pictures or content) to the elaborate (“aged” accounts with long histories meant to be indistinguishable from genuine users). Often, these accounts are bots (short for robots) controlled by a computer program. Bots tend to perform very simple tasks, such as viewing videos or retweeting content, in order to trick algorithms into perceiving the content as more popular than it really is. This allows bot operators to generate artificial reach, as typical users are more likely to trust and share content that has been liked by many others.
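For readers who want the mechanics made concrete, the sketch below shows the kind of automation involved. It is purely illustrative: `PlatformClient`, its `like` and `retweet` methods, and the account names are hypothetical stand-ins invented for this example, not any real platform's API.

```python
import random
import time


class PlatformClient:
    """Hypothetical stand-in for a social media platform's API client.

    A real operator would authenticate each fake account and issue
    HTTP requests; here the methods just print what they would do.
    """

    def __init__(self, username: str) -> None:
        self.username = username

    def like(self, post_id: str) -> None:
        print(f"{self.username} liked {post_id}")

    def retweet(self, post_id: str) -> None:
        print(f"{self.username} retweeted {post_id}")


def inflate_engagement(post_id, bots):
    """Point a pool of automated accounts at a single post so that
    ranking algorithms read the activity as organic popularity."""
    for bot in bots:
        bot.like(post_id)
        if random.random() < 0.3:  # only some accounts share, to look less uniform
            bot.retweet(post_id)
        time.sleep(random.uniform(0.1, 0.5))  # jittered pacing mimics human timing


# A small demo pool; commercial services control thousands of accounts.
pool = [PlatformClient(f"bot_{i:04d}") for i in range(20)]
inflate_engagement("post/12345", pool)
```

Even this toy version hints at why detection is hard: randomized timing and mixed behaviors blur the line between automated and organic activity, and commercial operators layer “aged” accounts and mobile proxies on top.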

More elaborate fake accounts—sometimes called sockpuppets—show signs of more direct human control. While less sophisticated accounts are used for simple tasks and are expected to be blocked quickly, the most advanced accounts may remain online for years. This difference is reflected in the price of fake accounts, which ranges from a few cents for the simplest to hundreds of dollars for the most advanced.

In its simplest form, social media manipulation can be conducted by real users who are paid to interact with posts using their own accounts. For the most influential users, this can be very profitable: an influencer with 100,000 followers might earn $2,000 for a promotional tweet, and an influencer with a million followers can earn as much as $20,000 per promotional tweet.

As part of an ongoing research project, the NATO Strategic Communications Centre of Excellence has been experimenting with buying fake engagement on social media platforms to learn more about the subjects targeted, actors involved, and vulnerabilities that authoritarian influence campaigns aim to exploit. The results have been alarming: in February 2019, researchers at the Centre discovered that it is possible to buy engagement with the social media accounts of a United States senator without being stopped or even noticed by social media companies. Even after the researchers reported the fake activity to the platforms, the companies largely failed to identify and remove it in a timely fashion.

The research discovered that it is easy to find such services through major search engines. For less than twenty dollars, one can buy eighty fake comments and four thousand fake “engagements” (likes, shares, retweets, etc.). The Centre also looked at the finances of several providers, many of which boasted annual revenues of hundreds of thousands of dollars. During a single week in April 2019, just one of the companies identified by the Centre serviced clients as varied as Brazilian and Malawian political interests, an American fitness coach, a rap artist, and several businesses including an herbal tea company and a hair stylist.

So far, this research has resulted in four conclusions:

  • The scale of the infrastructure for developing and maintaining social media manipulation software, generating fictitious accounts, and providing mobile proxies is greater than previously understood.
  • The openness of this industry is striking: rather than a shadowy underworld, researchers found an easily accessible marketplace that most web users can reach with little effort through any common search engine. In fact, providers advertise openly on major platforms.
  • Russian service providers seem to dominate the social media manipulation market. Virtually all of the major software and infrastructure providers identified by this research were of Russian origin.
  • The size of individual service providers is troubling. In a relatively short observation period, researchers identified many providers with more than 10 employees and significant revenue.

The extent of social media companies’ efforts to secure their platforms varies, but more such effort is needed: despite commitments codified in September 2018 as part of the EU Code of Practice on Disinformation, this ongoing research shows that manipulating social media is still surprisingly cheap and easy.

More data and greater transparency are essential. Facebook estimated that more than 115 million of its accounts were fake in the last quarter of 2018, but this figure provides little insight into the extent of fake activity. How many comments did inauthentic accounts post? How much engagement did they produce? How much ad revenue did they generate? Only social media companies are in a position to answer these questions.

The real-world effects of social media manipulation have dominated headlines for years, from interference in US and European elections to lynchings, electoral violence, and suspected genocide in Asia and Africa. As legislators grow tired of social media companies’ failure to self-regulate, they may reach for regulatory solutions. However, solutions will need to deal not only with social media companies, but also with the online dealers providing manipulation tools. To date, these actors have received too little sustained scrutiny.

This post is partially drawn from a longer article, “The Black Market for Social Media Manipulation,” written by the NATO Strategic Communications Centre of Excellence in collaboration with the Ukrainian social media analytics company Singularex.

Sebastian Bay is a senior expert at the Technical and Scientific Development Branch of the NATO Strategic Communications Centre of Excellence in Riga, Latvia. Sebastian is currently the project manager of a NATO StratCom COE project on countering the malicious use of social media. Follow NATO StratCom COE on Twitter @STRATCOMCOE.

The views expressed in this post represent the opinions and analysis of the author and do not necessarily reflect those of the National Endowment for Democracy or its staff.

Image credit: Kostsov / Shutterstock.com