Internet Shill

From SI410
An invisible or anonymous person can evoke the concept of an internet shill. License: CC BY-NC 2.0; Source: openverse. Author: shando

An internet shill (also called internet stooge or internet sock puppet) is a person, online persona, or website that imitates a real individual, institution, or publisher, or otherwise acts as a decoy. Shills often promote messages in order to create an appearance of public acceptance, rebuke, or sentiment.[1] Individuals or organizations operating shill campaigns are usually compensated. In most jurisdictions, shilling is illegal when it harms other parties, while activities not intended to harm, such as entertainment, are usually considered legal.[2]

Shilling became a well-known concept to the general public in the 20th century, when newspapers and magazines began publishing advertisements from businesses looking for paid shills.[3] More recently, shills have migrated to internet spaces such as social media and e-commerce platforms, most notably where politics and commercial products are discussed or reviewed. Press coverage mentioning internet shills has risen since the 2016 U.S. general election, detailing patterns of personal shill accounts operating alongside comprehensive "shill websites" that portray themselves as larger, more legitimate media groups, spreading messages with hidden agendas.[4]

Ethical considerations include the impacts of manipulation or deception on society, political and economic impact, and hidden conflicts of interest. Increasing press coverage has raised public awareness of internet shilling and calls to regulate shilling activity by private groups as well as public institutions.


Etymology

The term “internet shill” borrows from the traditional concept of shilling, which was commonly applied to people and schemes at carnivals or in confidence games; "shill" is likely a 20th-century shortening of "shillaber".[5] Historically, shills have also been called “plants” and “stooges.” The word may originally have referred to author Benjamin Penhallow Shillaber (1814–1890), who often wrote under a fictional pen name.


History

The traditional shell game often employs shills to lure unsuspecting "marks". License: CC BY-SA 2.0. Source: openverse. Author: elmada

Persons employing shill tactics appeared in great numbers in the 19th and 20th centuries, especially at carnivals, circuses, sideshows, and casinos, and are often associated with tricksters in gambling games and auctions. In gambling games, shills can portray themselves as unassociated passersby who play and win. This artificial display makes winning look like a more likely outcome to innocent bystanders, who might then attempt to play.

One common street game is the "shell game" where the games-person places a ball in one of three cups and then mixes the cups. Players are then tasked with choosing the cup with the ball. Sometimes, shill players who are confederate with the game operator will "win" one or more rounds in view of other innocent spectators, who are then lured to play as well. However, the game operator adjusts the trick to ensure the innocent spectator loses the round and their bet.

Auctions also often attract shills, especially bidders who collude with auctioneers to drive up prices and increase auctioneer profits, which are usually a percentage of final sales. Online auctions attract a disproportionate share of shilling activity, which has motivated some auction websites, such as eBay, to expressly forbid shilling, although detection is often difficult.

Researchers have studied how swarm intelligence has “inspired” shills in the context of evolutionary cooperation. On this minority view, shills are not necessarily malicious or harmful, even though shilling is generally considered a malicious phenomenon. Researchers have claimed that shills can spread positive messages and behaviors through a population, increasing overall cooperation.[6]


Astroturfing

Astroturfing is an organized campaign by concealed interests that coordinate to promote a message or idea. Astroturfing creates a false appearance of mass acceptance or rejection within a population.[7] Shill accounts are often employed as part of an astroturfing campaign. Historically, the term "astroturfing" referred to "the deceptive practice of presenting an orchestrated marketing or public relations campaign in the guise of unsolicited comments from members of the public."[8] The concept has extended to include similar activities on the internet and other digital spaces.



In-person shilling is migrating to social media and the internet. Source: David Horsey/LA Times

Politics

Political astroturfing messages covertly support or reject a particular candidate or policy using manufactured, strategic "top-down" activity that masks itself as "bottom-up" grassroots activity.[9] Alongside bots and political insiders, shill accounts on social media have steadily increased their share of astroturfing. Some shill accounts spread information about their sponsoring campaign to political opponents, while others post replies to opponents’ messages to “correct” people spreading information inimical to the shilling campaign.[10]


Military

Some military powers engage in internet shilling to “influence internet conversations and spread pro-American propaganda.”[11] The U.S. Central Command (Centcom) writes blog posts, chatroom messages, and “other interventions” on foreign, non-English websites using accounts controlled by the Centcom program, which are portrayed as real individuals living across the globe.[12]


Business

Businesses often maintain an internet presence in social media and elsewhere to promote products, services, and other messages—often through the use of shill accounts. One of the most common and controversial practices of business shills is writing fake product reviews: either positive reviews for their products, or negative reviews for competitor products. Some states, including New York, have attempted to mitigate this practice through government regulation.[13]


Publishing

Some publishing marketers, including those within book and magazine companies, have been criticized for writing self-shilling material, especially positive reviews of their own products. One such marketing professional, Todd Rutherford, was discovered to have created an entire company to generate positive reviews for books whose publishers sought them. "'I was creating reviews that pointed out the positive things, not the negative things,' Mr. Rutherford said. 'These were marketing reviews, not editorial reviews.'"[14]

Social media


Facebook

Facebook is a popular platform that attracts shill accounts; the platform captures the attention of billions of users across the globe. In 2017, Facebook reported that roughly 10% of its accounts were “duplicate” accounts, implying about 200 million accounts do not represent real, unique persons.[15]


Reddit

As the “front page of the internet,” Reddit is a highly influential website with millions of active users.[16] Its high traffic and engagement resemble dialogue in the “public square,” which can influence a larger mass of people and “lurkers.” This makes it an attractive platform for manipulation through shilling tactics, most recently around finance and cryptocurrency.

Forbes reported in early 2017 that Reddit was being clearly manipulated by large finance firms. Janhoi McGregor, a senior contributor at Forbes, conducted a non-scientific experiment where he posted two fake news articles in large subreddits seen by millions of people. He then managed to vote those articles to the top of each subreddit with “fake accounts and fake upvotes” for less than $200.

Social-media promotion operations frequently work with financial firms, managing shill accounts that manipulate voting on posts and comments and engaging in commentary under the guise of authentic users.[17] These services are offered by well-established, professional marketing firms.

Some commenters on Reddit have admitted to participating in shill activity. One particular user claimed to be a former public relations professional who had operated internet shill accounts to defend client products and services, and often smear and "gaslight" detractors as quacks and conspiracy theorists.[18]

Twitter & Instagram

Similar to Reddit and Facebook, Instagram and Twitter are both popular targets for shilling, especially in finance-related topics such as cryptocurrency and non-fungible tokens (NFTs). Job postings advertise for help in creating and managing shill accounts, listing experience qualifications and required digital assets such as personal inventories of social media accounts.[19]

Celebrities are a particularly powerful type of shill when they promote a product or service without disclosing potential incentives. Some celebrities have been found to promote harmful content. One such celebrity, Jonathan Cheban, a close friend of Kim Kardashian, promoted a food intolerance test to his 2 million followers on Instagram. However, scientists said the test had "nothing to do with [testing] food intolerance." The company sold over 1 million tests, and it was revealed Cheban was paid $5,000 for promoting the product.[20]

That same year, 2017, Instagram began rolling out efforts to increase transparency around paid shilling activity and to put commercial relationships between Instagram and its users on firmer long-term footing.[21][22]


YouTube

In addition to YouTube-placed interstitial advertisements within videos, many popular YouTube content creators seek direct sponsorship deals or access to products as part of an “influencer” role they play on the social video website. Influencers promote products and services directly within the video content. Many are accused of being YouTube shills; however, the accusation may not be as toxic as once believed.

Most young viewers on YouTube feel it’s acceptable for YouTubers to shill for sponsors, with 87 percent of respondents between the ages of 13 and 24 indicating that the drawbacks of sponsor shilling are worth being able to watch videos for free.[23]


Recommendation systems

Shilling operations have targeted recommender systems on many e-commerce platforms (among others), manipulating recommendation algorithms to promote or suppress particular recommendations. Media and word-of-mouth reports of these operations have reduced consumer trust in recommendation systems, which threatens the future and consumer value of such systems. Recent research has looked at methods to detect and counteract recommender shilling operations.[24]
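Detection research of this kind often scores each rating profile by how far it deviates from item consensus; Rating Deviation from Mean Agreement (RDMA) is a common baseline metric in the shilling-detection literature. The following is a minimal sketch, not the exact method of the cited study—the data shapes and plain averaging are illustrative assumptions:

```python
from collections import defaultdict

def rdma_scores(ratings):
    """Rating Deviation from Mean Agreement.

    For each user, average |rating - item mean| / (number of ratings for
    the item) over the items that user rated. Injected shill profiles that
    push items far from consensus tend to receive high scores.

    `ratings` is a list of (user, item, rating) tuples.
    """
    by_item = defaultdict(list)
    for _, item, r in ratings:
        by_item[item].append(r)
    item_mean = {i: sum(rs) / len(rs) for i, rs in by_item.items()}
    item_count = {i: len(rs) for i, rs in by_item.items()}

    deviations = defaultdict(list)
    for user, item, r in ratings:
        deviations[user].append(abs(r - item_mean[item]) / item_count[item])
    return {u: sum(ds) / len(ds) for u, ds in deviations.items()}
```

A detector would flag users whose score sits far above the population average; published methods combine RDMA with further features such as rating time series and group behavior.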


Crowdturfing

Crowdturfing is the phenomenon where "a significant number" of individuals are compensated for performing questionable tasks as part of a manipulative online campaign. Recent studies have shown "not only do malicious crowd-sourcing systems exist, but they are rapidly growing in both user base and total revenue."[25] Crowdturfing is usually employed to set off an effective information cascade that affects public discussion, consumer/citizen sentiment, and the behavior of susceptible users.


Bots

Many internet shills are automated bots that operate with minimal, or zero, human intervention. Bots often consist of social media accounts that post content tens or hundreds of times in relatively short time spans.[26] A 2020 academic study found that roughly half of the Twitter accounts discussing relaxing COVID-19 restrictions in the U.S. were bots.[27]
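Burst posting of the kind described above is one of the simplest behavioral signals used to flag likely bots. A toy sketch follows; the one-hour window and 50-post threshold are illustrative assumptions, not values from the cited studies:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def flag_high_frequency_accounts(posts, window=timedelta(hours=1), threshold=50):
    """Flag accounts that post more than `threshold` times inside any
    sliding `window` of time — a crude burst-posting bot signal.

    `posts` is a list of (account, timestamp) pairs in any order.
    """
    by_account = defaultdict(list)
    for account, ts in posts:
        by_account[account].append(ts)

    flagged = set()
    for account, times in by_account.items():
        times.sort()
        left = 0
        for right in range(len(times)):
            # Shrink the window from the left until it spans <= `window`.
            while times[right] - times[left] > window:
                left += 1
            if right - left + 1 > threshold:
                flagged.add(account)
                break
    return flagged
```

Real detectors combine many such features (account age, content similarity, follower graphs); frequency alone produces false positives for genuinely prolific humans.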

Commercial tools

For-profit companies have emerged that help individuals and organizations conduct automated shill operations on various platforms, such as Telegram Messenger.[28] One such company is QQSHILL, which provides automated marketing campaigns for the Telegram Messenger app for Android and iPhone. Services include a "Shilling Bot" that can send messages from many accounts to many groups, as well as a "DM Bot" that sends direct messages en masse to groups on Telegram.[29]

Publicly-available tools

Software developers and hackers have published Python-based shilling programs that can be customized for each use case. One such program, the "Telegram Shill Bot", targets the Telegram Messenger app in a similar fashion to QQSHILL.[30] Instructions, code, and various warnings are given to potential bot operators. Basic bot-detection evasion features are commonly included in these types of packages.

Sock puppets

An internet sock puppet or catfish is a "fake persona used to discuss or comment on oneself or one’s work."[31] Sock puppets are usually human-operated but may also be bot-operated. Some national governments, such as China, have created sock puppet "armies" of tens of thousands of accounts tasked with influencing online discussion and "spinning" fact and rumor to achieve political ends. Chinese sock puppet operations "use specially trained - and ideologically sound - internet commentators ... dubbed the '50-cent party' because of how much they are reputed to be paid for each positive posting."[32]

In 2019, the state of California passed legislation that banned using undisclosed bots to deceptively influence people engaged in commercial transactions or voting in elections.[33] Two other states investigated companies engaged in undisclosed, deceptive sock puppetry and other malicious tactics.[34]

Many sock puppet operations use so-called "aged" accounts, created well before resale, that contain a history of unremarkable activity and engagement appearing to represent a normal person. Facebook, Instagram, Reddit, and other proprietary accounts are cataloged and presented to potential buyers through websites such as FBACCS and ACSS Market.


Many internet sock puppets are involved in promoting a brand, product, or service using a false identity. These manipulations have affected consumer trust in many established online retailers, many of whom actively attempt to prevent, detect, and remove bot-created content. The U.S. Federal Trade Commission is empowered to "levy fines if a company engages in sock puppet marketing."[35]


Individuals and organizations often attempt to influence product and service reviews to encourage specific product recommendations to consumers. For example, Sony Pictures has been caught using shill accounts to post fake reviews that manipulate systems recommending films to consumers. Given that only "4% of users make a purchase before referencing existing user comments," these manipulations are likely to materially affect customer choices.[36]


Among other nations, the U.S. armed forces have been discovered conducting internet sock puppet operations that promote American propaganda and influence online discussion. Critics have compared these operations to similar programs conducted within China, especially its program of censorship. Critics accuse the program of creating "a false consensus in online conversations, crowd[ing] out unwelcome opinions and smother[ing] commentaries or reports that do not correspond with its own objectives."[37]

Influencer marketing

Influencer marketing, such as celebrity product endorsements on social media sites, has been considered a "grey area" when juxtaposed with relatively less-transparent internet shilling. Platforms such as Facebook and Instagram have developed user policies to regulate "branded content" promoted by users.[38][39]

"Fake" influencers are a phenomenon that has grown alongside "authentic" influencers. These activities are functionally equivalent to sock puppetry. Fake influencers often consist of "aged" social media accounts that are available for sale. A 2018 study found that about 12 percent of influencers in the U.K. had fake followers.[40] Twenty-four percent of U.K. influencers were also found to have choreographed followers.[41]

Virtual shills

Virtual shills are similar to bots and influencers, except the associated online accounts contain a more comprehensive profile that is difficult to distinguish from a real person's account. Often profile pictures are AI-generated while text messages are human-generated. One virtual shill, named "Lil Miquela", managed to attract 1.5 million followers, many of whom are unaware Miquela is not a real human. Upon discovering an account is fake, some users complain of violations of trust, while others say there is no problem.[42]

Ethical Issues


Fake reviews

Customer reviews are a key feature of many e-commerce platforms, especially popular outlets such as Amazon. Some product sellers post fake positive reviews to build reputation and trust and to posture against competition.[43] Fake reviews are either posted directly by the selling party or through a third-party provider.

Fake reviews highlighted on Amazon.

Most often, fake reviews are posted from purchased social media accounts reserved for the purpose of leaving fake reviews. Marketplace websites such as “ACSS Market” provide a well-stocked, liquid inventory of social media accounts for several platforms. These accounts feature varying characteristics that buyers can choose for particular use cases. One common characteristic is account “age.” Aged accounts are often perceived as relatively authentic, representing a real person who is actively using the account in a transparent manner.[44] In reality, these previously-created accounts usually contain fake or boilerplate user-generated activity.

Since at least November 2012, the customer review website Yelp has worked to detect and expose individuals and companies that game the review system through internet shills and other methods. One small business owner in San Francisco said he hired fake review writers because "it makes it easier for people to find you." Other businesses have avoided using shills to game reviews. A vice president of communications at one small jewelry store was approached by a company offering shilled reviews, but declined, calling the activity "not just unethical, but it's probably illegal."[45]

In late January 2022, hackers gained access to several YouTube accounts and promoted the purchase of cryptocurrency using videos produced by the hackers.[46][47]

A 2009 Federal Trade Commission ruling states that "paying for positive reviews without disclosing that the reviewer had been compensated, amounted to deceptive advertising and made offending companies liable for prosecution."[48]


Ethical and moral concerns regarding shilling vary depending on context. Recent media coverage of celebrity shilling has cast the tactic in a negative light. A recent story in Slate authored by a former celebrity shill calls celebrity crypto shilling a “moral disaster.”[49] However, professional standards are lacking, and public consensus on the ethical implications of influencer marketing is unclear and debated. One study attributes the ongoing debate to a lack of understanding of the ethical principles that support paid content. It argues that influencers use "authenticity" as a guiding principle of good ethics, and that authenticity itself is "driving industry conversations and decisions regarding sponsored content."[50]


Shill bidding has adapted to the 21st century.

Studies have shown that the frequency of fraud rises with the number of online participants, and shill bidding is the most prominent form.[51] Harm to authentic bidders comes in the form of overpriced sales.

In 2021, eBay was discovered to be colluding with partnered companies that employed bidding shills to push up the price of merchandise using “fake bids.” eBay then rewarded partnered companies who used the shills by offering special low-price auction fees.[52]

Shilling negatively impacts non-shill bidders in internet auctions by artificially driving up the price of up-for-bid items, causing winning bidders to overpay. Bidders deceived by shill bidders' public bidding information are likely to overvalue the item, compounding the harm to the winning bidder.[53]
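Run-time detection approaches like the cited algorithm score bidders on behavioral signals. A toy heuristic combining two classic signals, a large share of an auction's bids and never winning, might look like the following; the feature set and equal weighting are illustrative assumptions, not the published algorithm:

```python
from collections import Counter

def shill_bid_scores(bids, winner):
    """Score each bidder in one auction on two classic shill signals:
    bid share (shills bid often to drive the price up) and losing
    (shills try not to win, so the mark pays the inflated price).

    `bids` is a list of (bidder, amount) tuples in time order;
    `winner` is the bidder who won the auction.
    """
    counts = Counter(bidder for bidder, _ in bids)
    total = len(bids)
    scores = {}
    for bidder, n in counts.items():
        bid_share = n / total                    # signal 1: many bids
        lost = 0.0 if bidder == winner else 1.0  # signal 2: avoids winning
        scores[bidder] = 0.5 * bid_share + 0.5 * lost
    return scores
```

Published detectors add further signals, such as bid timing, minimal bid increments, and a bidder's affinity with one seller across many auctions.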


Health care

In a general sense, four broad principles are used to determine what is ethical in health care: autonomy, justice, beneficence, and non-maleficence. Internet shills that compromise the "autonomy of thought, intention, and action when making decisions regarding health care procedures" may violate the autonomy principle in the eyes of medical ethicists. Beneficence, which "requires that the procedure be provided with the intent of doing good for the patient," might be challenged by shills advocating for procedures that do not serve the best patient outcome, such as unnecessary, high-profit procedures. Similarly, non-maleficence implies "do no harm," which could be violated by self-interested shills that influence patient or provider incentives.[54]

In medicine, shills have been known to promote the interests of sponsoring institutions above the interests of patients. In 2008, congressional investigators examined two reputable psychiatrists in Boston and determined both were part of an effort “to generate and disseminate data that would support use of an anti-psychotic drug, Risperdal, in children, a controversial target group.”[55]

References

  1. “Definition of SHILL.” Accessed January 24, 2022.
  2. Underwood, John. Expert Character Assassination.
  3. The New Yorker. Jul 08, 1933. Accessed January 27, 2022.
  4. Parkinson, Hannah Jane. “Click and Elect: How Fake News Helped Donald Trump Win a Real Election.” The Guardian, November 14, 2016, sec. Opinion.
  5. “Definition of SHILL.” Accessed January 24, 2022.
  6. Duan, Haibin, and Changhao Sun. “Swarm Intelligence Inspired Shills and the Evolution of Cooperation.” Scientific Reports 4 (June 9, 2014): 5210.
  7. “Definition of ASTROTURFING.” Accessed January 27, 2022.
  8. Lexico Dictionaries | English. “ASTROTURFING English Definition and Meaning | Lexico.Com.” Accessed February 9, 2022.
  9. Kovic, Marko, Adrian Rauchfleisch, Marc Sele, and Christian Caspar. “Digital Astroturfing in Politics: Definition, Typology, and Countermeasures.” Studies in Communication Sciences 18 (November 14, 2018): 69–85.
  10. Morstatter, Fred, Robert P Trevino, and Huan Liu. “Characterizing and Identifying Shills in Social Media,” n.d., 6.
  11. Cobain, Ian, and Nick Fielding. “Revealed: US Spy Operation That Manipulates Social Media.” The Guardian, March 17, 2011, sec. Technology.
  12. Cobain, Ian, and Nick Fielding. “Revealed: US Spy Operation That Manipulates Social Media.” The Guardian, March 17, 2011, sec. Technology.
  13. “A.G. Schneiderman Announces Agreement With 19 Companies To Stop Writing Fake Online Reviews And Pay More Than $350,000 In Fines | Eric T. Schneiderman,” September 26, 2013.
  14. Streitfeld, David. “The Best Book Reviews Money Can Buy.” The New York Times, August 25, 2012, sec. Business.
  15. Heath, Alex. “Facebook Quietly Updated Two Key Numbers about Its User Base.” Business Insider. Accessed January 28, 2022.
  16. Knowledge@Wharton. “Is Reddit the Most Influential Site on the Internet?” Accessed January 27, 2022.
  17. McGregor, Janhoi. “Reddit Is Being Manipulated By Big Financial Services Companies.” Forbes. Accessed January 27, 2022.
  18. “Expect PR Bullshit A….” Reddit Comment. R/Worldnews, April 8, 2015.
  19. Upwork. “Nft Shill - Promotion - Twitter and Instagram.” Accessed January 28, 2022.
  20. The Counter. “When Instagram Celebs Are Paid to Shill Sham Science,” July 5, 2017.
  21. Burch, Sean. “Instagram Wants Its Paid Shills to Be More Transparent,” June 14, 2017.
  22. Burch, Sean. “Instagram Wants Its Paid Shills to Be More Transparent,” June 14, 2017.
  23. Los Angeles Times. “Most Young Viewers Feel It’s OK When YouTube Stars Shill for Sponsors, Study Says,” March 29, 2016.
  24. Zhou, Wei, Junhao Wen, Qiang Qu, Jun Zeng, and Tian Cheng. “Shilling Attack Detection for Recommender Systems Based on Credibility of Group Users and Rating Time Series.” PLOS ONE 13, no. 5 (May 9, 2018): e0196533.
  25. Wang, Gang, Christo Wilson, Xiaohan Zhao, Yibo Zhu, Manish Mohanlal, Haitao Zheng, and Ben Y. Zhao. “Serf and Turf: Crowdturfing for Fun and Profit.” ArXiv:1111.5654 [Cs], May 18, 2012.
  26. Matthews, Jeanna. “Bots and Trolls Control a Shocking Amount of Online Conversation.” Fast Company, June 29, 2020.
  27. MIT Technology Review. “Nearly Half of Twitter Accounts Pushing to Reopen America May Be Bots.” Accessed February 9, 2022.
  30. Python Awesome. “A Simple Bot in Python That You Can Use to Shill (i.e. Send Messages) Your Token,” September 3, 2021.
  31. Spy, Word. “Sock Puppet - Word Spy.” Accessed February 9, 2022.
  32. “China’s Internet ‘Spin Doctors,’” December 16, 2008.
  33. “Bill Text - SB-1001 Bots: Disclosure.” Accessed February 9, 2022.
  34. Buffington, Kimberly. “Bots, Sock Puppets and a Reckoning for a Social Media Influence Peddler.” Internet & Social Media Law Blog, April 16, 2019.
  35. “What Is Sock Puppet Marketing? - Definition from WhatIs.Com.” Accessed February 9, 2022.
  36. Zhou, Wei, Junhao Wen, Qiang Qu, Jun Zeng, and Tian Cheng. “Shilling Attack Detection for Recommender Systems Based on Credibility of Group Users and Rating Time Series.” PLOS ONE 13, no. 5 (May 9, 2018): e0196533.
  37. Cobain, Ian, and Nick Fielding. “Revealed: US Spy Operation That Manipulates Social Media.” The Guardian, March 17, 2011, sec. Technology.
  38. Facebook Business Help Center. “Branded Content Policies.” Accessed February 9, 2022.
  39. “Branded Content on Instagram | Instagram Help Center.” Accessed February 9, 2022.
  40. Owen, Jonathan. “‘Blurred Lines’ - Closing in on the Influencer Frauds,” September 13, 2018. Accessed February 9, 2022.
  41. Field, Hayden. “‘Influencer Fraud’ Costs Companies Millions of Dollars. An AI-Powered Tool Can Now Show Who Paid to Boost Their Engagement.” Entrepreneur. Accessed February 9, 2022.
  42. “Meet the CGI Influencers That Are Fooling Everyone on Instagram.” Accessed February 9, 2022.
  43. Martínez Otero, Juan María. “Fake Reviews on Online Platforms: Perspectives from the US, UK and EU Legislations.” SN Social Sciences 1, no. 7 (July 21, 2021): 181.
  45. ABC News. “Yelp Outs Companies That Pay for Positive Reviews.” Accessed February 9, 2022.
  46. Android Headlines. “Multiple YouTube Accounts Are Being Hacked To Shill Cryptocurrency,” January 24, 2022.
  47. Arun Maini. “I Think Someone Just Got into My YouTube Account and Posted Something Did Anyone Manage to Get a Screen Recording?” Tweet. @Mrwhosetheboss (blog), January 24, 2022.
  48. Smith, Mike Deri. “Fake Reviews Plague Consumer Websites.” The Guardian, January 26, 2013, sec. Money.
  49. McKenzie, Ben, and Jacob Silverman. “Celebrity Crypto Shilling Is a Moral Disaster.” Slate, October 7, 2021.
  50. Wellman, Mariah L., Ryan Stoldt, Melissa Tully, and Brian Ekdale. “Ethics of Authenticity: Social Media Influencers and the Production of Sponsored Content.” Journal of Media Ethics 35, no. 2 (April 2, 2020): 68–82.
  51. Majadi, Nazia, Jarrod Trevathan, Heather Gray, Nazia Majadi, Jarrod Trevathan, and Heather Gray. “A Run-Time Algorithm for Detecting Shill Bidding in Online Auctions.” Journal of Theoretical and Applied Electronic Commerce Research 13, no. 3 (2018): 17–49.
  52. Steiner, Ina. “EBay Faces ‘Vertical’ Setback amid Shill Bidding Scandal.” EcommerceBytes (blog). Accessed January 24, 2022.
  53. Kauffman, Robert J., and Charles A. Wood. “The Effects of Shilling on Final Bid Prices in Online Auctions.” Electronic Commerce Research and Applications 4, no. 1 (March 1, 2005): 21–34.
  54. “Medical Ethics 101.” Accessed February 9, 2022.
  55. The New York Times. “Opinion | Expert or Shill?,” November 30, 2008, sec. Opinion.
