Conspiracy-driven violence is on the rise in the United States, and social media companies like YouTube are not doing enough to prevent the spread of dangerous misinformation. YouTube still operates as a breeding ground for radicalization, even after the FBI named fringe conspiracy theories among domestic terror threats in 2019. Kidnappings, a bomb threat, an assassination plot, and murder are among the heinous acts motivated by conspiracy theories in the past couple of years.
Eduardo Moreno, a train engineer from California, intentionally derailed a freight train last year in an attempt to strike the navy hospital ship Mercy. Moreno wanted to “wake people up” to unsubstantiated claims of suspicious activity aboard the ship related to COVID-19. Conspiratorial beliefs like this circulate on platforms like YouTube, often ending in tragedy and destruction.
YouTube played a key role in the rise of the QAnon conspiracy by lending the movement legitimacy through likes and subscribers. Moderators from 4chan, a leading message board for the alt-right, live streamed discussions of QAnon on YouTube and built a large network of followers on the platform. YouTube also enriches the creators of QAnon channels, and the prospect of advertising revenue and merchandise sales incentivizes others to launch conspiracy theory channels of their own.
YouTube benefits in this process too, because conspiracy channels draw large audiences for advertisers to the platform. The collective followers of two popular conspiracy channels, David Icke and London Real, could bring in over $40 million a year to YouTube and Facebook. Beyond advertising revenue, YouTube may also generate income from sales in London Real’s online store, which followers reach through links on the platform.
Extensive research has examined the dark side of YouTube’s recommendation algorithm. The algorithm pushes content it predicts the user will want to watch, which can draw users into a spiral of misinformation. For example, a viewer who engages with anti-vaccine videos may be steered toward a QAnon video promoting the same conspiracy. Although YouTube has decreased the number of QAnon videos it recommends, likely by cutting some QAnon channels out of its recommendation algorithm, conspiracies continue to circulate on the platform. This recommendation system is influential worldwide, generating more than 70% of the total watch time on the platform.
A study from the University of California, Berkeley, found that while YouTube’s efforts to crack down on specific conspiracies, like QAnon, were successful, others continue to thrive on the site. As long as harmful conspiracies flourish on YouTube, people remain vulnerable to the emerging phenomenon of conspiracy theory addiction.
Conspiracy theory addiction shapes the way a person interprets events and is correlated with broadly negative beliefs about trust and empowerment. The addiction is driven by a need for control, belonging, and understanding of the world around us. Unfortunately, early indicators show that belief in conspiracies leads to more confusion and isolation, the exact opposite of the original motivation.
Conspiracies typically have a game-like quality that draws people in and contributes to their addictive nature. With QAnon, “Q” would leave clues on a message board and followers would try to decipher the riddles. This enticing quality helps explain why nearly 50% of Americans believe in one or more conspiracy theories.
What can YouTube do?
One option is to follow in the footsteps of other social media sites like Twitter and put posts in context with links to accurate information. YouTube already implements “information cues,” links to Wikipedia pages debunking the conspiracy theories discussed in videos, but this tool is only as effective as the platform’s ability to keep up with new conspiracies. Moreover, the general distrust and paranoia common among conspiracy theorists may render these cues useless.
A more thorough approach is to ban all harmful conspiracy theory content outright. YouTube has stopped short of such a ban because it fears overstepping its role and eliminating videos that fall into a gray area. Conspiracy theories can be laced with truth, which makes the platform hesitant to institute a blanket ban.
The other obstacle is YouTube’s struggle to draw a line in the sand around what counts as harmful conspiracy content. Its current stance is limited to taking down videos about conspiracies that have already incited violence. YouTube should instead maintain a banned list of theories that have been cited as motives for violent and non-violent crime or linked to psychological distress. A more aggressive position is necessary because YouTube’s role extends beyond contributing to violence: distrust of the systems around us can lead to serious mental health crises and isolation from those who love us the most.
Allowing conspiracy theories to spread on YouTube is dangerous because they can serve as an entry point into a harmful world for viewers. YouTube needs to recognize how the platform grants false credibility to conspiracy theories through repetition and viewer-generated “likes.” It is time for YouTube to take responsibility for its authoritative role and stop generating revenue from content that leads to tragic acts.
The views expressed above are solely the author's and are not endorsed by the Virginia Policy Review, The Frank Batten School of Leadership and Public Policy, or the University of Virginia. Although this organization has members who are University of Virginia students and may have University employees associated or engaged in its activities and affairs, the organization is not a part of or an agency of the University. It is a separate and independent organization which is responsible for and manages its own activities and affairs. The University does not direct, supervise or control the organization and is not responsible for the organization’s contracts, acts, or omissions.