Earlier this month, Twitch temporarily suspended a channel called Patriots’ Soapbox. The channel is not enormous, generally pulling 20-50 concurrent viewers over the course of its 24/7 talk show-style streams, sometimes spiking above 100. But Patriots’ Soapbox is notorious: Some credit the organization behind it, which has bases of operation on many different platforms including YouTube and Discord, with having helped start the now-infamous QAnon conspiracy movement, while others acknowledge that it has at least been a key part of QAnon’s evolution and popularization. Twitch’s structure has so far kept Patriots’ Soapbox small, but there are other, potentially more pernicious avenues for conspiracies to take root on the platform.
QAnon is a vast far-right conspiracy theory that claims a number of notable politicians and celebrities are actually devil-worshipping pedophiles running a global child sex ring with the goal of extracting an imaginary drug called “adrenochrome” that can only be produced by torturing children. Only Donald Trump, who has declared himself a fan, can save them. It is patently, hilariously false, a quasi-religious movement that is partly a reaction to concerns about real issues like human trafficking and the growing power of wealthy elites, but mostly a backlash against progressive social movements (thus the inherent contradiction of deeming Trump—a wealthy, powerful, verifiably corrupt elite—the movement’s god-king).
For some adherents, it’s more than just a conspiracy; it’s an all-encompassing worldview that subsumes other conspiracies and beliefs to paper over the overwhelming complexities of modern reality with a simpler good/evil dichotomy. Since QAnon’s inception in 2017, its followers have gone so far down the rabbit hole as to commit actual crimes. The process of assembling an endless database of conspiratorial faux-knowledge has an easy appeal, providing people with a semblance of agency during an era of chaos. QAnon’s greatest strength is its malleability: It uses legitimate concerns as camouflage, inserting itself into mainstream conversations with hashtags like #SaveOurChildren, a tactic that bad-faith online movements have been employing since Gamergate and long before.
Patriots’ Soapbox has done the lion’s share of its work on platforms like YouTube and Discord, creating a bridge between the recesses of the 8chan/8kun internet and the mainstream (and even providing a platform for Q-friendly political figures like congressional candidate Lauren Boebert). But it has also maintained a Twitch presence where it re-broadcasts its real-time YouTube theorycrafting, in which on-screen personalities and chat frequently puzzle out supposed meanings from the eponymous Q’s nonsensical and false missives. Twitch chat, too, joins in. This format lets viewers be part of the QAnon project of collaborating to build an unassailable foundation of what, to followers, feels like research.
Based on Twitter activity from Patriots’ Soapbox and a number of followers, Twitch appears to have suspended the channel on September 9, specifically for violating its policies around “hateful conduct.” In a statement to Kotaku, a Twitch representative said that “the safety of our community is our top priority, and we reserve the right to suspend any account for conduct that violates our rules, or that we determine to be inappropriate, harmful, or puts our community at risk.” At some point during the past few days, Patriots’ Soapbox’s Twitch channel returned, meaning that it was only suspended, not permanently banned.
At the time of its suspension, Patriots’ Soapbox had been on Twitch for more than a year. Twitch only took action shortly after other platforms like Twitter and Facebook (ineffectively) cracked down on the QAnon conspiracy colossus in July and August, respectively. Twitch’s move was made from within the safe enclave of precedent, which is a common tactic among large social platforms. Arguably, however, all of these platforms waited too long, with only Reddit banning QAnon forums back in 2018, before the conspiracy spring-boarded off the aforementioned online platforms and into the mainstream consciousness. Even now, it is not hard to locate over 200 videos and streams on Twitch that, in their titles, include truncated versions of QAnon’s trademark slogan, “Where we go one, we go all” (which mystifyingly comes from the 1996 Ridley Scott film White Squall).
Most of those videos, streams, and streamers are as small as Patriots’ Soapbox’s Twitch channel, many smaller. This gets to the heart of why Twitch likely did not see the need to move quickly when it came to acknowledging Patriots’ Soapbox and why it continues to play host to other QAnon streamers. QAnon does not have the kind of foothold on Twitch that it has on other platforms, even others frequented by younger audiences like TikTok. But Twitch and its underlying culture are much more complicated than just which channels are big and which ones aren’t. This explains why Twitch remains both uniquely inoculated against and vulnerable to QAnon-like conspiracies.
Compared to other platforms that are now struggling against full-on QAnon infestations, Twitch is unique in a handful of different ways. Foremost among them, Twitch is not powered by a series of algorithms that effectively create pipelines between conspiracies and the personalities who propagate them.
“If you’re someone who wants to spread a message in support of Q, the fastest way to make sure lots of people see it is not to go stream it on your Twitch channel that has five viewers,” Will Partin, a researcher at Data & Society specializing in misinformation and disinformation (and an occasional Kotaku contributor), told Kotaku over a Discord voice call. “It’s to make a tweet, hit it with the ‘Where we go one, we go all’ hashtag, and then let the audience on Twitter pick it up and go wild.”
Dan Olson, a YouTuber and occasional Twitch streamer who recently released a long-form critical video called “In Search Of A Flat Earth” that (spoilers) is actually about QAnon’s insidious spread, took it a step further, saying that the central obstacle on Twitch isn’t just a lack of algorithmic automation, but the particular style of organization that takes its place.
“I think the big thing that’s kept [QAnon] away from Twitch is all the reasons why succeeding on Twitch is such a nightmare,” Olson told Kotaku in a DM. “The way that stream categories are compartmentalized makes it really hard to float your ‘let’s decode Q drops’ stream, and the fact that Twitch has basically no archive of past streams unless you’re already a successful streamer makes it that much harder to ‘pill’ [or convert] randos who wander by when you’re not online.”
Twitch, for better and worse, is a constantly teetering balancing act of manual organization. Stream categories largely correspond to individual video games or activities like “Just Chatting” and “Music.” Viewers can click into these categories and select from streams suggested by a very simple recommendation system, but most instead opt to sort streams by concurrent viewers, from high to low (until recently, this was Twitch’s default). This means that it’s exceedingly difficult for streamers to even temporarily catapult their way into a higher tier of visibility through a viral moment or popular video. A smaller channel will likely not suddenly be surfaced to somebody who’s not looking for it. Twitch users can create and share “clips” of their favorite moments, but these aren’t linked to other content by an algorithm that susses out viewers’ interests in the same way as on, say, YouTube. Sharing, too, is much more manual. Many clips blow up because they’ve gotten big on outside websites and forums, like popular subreddit Livestreamfail. The net effect of all of this is that Twitch does not naturally lend itself to linked webs of conspiracy.
Twitch’s approach to content moderation is also more directly hands-on than that of other platforms. If a channel breaks the rules, an in-house moderation team (as opposed to an outsourced one, or artificial intelligence) reviews what occurred and decides whether or not to suspend the channel in question. Sometimes, this leads to inaction, even where overt terms of service violations or conspiratorial content is concerned. Patriots’ Soapbox is one example. Another is President Donald Trump’s own Twitch channel, which has been airing rallies, speeches, and the occasional panel since October of last year. In June, around the same time that Reddit banned the notorious “The Donald” subreddit, Twitch temporarily suspended Trump’s channel for hateful conduct, but not before its largely un-moderated chat became a hotbed of racism, sexism, and QAnon conspiracy theories.
So it’s not that these conspiracy theories don’t exist on Twitch; it’s that, so far, they have largely been constrained to expected channels without a reliable means of seeping out.
“You’ll definitely get drive-by holocaust denial commenters,” said Olson, “and there’s piles of streamers who hold conspiratorial beliefs, but it tends to be a secondary thing where a person who absolutely is an authentic WoW streamer suddenly breaks out all their weird opinions about bitcoin.”
Twitch is capable of acting quickly against overtly objectionable channels. On September 11, Enrique Tarrio, a key figure in the violent neo-fascist Proud Boys organization (which has collaborated with QAnon believers before and contains some itself), announced that he’d started a Twitch channel, which was promoted by the Proud Boys organization during its annual gathering in Las Vegas. That same day, a researcher of the far right who goes by the handle We Will Be Ruthless on Twitter publicly questioned Twitch’s decision to allow this. According to AntiFash Gordon, an activist and researcher who works alongside others to deplatform far-right extremists, this led many people to report the channel to Twitch. A few hours later, Twitch banned the channel. This all occurred in under 24 hours, a much faster turnaround time than many Twitch bans—even ones that involved harassment of the platform’s stars.
“We don’t like fascists having any platform, and as researchers, we’re all aware of how critical Gamergate was for the alt-right’s ability to attract new members in the lead-up to the 2016 election,” said AntiFash Gordon in a DM to Kotaku. “So we’re especially wary of the far-right trying to recruit among gamers.”
In the cases of Tarrio and Trump, Twitch mostly reacted to obvious figures known for spreading misinformation and hate. Given the way QAnon and its spore-like sub-conspiracies work—glomming onto the concerns of particular platforms and populaces and mutating to meet them where they’re at—its potential spread might not stand out so much. Julian Feeld, co-host and producer of the QAnon-researching/lampooning “QAnon Anonymous” podcast, thinks that big or growing Twitch streamers would have to open the door to wider discussion of QAnon on the platform.
“I can absolutely see it moving to Twitch, not in a conscious way, necessarily,” Feeld told Kotaku over a Discord voice call. “I think that’s where we have to look for the next wave on Twitch: people who are about to get very big or are already big. Because if you remember at the beginning of QAnon, when everything was a bit more pedestrian and we weren’t really talking about the platforms yet, [actress] Roseanne Barr came out in public and even though she never became a Q influencer, she fed a kind of movement of people going ‘Wait a second: Roseanne Barr believes that?’”
It would not be unprecedented. In May of this year, then-Twitch star Guy “Dr Disrespect” Beahm spent a chunk of a stream reading through debunked coronavirus conspiracy theories and watching a widely debunked video suggesting that 5G cellular technology causes covid-19, leading many other streamers to also discuss coronavirus conspiracy theories in the following weeks. Beahm was banned from Twitch in June for reasons that are still not publicly known, but that Kotaku has verified did not pertain to his sharing of conspiracy theories. In the wake of that ban (and the ensuing speculation about what happened), some fans resorted to conspiracy theories of their own. Some accused various female Twitch streamers of somehow being responsible (they weren’t), while others went full-QAnon, tagging messages of support on Twitter with the “Where we go one, we go all” hashtag or speculating that Beahm had either uncovered some element of the nefarious child sex ring operation or was part of it. None of these things were true.
Still, as a previous Kotaku investigation into conspiracy culture on Twitch uncovered earlier this year, Twitch’s lack of transparency and unwillingness to divulge sought-after answers creates information vacuums. Viewers fill these with conspiracy theories—often about Twitch streamers and employees (especially where female streamers are concerned).
Because of its structure as a platform with a calcified upper class of popular personalities who, by virtue of that popularity, dictate many of its norms, Twitch is largely impenetrable to those outside the central Twitch community. This, according to Feeld, is why Patriots’ Soapbox has failed to gain much traction; it functions as a tiny island cordoned off from Twitch’s continent of creators and memes, a boomer-run exercise in shouting “How do you do, fellow kids?” into the void. Recent streamers who have broken the standard gamer mold but still managed to get big on Twitch—chess grandmaster Hikaru Nakamura and leftist political commentator Hasan Piker, for example—have done so by integrating into the Twitch community, learning the meme-laden lingo and collaborating with preexisting stars. No full-on conspiracy theorists have done this yet. That does not mean they can’t, however. Though TikTok’s algorithm-centric nature offers more mobility for creators than Twitch, that platform does provide a blueprint for how conspiracies could take hold among Twitch’s younger-skewing set of creators and viewers. It’s all about appealing to that particular community’s priorities and beliefs.
“[QAnon influencers] were able to exploit Twitter and YouTube properly, but there are some platforms, new platforms, where they’re not able to do it,” Feeld said. “Originally the theory was that Hillary Clinton was molesting people in a basement because it came from the Podesta emails and all that, but the new generation just skip the Podesta emails and go straight for [the idea that] Justin Bieber is screaming for help in this ‘Yummy’ [music] video. He’s trying to tell us something about these [powerful] people, and it’s that they’re eating children and babies, and they’re using hot dogs or pizzas as symbols… Even on TikTok, it’s not the old Q influencers making a name [for themselves]. It’s brand new kids who are getting pilled through other shit, and they become big influencers. [Trump-supporting TikToker] Judith Rose is a good example of that. She’s a rising star right now for sure.”
Feeld added, however, that even in that case, the demographics are still different: “The new generation [of QAnoners] tend to be relatively new born-again Christians or people who are super into meditation or New Age stuff, or self improvement and stuff. They don’t usually participate in Twitch culture on their way to becoming a QAnon person, but there’s obviously exceptions. There are the original 4chan nerds who are obviously way more kind of attuned to this stuff. But I mean, I think the 4chan guys [go on Twitch] more to, you know, get women who show cleavage banned or whatever the latest shit is.”
With QAnon now a mainstream mainstay, the platforms that allowed it to fester for so long can, at best, do damage control. But, as with pizzagate, flat earth, anti-vaxx, and covid denialism before (and still very much adjacent to) it, there’s a clear pipeline for ludicrous-sounding conspiracies to warp entire populations’ perceptions. So what can platforms do now? What should they be doing to lessen Q’s continued influence and halt the rise of whatever the big conspiracy, the next Q, is?
Partin thinks proactiveness is key. “It’s a lot harder to debunk information that’s already in the world than it is to get ahead of something and inoculate people with high-quality understanding and context that is then going to be resilient to sort of the low-quality or problematic information that comes along,” he said. “So in some ways, it’s kind of too late for Q in that sense, but… What would it have looked like for a company to really play heads-up ball with, like, the Wayfair stuff the second it popped up, asking ‘Alright, who is susceptible to this? What can we get in front of them to try and prevent this from becoming a broader conspiracy?’ But of course that’s really hard. It’s a major investment of resources. But I think being proactive, getting in front of it, will not be a universal solution, but it will be part of the solution.”
Feeld thinks that platforms will always be beholden to their own, typically reactive corporate interests, which makes them inherently unreliable when it comes to stopping conspiracies before they have a chance to take root. As soon as platforms start going after entities most would identify as dangerously far-right, they’ll have an incentive to crack down on progressive outlets as well, even if there’s not an actual equivalence there (Facebook’s recent ban of some QAnon and right-wing militia groups also took out a slew of antifascist and anti-capitalist news pages, for example). He believes that longer-term solutions have to start at the societal level.
“When you talk about solutions to problems like this, it’s a bit like when you talk about child trafficking,” he said. “We had a [podcast] episode about child trafficking recently, and we explored how to solve it. You should not be scanning the airport for children who look weird with their parents or whatever; you should be investing in anti-poverty measures, a place to sleep for underage homeless people who are having to trade sex for a safe place, and those kinds of things. A lot of these are systemic answers so that people don’t get to the point where they’re completely primed for this by political extremism all around them, including Fox News and the President. We have to stop it before it gets to that point. You can crack down on the community, of course, and I think it’s useful. Every time you knock down Stormfront, it’s a good thing. But is it a solution? No, it’s a management of the symptoms.”