Internet cesspool 8chan should be allowed to fester

In the wake of acts of mass violence that have become all too regular, it is tempting to try to wipe out the digital roots of the white supremacist rhetoric that has catalyzed these tragedies. Specifically, the internet forum 8chan has come under scrutiny, with the shootings at Christchurch, Poway and El Paso having been announced and later applauded by users on the site (New York Times, “‘Shut the Site Down,’ Says the Creator of 8chan, a Megaphone for Gunmen,” 08.04.2019). Media attention given to mass shootings has declined as we face distressing phenomena ranging from climate change to impeachment. Even so, the underlying cause of acts of mass violence—hateful rhetoric fermenting in the darkest corners of the internet—persists, and so should our attention to these ideologies. Though the El Paso and Dayton shootings have slipped out of the public eye, the evil behind these tragedies endures.

After these mass shootings, this lair for internet incels and soon-to-be infamous murderers has come under attack from government agencies and various denizens of the web. For example, the House Homeland Security Committee subpoenaed the website’s current owner, Jim Watkins (The Verge, “8chan owner Jim Watkins subpoenaed to testify before House committee,” 08.14.2019), and even the site’s creator, Fredrick Brennan, told the New York Times that it was time to “shut the site down.” Moreover, 8chan has suffered blows to its infrastructure, with web security platform Cloudflare terminating its service after the El Paso shooting (The Cloudflare Blog, “Terminating Service for 8Chan,” 08.05.2019). In an age in which the individual’s voice is amplified by social media and the internet, and in which we have a stronger-than-ever ability to proclaim what is acceptable and what is not, a new question arises: What are we to do? Allow these harbors of hatred, or dismantle them?

Banning 8chan and websites like it is too easy a “solution.” We cannot simply shut down websites whose content we do not like, even if that content is hateful and even if posts like the El Paso shooter’s originated there. The reason, however, is not to defend the First Amendment. Not only is white supremacy a plague that transcends borders, and therefore cannot be confined by the laws of any one nation, but the First Amendment does not protect true threats like those posted on private websites like 8chan anyway (Middle Tennessee State University, “First Amendment Encyclopedia: True Threats,” 2017).

There is a reason we ought not concern ourselves with stomping out the anthills that are websites like 8chan. It goes like this: Clusters of hateful rhetoric will inevitably spring up on the internet until hatred itself is annihilated, so we ought to take advantage of white supremacy’s hiding in broad daylight before it is driven into dark shadows, where it can fester, morph and prepare for the next attack without the general public being any the wiser. 8chan, 4chan and dwindling subreddits sat under the spotlight that is the surface web (the parts of the internet that users can freely reach through search engines, as opposed to the deep web or darknet). As much as the U.S. government is failing in its efforts to combat hatred, there is still a general awareness of where proponents of this hatred linger. If 8chan were taken down, the bastions of hate would further diversify, and internet extremists would simply gather in darker corners. This risk is corroborated by CNET reporter Oscar Gonzalez, who notes that while 8chan’s popularity has declined since its inception, “there are many other derivative chan boards, such as Endchan, 7chan and Dreamchan. None match 4chan’s popularity, but some have active, if small, communities.” (CNET, “8chan, 4chan, Endchan: Here’s what you need to know,” 09.06.2019). For each internet domain stomped on, the oil fire only spreads elsewhere.

At least, this is the case for the Americans who use 8chan, argues BuzzFeed contributor Ryan Broderick. Referring to 8chan less as a specific website and more as a stand-in for hateful ideology altogether, Broderick argues, “Shutting down [8chan] is unlikely to eradicate this new extremist culture, because 8chan is everywhere.” That is, the white supremacy that has become so permissible in American society is already festering beyond the reach of any simple, one-step elimination. It’s deeply and culturally embedded.

This migration of users from one website to another is already familiar to the observant eye. For example, Reddit used to host many more hateful subreddits covering topics now popular on 8chan, like violence against women and the celebration of incel culture. Though the exact migratory paths of users are murky, when Reddit cracked down on its more dangerous subreddits, 4chan slowly blossomed into “one of the worst places on the internet.” (The Daily Dot, “The most disturbing controversies in 4chan history,” 04.27.2017). Likewise, when 4chan heavily moderated forums discussing the GamerGate controversies—in which online hate groups attacked feminism in video games—many silenced users flocked to 8chan, which New York Times reporter Kevin Roose described as becoming “a catchall website for internet-based communities whose behavior gets them evicted from more mainstream sites.” (New York Times, “‘Shut the Site Down,’ Says the Creator of 8chan, a Megaphone for Gunmen,” 08.04.2019).

The point is, when a bigger venue for discourse is shut down, its users creep to smaller ones. Eventually, the dispersal becomes too much for the public eye to follow, and we lose track of where hateful individuals digitally congregate, severely limiting our collective ability to find, understand and negate hatred at its ideological roots. I’ve written before about the importance of keeping our ideological enemies within eyesight, in the context of allowing mass shooter manifestos to circulate (Miscellany News, “Censoring mass shooter manifestos ignores root of crisis,” 04.03.2019). Censoring manifestos, like that of the Christchurch shooter, effectively masks the fleshed-out “Great Replacement” ideology, giving it a cloak under which it may spread.

In the context of websites like 8chan, we would suffer the same consequences by forcing the site’s users into further obscurity, where they may plot in private. The perpetrators of the Christchurch, El Paso and other shootings had already been driven out of communities like Reddit and 4chan, and the public only picked up on their violent posts in the form of headlines cherry-picked from lesser-traveled internet locales after the shootings had already taken place. This is not to advocate for elevating or publicizing the hate speech that motivates mass violence. But when hate speech and ideology sit in spaces, digital or physical, where the public can find them, the public can combat them. Thanks to the internet, we are in a position to keep ourselves safer.

