The Reddit Mods/Moderation is Out of Control

Last week Reddit suffered a major outage, which motivated me to write this post. Reddit, like much of the internet, has followed a trajectory of decline over the past decade or so. It’s tempting to blame the admins or political bias, and although those play a role, the main problem is the mods of individual subs: too much domain and keyword filtering, arbitrary removals of content, algorithmic removals and filtering, and so on.

Political bias does not explain the removal of content even when it agrees with the putative objective or ideology of the sub it was posted on. I’m not talking about trolling or poor fit (e.g. posting pro-DeSantis stories on a Bernie Sanders sub).

Contributing to Reddit not uncommonly means reading what is effectively a user’s manual of what is disallowed, or what guidelines must be met, for the particular sub one wishes to contribute to, in addition to Reddit’s own sitewide rules, which over the years have also become stricter. So, two sets of rules. It’s understandable that guidelines are needed to prevent spam and abuse, but enforcement has become far more onerous and capricious.

Such rules include correctly flairing one’s posts and/or requesting flair; using certain mandatory titles, prefixes, or tags; meeting some unspecified karma threshold; making sure the post meets a minimum length, which sometimes requires adding filler; avoiding certain words that trigger the automated removal of the post; and so on. It becomes a game of “Mother May I,” in which you sometimes have to submit the post multiple times to figure out how not to trip a certain filter, like post length or a mandatory prefix. Some subs have very strict filters (say, a minimum of 500 characters or 1,000 karma), but they are undisclosed until after you have made the post, which is deleted, so you have to make it again. And even this is no guarantee: content that gets up-voted and follows the rules to a “T” is still removed for no apparent reason.
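To make the mechanics concrete, here is a minimal sketch of the kind of automated gatekeeping described above. Everything here is illustrative: the thresholds (500 characters, 1,000 karma), the banned words, and the `[Discussion]` prefix are hypothetical, not taken from any real sub’s configuration, and real subs implement this through moderation tooling rather than code like this.

```python
# Hypothetical sketch of an automated sub filter. All thresholds and
# trigger words below are made up for illustration.

BANNED_WORDS = {"crypto", "giveaway"}   # hypothetical keyword triggers
MIN_BODY_LENGTH = 500                   # hypothetical minimum post length
MIN_KARMA = 1000                        # hypothetical (undisclosed) karma floor

def auto_remove(post_body: str, author_karma: int, title: str) -> list[str]:
    """Return the (typically undisclosed) reasons a post would be auto-removed."""
    reasons = []
    if len(post_body) < MIN_BODY_LENGTH:
        reasons.append("body shorter than minimum length")
    if author_karma < MIN_KARMA:
        reasons.append("author karma below threshold")
    if any(word in post_body.lower() for word in BANNED_WORDS):
        reasons.append("body contains a filtered keyword")
    if not title.startswith("[Discussion]"):
        reasons.append("missing mandatory title prefix")
    return reasons

# A short, low-karma post with the wrong title trips several filters at once:
print(auto_remove("Great article on rate hikes.", author_karma=50,
                  title="Fed raises rates again"))
```

The point of the sketch is that a perfectly reasonable post can fail three or four of these checks simultaneously, and the poster only discovers each one by resubmitting, since none of the thresholds are published up front.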

I tested this myself by submitting a dozen links to various subs. The links were to blogs pertinent to the chosen subs, with a long history of up-voted submissions to those particular subs (not off-topic or ‘troll’ posts).

After a day, only about half of the submissions survived. Most were removed within minutes or instantly, some even after getting up-votes from other users. This was with a very old, well-established Reddit account with considerable karma, so I can only imagine how much worse it would be with a new account. One such submission got a couple of up-votes but was still removed within an hour:

This got a score of “5” but did not survive to see the next day:

I personally was not banned or warned on any of the subs, so it’s not as if they had a problem with me per se; for whatever reason, the content was just not good enough. I don’t really care that much. I wanted to see if it is as bad as it was the last time I tried, a couple of years ago, and it is.

It’s out of control. I can understand some links being removed. But half of them? Especially when they are on-topic, from decent/reputable sources, and not full of ads?

Moderating is something you do, like a job or any other activity, not something to be completely outsourced to algorithms. If mods don’t want to moderate, the admins should remove them and replace them with new moderators who will.

The usual counterargument is along the lines of “people have no idea how much spam popular subs get,” which may be true, but moderating is a job, which means it entails work. If you don’t want to do the work, quit. And many even less popular subs still have tons of filtering. Besides, just because there is a lot of spam does not mean the user experience has to be shit, or that users must be presumed to be spammers. Email obviously has a lot of spam, yet it still works well.

There is no shortage of people who want to be mods, especially for large subs. Being a mod for a large sub is a coveted position despite how overworked or unpaid moderators claim to be, which leads to a contradiction: if being a mod is so much work and so underappreciated, why are mods so unwilling to quit? Why do we have to settle for the same fixed roster of mediocre mods who don’t want to do their jobs well and seem to have contempt for the users, when there is potentially a huge pool of better, more qualified users who can and want to moderate? The problem is not a shortage of willing workers.

The notion of Reddit mods as selfless unpaid volunteers is misleading. There have been instances of moderators selling their influence or being paid to remove unflattering or negative content. Mods also wield considerable gatekeeping power to shape discourse, such as around Covid or other newsworthy events, by deciding what content or discussion is allowed.

In trying to diagnose the problem, I think what happened was that trolls and other bad-faith posters (not blaming any one side) ruined Reddit by creating a low-trust environment, first in 2016–2017 during the election, and then worse in 2020 during Covid. Each of these was a ratcheting event that made censorship worse, not just in regard to Covid or politics but on Reddit overall. Then there are also the various crackdowns and sub purges that come in waves, as in 2018 or 2021. Low trust means the presumption of guilt.

The tendency as time goes on is for communities to become increasingly moderated and censored to the point of being useless, because rules are like an escalator: you cannot go back down.

There are many subs that have succumbed to this fate. And it’s not as if new subs can easily replace them, due to winner-take-all effects. Alternative subs that are more lightly moderated find it hard, if not impossible, to gain enough traction. If you want to discuss economics, you’re stuck with /r/economics. There are smaller ones, but /r/economics has dominated as far back as 2015. If you think work sucks, there is /r/antiwork, which, despite suffering about the worst PR disaster possible, has not seen its subscriber growth slow much, and the sub is still very popular:

Despite the rise of TikTok, Instagram, Telegram, etc., the ‘Reddit sub hegemony’ has not been disrupted. It’s still the same moderators and subs on top. It shows that scarcity will always exist no matter how much abundance or choice technology promises. There will always be funnels and the ‘canonical collection’. For the same reason, the top bands of 20–50+ years ago continue to sell out venues. The same books keep selling. For decades, Warren Buffett has been regarded as the greatest investor ever.

[As told in Alice Schroeder’s 2008 biography, The Snowball: Warren Buffett and the Business of Life, by the turn of the millennium Warren Buffett’s star had begun to fade, as he had missed out on the ’90s tech boom. This was a pivotal moment. What happened was that the tech stocks crashed, most never recovering, and then 9/11 shifted attention away from technology even further. This made Berkshire Hathaway much stronger, relatively speaking, and cemented Buffett’s reputation.]

But to bring this back: Reddit bears no resemblance to how it used to be, when it was not trying to be the ‘front page of the internet’ but was just a place for people to share links and discuss things, without having to walk a tightrope for the privilege of not having your content deleted. Maybe Reddit cannot be saved, nor should it be. Or maybe Reddit has simply evolved into some new state of badness that we have to either take or leave.