Reddit recently announced the success of a new content policy it unveiled in June. Designed to combat hateful content on the platform, the update lays out strict guidelines against hate speech and culminated in the removal of nearly 7,000 subreddits. The company is celebrating an 18 percent drop in hateful content since the policy rework. But while Reddit rejoices, users from marginalized communities say the abuse hasn’t stopped. “There are still, to date, many places on Reddit that don’t feel like safe spaces,” Reddit user u/DSporachd wrote. “We haven’t seen any reduction in homophobic, transphobic or racist hate even after the admins took all of this supposed action… so far, these problem users haven’t been effectively dealt with and are still a pervasive issue on the site, especially on subreddits for minority communities.”
Targeting Diverse Voices
By day, 43-year-old Jefferson Kelley is an IT network technician troubleshooting broken network components, but in his downtime, he detangles the crossed wires of an internet community bristling with faulty connections. For the past three years, Kelley has been a moderator of r/BlackPeopleTwitter, one of Reddit’s largest communities. With over 4 million subscribers, the subreddit is among the platform’s most popular, but for a space prioritizing the voices and commentary of Black people, combating racist vitriol is baked into the experience. “Ever since I became a moderator, I was able to see the daily struggle that moderators of a Black-centered community have to go through… if something positive about someone Black or a person of color reaches the front page we just get an absolute flood of racist comments,” he said during a phone interview. “The most we can do is remove the comments and ban the user, but it takes them a few seconds to create a new account and come back to do the same thing,” Kelley continued. “Whenever we get to the front page [hate comments go] through the roof.” The Reddit front page, known as r/all, compiles the day’s top posts from across subreddits via the site’s algorithm. Designed as a place for users to congregate and experience the best of different communities, it has become a point of contention for spaces built around historically marginalized groups, according to Kelley.
Trolling Continues
Moderators of other communities echo similar feelings. r/RuPaulsDragRace is a subreddit for superfans of the Emmy Award-winning reality competition show of the same name. Because the show features the art of drag, the subreddit naturally highlights queer and trans artists, which moderators say is a recipe for disaster on the Reddit front page. Speaking through direct messages under the anonymity of their usernames, these moderators were candid about their experience with some of Reddit’s most virulent users. “If you’re an a**hole looking to troll, you’re going to head to r/all and then beeline right for that drag queen. It’s where drive-by comment trolls go to fish for targets,” wrote moderator u/VladislavThePoker. “And it happens every time we hit r/all. We’ve been called every name in the book, threatened, and doxxed.” According to moderator u/DSporachd, the subreddit was flooded with homophobic, racist, and transphobic comments after the untimely death of RuPaul’s Drag Race fan favorite Chi Chi DeVayne. The comments mocking the deceased and the subreddit’s users are only a snapshot of the content these moderators are responsible for regulating on the platform.
A Long Way to Go
Even with Reddit’s recent efforts, harassment and mistreatment remain part and parcel of being a minority on the platform. Reddit administrators can cheer change, but those most involved in the communities paint a bleaker picture, one that speaks to the larger issue of Reddit’s previously hands-off approach to moderation policy. In the post-Black Lives Matter era, it seems the social media powerhouse might finally be taking its platform seriously. And while other moderators are less sure about the policy’s success, Kelley is more optimistic about the future. “The core of the platform has become a weapon of anonymity, but there’s a change now and it’s because these admins have stopped taking this stance to leave it up to us moderators to run the nuthouse,” Kelley said. “There’s been a lot more productivity and not as many hoops to jump through to get admin intervention, so what’s happening is more of a crackdown.”