Moderators Mental Health Negatively Impacted by their Job

Moderators work on every social media platform, including Facebook, Instagram, and even the Art Crime Archive. Their job is to sift through the objectionable content that users try to upload and block it from being posted. It is a job like any other: they work nine to five, deciding whether or not to allow each piece of content to be displayed. You can begin to imagine what kind of content these moderators are sifting through. In an article posted on Jolt Digest, the authors explained how moderators are exposed to child pornography, violence including but not limited to beheadings, and hate speech on a daily basis. There is mounting evidence that the intensity of these graphic images is damaging the mental health of the moderators who view them. With more and more people uploading content to social media every day, the demand for moderators keeps growing. On top of that, moderators must meet quotas of thousands of photos a day just to keep their jobs.

In a documentary titled The Cleaners, former moderators were interviewed about the job. One woman explained that the images she saw were so graphic that she had to quit because she was becoming depressed. Another woman sued TikTok over how much her work as a moderator affected her, claiming that no effort was made to blur the harmful content that moderators are required to view.

What would happen if we didn’t have moderators? Social media users, including children and adolescents, would be exposed to all of the violent content that moderators currently absorb. Usage would fall dramatically, because parents and individuals would abandon these platforms to avoid exposure to graphic images. That scenario is clearly not realistic, so moderators are necessary and crucial to have. One suggested way to ease moderators’ distress is to screen applicants and fully inform them of the kinds of content they may encounter before they take the job.

There are many issues at play here. The root of it all is that there are individuals who want to upload these graphic images in the first place. Short of raising awareness about the harm such images cause, there is no way to eliminate these individuals; some of them derive excitement and pleasure from the very material they attempt to spread online. You could try to track them down and report them to the police, but they commonly remain anonymous, and the anonymity of online platforms makes it remarkably easy for hateful and disturbing images to circulate. There is no clear way to solve this problem because so much of it is out of our control. We could make AI the new moderators so that human beings are no longer harmed, but then thousands of jobs would be lost.

Overall, it is clear that being a moderator is a difficult job. Moderators are exposed to hateful, violent, and disturbing images that damage their mental health. The art crime being committed here is the posting of these disturbing and violent images, which harm the people who see them. It can be viewed as an attack on social media, a platform that enables the spread of a great deal of beautiful and beneficial artwork. For the sake of the mental well-being of all moderators, it is important that we keep trying to find a better solution.