The “Wild World” of Social Media Content Moderation

When we think about entertainment today, the Internet is usually the first thing that comes to mind. Short videos and an endless stream of online information have swept into our lives. But there is far more going on behind the scenes than meets the eye. The Internet, this vast media community, offers people unprecedented convenience and connection, yet the same public communication that makes life easier has also left users deeply worried about the security of their online privacy (Karppinen, 2017).

Yet with every click we make, a backend system is quietly conducting its review. Are hidden dangers piling up, one after another?

Content Moderation: The Basis for People’s Participation in Online Activities

Content moderation underpins people's participation in online activities: it plays a vital role in maintaining the media community environment and in protecting users' safety and rights, and it is a topic of shared concern across society (Flew, 2021). When we publish a blog post or upload a video to TikTok or Instagram, the system has to review it, and it may remove the content for violating the rules or, at times, simply for attracting too few views. Moderation therefore matters not only to the person who posts but to every one of us. As participants in online communities, we watch, comment, like, and chat with family and friends, and all of this happens under the watchful eye of the platform's review system (Nissenbaum, 2018).

Billions of people use social media platforms every day. If each of them posts or shares just once a day, backstage reviewers face billions of items to check. Content moderation is therefore no easy task for online platforms: given the sheer volume of content published every day, platforms must review it at extremely high speed just to keep operating normally.

Do you know???
Most of the review work is done by ordinary people on meager salaries, not by advanced artificial intelligence or deep algorithms! Handling such enormous volumes at such speed, day after day, inevitably leads to mistakes.
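To get a feel for that workload, here is a rough back-of-envelope sketch in Python. The figures (items flagged per day, moderators on duty, shift length) are purely illustrative assumptions, not statistics reported by any platform; the point is only to show how little time each decision can receive.

```python
# Rough, illustrative arithmetic for the moderation workload.
# All numbers below are assumptions for the sake of the example,
# not figures reported by any platform.

posts_flagged_per_day = 3_000_000   # assumed items routed to human review daily
moderators_on_duty = 15_000         # assumed reviewers working that day
shift_hours = 8                     # assumed length of a review shift

items_per_moderator = posts_flagged_per_day / moderators_on_duty
seconds_per_item = (shift_hours * 3600) / items_per_moderator

print(f"Items per moderator per shift: {items_per_moderator:.0f}")
print(f"Average time per item: {seconds_per_item:.0f} seconds")
# With these assumed numbers, each reviewer gets roughly 200 items and under
# two and a half minutes per decision -- before breaks, edge cases, or appeals.
# At that pace, errors become a matter of when, not if.
```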

Recall: have you ever experienced an information leak? Strange harassing phone calls, odd spam text messages from unknown numbers?

In 2018, a security incident at Facebook exposed the personal information of nearly 50 million users: attackers exploited a vulnerability in the "View As" feature to steal access tokens and gain full control over user accounts (a short sketch after these examples shows why stolen tokens are so dangerous).
 
Also in 2018, it emerged that data from up to 87 million Facebook users had been improperly shared with the consulting firm Cambridge Analytica.
 
In 2019, the Federal Trade Commission (FTC) fined TikTok $5.7 million for violating the Children's Online Privacy Protection Act (COPPA) by illegally collecting personal information from children under the age of 13 on the platform.
 
In 2020, attackers discovered a vulnerability in Instagram that gave them unauthorized access to registered user accounts, exposing the personal information of millions of users, including email addresses, home addresses, and phone numbers.
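Why does a stolen access token hand over a whole account, as in the 2018 Facebook incident above? Most platform APIs treat a bearer token as proof of identity: whoever presents it is, as far as the server is concerned, the account holder. The snippet below is a minimal, hypothetical illustration of that general pattern; the endpoint URL and token value are invented and do not belong to any real platform's API.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical example only: the URL and header usage illustrate generic
# OAuth-style bearer-token authentication, not any real platform's API.
access_token = "EAAB...example-token"  # normally issued after a user logs in

response = requests.get(
    "https://api.example-platform.invalid/v1/me",         # fictional endpoint
    headers={"Authorization": f"Bearer {access_token}"},   # the token stands in for a login
    timeout=10,
)

# The server answers based solely on the token: whoever holds it can read the
# profile and private data, and often post on the user's behalf, no password
# required. That is why a leaked token amounts to a full account takeover.
print(response.status_code, response.json())
```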

The privacy and security of personal information online have long been a public concern, and incidents like these have kept coming in recent years. Is it really that hackers have become too powerful, or are there flaws in how platforms review and safeguard their systems? Either way, online platforms face a difficult challenge. Privacy is widely regarded as an inherent human right (Flew, 2021), and users' security and privacy deserve protection at every level.

Huge Workload: The Crisis of Inevitable Mistakes

The scale and speed of content review on social media platforms make errors inevitable, and the only way for platforms to review more carefully is to spend more on labor. At the same time, the secrecy of the auditing process leaves users with no clear picture of how moderation actually works. Many users have come to question the consistency and fairness of the system, which in turn fuels accusations of bias (Suzor, 2019). Platforms must therefore strike a balance between efficiency and nuance. Yet as participants in the media community, we cannot see these review regimes, so when outcomes differ we wonder whether we are being treated unequally. The result is a crisis of trust on the Internet (Flew, 2019), and it leaves us uneasy in media communities that always seem full of crisis and uncertainty.

Differential Treatment: The Special Status of Celebrities

In 2019, Trump published a post referring to migrants crossing the United States border as "invaders" and suggesting that law enforcement should respond with violence. The post violated the platform's hate speech policy, which prohibits content that promotes or incites violence, discrimination, or hatred based on protected characteristics such as race, ethnicity, or immigration status. Instead of removing it, however, Facebook said the policy does not apply to newsworthy content published by politicians. To some extent, the platform was indeed defending free speech, and society generally treats celebrity advocacy as valuable for raising problems and providing social benefit (Karppinen, 2017). But to the average person, this looks like plain bias. Many users came away feeling that the platform applies one content censorship system to public figures and another to ordinary people, and that in doing so it shortchanges ordinary people's rights.

In 2020, it was reported that TikTok moderators had been instructed to suppress content from users deemed "ugly" or "poor" in order to make the platform more appealing. For users, this amounts to an unequal review system: when it comes to freedom of speech, should there be any distinction between "beauty" and "ugliness"? TikTok has also deleted or restricted ordinary users' content on sensitive topics such as mental health or LGBTQ+ issues, while similar content from celebrities is left untouched.

The Needs of Users: What Are the Rules?

What responsibility does a company bear toward its users, and what guidelines do reviewers follow when moderating content? Repeated incidents of bias and differential treatment during review have left users unable to shake their doubts (Suzor, 2019). Cases like those above understandably make users feel that platforms run one review system for ordinary users and another for prestigious celebrities: on similar topics, the platform quickly catches and deletes content posted by ordinary people but chooses to overlook content posted by celebrities. Is that because celebrities spread newsworthy information? Does freedom of speech not apply equally to ordinary people and celebrities? Such incidents inevitably leave users uncomfortable. What, exactly, are the rules of platform content review, and can anyone give a standard answer?

When we register on social media platforms such as Facebook or Instagram, a pop-up window asks whether we agree to their terms of service, the rules by which these platforms operate. Terms of service documents are designed to protect the company's legitimate interests and give it almost absolute power over how the platform runs. In practice, this means social media platforms have the final say over what content is allowed, and users are bound by their rules.

Difficulties for the Platform: Balancing Freedom of Expression and Privacy

These terms not only restrict users but also protect them from harassment and spam. Yet a key problem remains in platform review: how to balance freedom of speech against policies on politics, pornography, violence, and more. Backend reviewers face a huge workload every day, and it is hard to make quick decisions under enormous pressure while weighing so many cultural, social, and political factors.

The review process also takes place largely behind closed doors (Suzor, 2019), leaving users in the dark about why content is removed or accounts are suspended. Critics have attacked large platforms such as Facebook for their opaque moderation systems and called for greater accountability and clarity in content decisions. Paradoxically, platforms hide much of the large-scale review process precisely in an effort to appear neutral. For the user, it is like groping around a completely dark room, and that is nothing short of difficult.

These issues have raised users' concerns about the regulatory frameworks and review systems of media platforms, and the public backlash against those platforms only reinforces the need to address them. Some of the incidents that shocked the public are hard to comprehend, such as the live streaming of violent events on social media or platforms' entanglement in controversial episodes. Their emergence illustrates the complex and important role digital media platforms play in shaping public discourse and behavior, and they have triggered further discussion and debate about how platforms conduct content review and governance.

Broader Implications of Platform Content Moderation: Society and Individuals

Platform content moderation affects not only individual users but also broader social issues such as privacy, security, and disinformation. Social media platforms exert significant influence over public discourse and can shape public opinion on a wide range of topics, and there are troubling differences in how they moderate content from different groups (Flew, 2021). A content review system that lacks transparency and accountability will inevitably leave users worried about their information security and privacy.

As platforms continue to grow and their user bases expand, transparent, fair, and clearly stated moderation rules become ever more urgent. External oversight and accountability mechanisms are needed to safeguard users' rights on media platforms and guarantee fair, impartial, and open content review.

Platforms Face a Critical Test: Developing a Coherent Approach

Media platforms need to face this tough test by adopting a consistent approach to content moderation. Under these pressures, the landscape of platform moderation is starting to shift. Platforms have to retain users who strongly demand transparency, privacy, and security guarantees, and they are increasingly aware that openness matters for both. Because of this, they are taking steps to explain their rules and processes more clearly, and the era of platforms that are unregulated or merely self-regulated appears to be drawing to a close (Flew, 2019). Facebook, for example, has published internal guidelines and says it is working to develop more user-friendly rules for its content review system. The implementation is still incomplete, and true transparency and comprehensive protection of users' privacy and security remain a long way off, but for users this is nonetheless welcome news.

The number of people involved in content review is enormous, and each reviewer brings different inclinations shaped by personal preference and cultural background. Uniform review standards are therefore crucial to an effective review policy. As platforms keep growing and the ranks of reviewers keep expanding, platform companies must define standardized vetting criteria to keep moderation fair for users and to sustain a positive community environment.
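To make the idea of a uniform standard concrete, here is a minimal sketch of an author-agnostic review rule in Python. The categories, placeholder phrases, and function names are invented for illustration; real moderation pipelines rely on trained classifiers and layered human review. The point is simply that the same written policy runs on every post, no matter who wrote it.

```python
from dataclasses import dataclass

# Illustrative policy categories and placeholder phrases; a real system would
# use trained classifiers, human review queues, and far richer policies.
BANNED_PHRASES = {
    "hate_speech": ["<slur placeholder>", "<incitement placeholder>"],
    "violence": ["<threat placeholder>"],
}

@dataclass
class Post:
    author: str          # who posted -- deliberately NOT used in the decision
    is_celebrity: bool   # present in the data, but the rule below ignores it
    text: str

def review(post: Post) -> str:
    """Apply the same written policy to every post, regardless of author."""
    lowered = post.text.lower()
    for category, phrases in BANNED_PHRASES.items():
        if any(p in lowered for p in phrases):
            # Same outcome for a celebrity and an ordinary user.
            return f"remove:{category}"
    return "allow"

# Two posts with identical text receive identical decisions,
# whether or not the author is a public figure.
print(review(Post("ordinary_user", False, "hello world")))  # allow
print(review(Post("famous_person", True, "hello world")))   # allow
```

Whether any such rule is actually applied this evenly is, of course, exactly what users cannot verify without more transparency from the platforms.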

Conclusion

Content moderation on media platforms is a complex task that has to be examined from many angles. Platforms are making efforts to build a good community environment, and although there is still a long way to go, we believe a better one is within reach. As users, we also need to understand and appreciate just how difficult reviewing content at this scale is.

Going forward, platforms must prioritize transparency, accountability, and fairness in their review processes to create a more democratic, safe community for all users, one that upholds freedom of expression while remaining respectful and secure. Balancing free expression with safety and transparency is hard, which only underlines how difficult it is to write uniform, standardized moderation rules. Even so, we firmly believe platforms will keep innovating and improving. Let us keep advocating for transparency, fairness, and accountability in content moderation whenever we interact on social media, so that a fairer and more transparent media community arrives sooner rather than later.

Let’s explore the wild world of content moderation together, and we look forward to welcoming a democratic, safe, and free-speech media community soon! 

References

Flew, T. (2019). Platforms on Trial. Intermedia, 46(2), 18–23. https://eprints.qut.edu.au/120461/

Flew, T. (2021). Issues of Concern. In Regulating Platforms (pp. 72–79). Polity.

Karppinen, K. (2017). Human rights and the digital. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media and Human Rights (pp. 95–103). Routledge. https://doi.org/10.4324/9781315619835

Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9

Suzor, N. P. (2019). Lawless: The Secret Rules That Govern Our Digital Lives. Cambridge University Press. https://doi.org/10.1017/9781108666428
