Social Media Companies Cannot Keep Content Moderation Secret Any Longer

It’s no secret that most online platforms are rarely transparent about major decisions, like how they control what appears in your feed and which ads you receive. Similarly, if you are a marketer or run a company page on Facebook, you may have seen such a page deleted without any explanation from the platform about the reasons for the moderation decision. This needs to change. Social media companies can’t moderate content in the shadows anymore, because opaque moderation enables censorship and may allow misinformation to spread unchecked.

History Of Social Media Content Moderation


If you rewind to 2016, you’d notice that social media platforms had very little involvement in quelling misinformation. Their early methods amounted to blocking certain content and accounts. Over the years, this developed into a more complex set of tools, including, but not limited to, deleting posts and removing content from recommendations.

Despite the development of these tools, misinformation remains a serious issue that plagues the online world, becoming especially harmful during crucial events like elections. Online misinformation arguably influenced the recent election; the viral #BidenCrimeFamily hashtag, for instance, may have swung some voters in Trump’s favor.

What Do Social Media Companies Do When It Comes to Content Moderation?

Social media companies have resorted to content moderation tools like blocking accounts, deleting posts, and more. However, this has puzzled journalists and researchers alike, because these efforts to eliminate misinformation lack transparency. There is no clear explanation of the companies’ reasoning or of the triggers that lead to such decisions. So many of their choices are shrouded in darkness, leaving users and journalists alike to fill in the blanks.


Pathway For Improvement


Misinformation is harmful to people’s lives. It is a severe problem that threatens public safety and public resources. It is therefore paramount for social media corporations to tackle it, not least because it is also bad for business.

Section 230 of the Communications Decency Act gives social media corporations legal cover to moderate content in good faith, including removing misinformation. However, many politicians are threatening to abolish the section altogether, largely to advance their own political interests, while they themselves exploit platform algorithms for propaganda. It’s important to retain the section unless a better policy is in place.

What We Need

Transparency is essential. Regulation without transparency concentrates too much power in the platforms’ hands, power that can be bought and used for selfish purposes. In the end, it’s the consumers who suffer the most, as victims of an entity they trusted.

There must be a collective effort from users: we must all demand transparency. Moreover, misinformation is largely shaped by who gets the final say on different subjects, and those people are usually wealthy and in positions of power. We can’t let them run the digital world and skew facts to their advantage. A safe space with minimal, reasonable moderation is best for everyone, and the digital world should promote shared values such as knowledge, care, equity, and democracy. If social media corporations won’t lift a finger to do what’s right, then users must work together to dismantle the online world as we know it and build a new version that does the least harm possible.


Still have questions, or want to know more about the latest tech news? Contact us today!