Moderation system

On Internet websites that invite users to post comments, a moderation system is the method the webmaster chooses to sort contributions that are irrelevant, obscene, illegal, or insulting from those that are useful or informative.

Various types of Internet sites permit user comments, such as Internet forums, blogs, and news sites, often powered by scripts such as phpBB, wiki software, or PHP-Nuke. Depending on the site's content and intended audience, the webmaster decides what kinds of user comments are appropriate, then delegates the responsibility of sifting through comments to lesser moderators. Most often, webmasters attempt to eliminate trolling, spamming, or flaming, although this varies widely from site to site.

Social media sites may also employ content moderators to manually inspect or remove content flagged as hate speech or otherwise objectionable. In the case of Facebook, the company increased its number of content moderators from 4,500 to 7,500 in 2017 as a result of legal and other controversies. In Germany, Facebook is responsible for removing hate speech within 24 hours of its being posted.[1]

Supervisor moderation

Also known as unilateral moderation, this kind of moderation system is often seen on Internet forums. A group of people is chosen by the webmaster (usually on a long-term basis) to act as delegates, enforcing the community rules on the webmaster's behalf. These moderators are given special privileges to delete or edit others' contributions and/or exclude people based on their e-mail or IP address, and they generally attempt to remove negative contributions throughout the community.
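
As an illustration, a forum backend could implement these privileges with a simple permission check plus ban lists keyed by e-mail or IP address. The Python sketch below is purely hypothetical; the Forum class and its method names are illustrative assumptions and are not drawn from any particular forum package.

    # Hypothetical sketch of supervisor (unilateral) moderation: moderators
    # appointed by the webmaster may delete any post and exclude users by
    # e-mail or IP address. Not modeled on any real forum software.

    class Forum:
        def __init__(self):
            self.posts = {}            # post_id -> {"author_email": ..., "text": ...}
            self.moderators = set()    # e-mail addresses with moderator privileges
            self.banned_emails = set()
            self.banned_ips = set()

        def can_moderate(self, user_email: str) -> bool:
            return user_email in self.moderators

        def delete_post(self, moderator_email: str, post_id: int) -> bool:
            # Only appointed moderators may remove other users' contributions.
            if not self.can_moderate(moderator_email):
                return False
            return self.posts.pop(post_id, None) is not None

        def ban_user(self, moderator_email: str, *, email=None, ip=None) -> bool:
            # Exclude a user by e-mail address, IP address, or both.
            if not self.can_moderate(moderator_email):
                return False
            if email:
                self.banned_emails.add(email)
            if ip:
                self.banned_ips.add(ip)
            return True

        def submit_post(self, post_id: int, author_email: str, author_ip: str, text: str) -> bool:
            # Banned users are excluded before their contribution is stored.
            if author_email in self.banned_emails or author_ip in self.banned_ips:
                return False
            self.posts[post_id] = {"author_email": author_email, "text": text}
            return True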

Commercial content moderation (CCM)

Commercial content moderation is a term coined by Sarah T. Roberts to describe the practice of "monitoring and vetting user-generated content (UGC) for social media platforms of all types, in order to ensure that the content complies with legal and regulatory exigencies, site/community guidelines, user agreements, and that it falls within norms of taste and acceptability for that site and its cultural context."[2] While at one time this work may have been done by volunteers within the online community, for commercial websites it is largely achieved by outsourcing the task to specialized companies, often in low-wage areas. Employees do this work by viewing, assessing and deleting disturbing content, and may suffer psychological damage.[3][4][5][6][7][8] Secondary trauma may arise, with symptoms similar to PTSD.[9] Some large companies such as Facebook offer psychological support,[9] but critics claim that it is insufficient.[10][11]

Facebook

Facebook has decided to create an oversight board that will decide what content remains and what content is removed. The idea was proposed in late 2018. This "Supreme Court" at Facebook is intended to replace ad hoc decision-making.[11]

Distributed moderation

Distributed moderation comes in two types: user moderation and spontaneous moderation.

User moderation

User moderation allows any user to moderate any other user's contributions. On a site with a sufficiently large and active user base, this usually works well, since the relatively small number of troublemakers is screened out by the votes of the rest of the community. Strictly speaking, wikis such as Wikipedia are the ultimate in user moderation, but in the context of Internet forums, the definitive example of a user moderation system is Slashdot.

For example, each moderator is given a limited number of "mod points", each of which can be used to moderate an individual comment up or down by one point. Comments thus accumulate a score, which is additionally bounded to the range of −1 to 5 points. When viewing the site, a threshold can be chosen from the same scale, and only posts meeting or exceeding that threshold will be displayed. This system is further refined by the concept of karma: the ratings assigned to a user's previous contributions can bias the initial rating of contributions he or she makes.
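
A minimal Python sketch of such a system is given below. The description above fixes the general rules (a limited budget of mod points, scores clamped to −1 through 5, a reader-chosen display threshold, and karma biasing a comment's initial score); the concrete names (User, Comment, apply_mod, visible_comments) and the exact karma-to-initial-score rule are illustrative assumptions, not Slashdot's actual implementation.

    # Sketch of a Slashdot-style user moderation system (illustrative only).

    from dataclasses import dataclass, field

    SCORE_MIN, SCORE_MAX = -1, 5  # comment scores are clamped to this range

    @dataclass
    class User:
        name: str
        karma: int = 0          # accumulated from how others rated past posts
        mod_points: int = 0     # limited budget of moderation points

    @dataclass
    class Comment:
        author: User
        text: str
        score: int = field(init=False)

        def __post_init__(self):
            # Assumed rule: positive karma biases the initial score upward.
            self.score = 2 if self.author.karma > 0 else 1

    def apply_mod(moderator: User, comment: Comment, delta: int) -> bool:
        """Spend one mod point to move a comment up or down by one point."""
        if moderator.mod_points <= 0 or delta not in (-1, 1):
            return False
        moderator.mod_points -= 1
        comment.score = max(SCORE_MIN, min(SCORE_MAX, comment.score + delta))
        comment.author.karma += delta  # ratings feed back into the author's karma
        return True

    def visible_comments(comments, threshold):
        """Readers pick a threshold; only comments at or above it are shown."""
        return [c for c in comments if c.score >= threshold]

    if __name__ == "__main__":
        alice = User("alice", karma=3)
        bob = User("bob", mod_points=5)
        c = Comment(alice, "Insightful remark")
        apply_mod(bob, c, +1)          # bob spends a point; the comment rises to 3
        print(visible_comments([c], threshold=2))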

On sufficiently specialized websites, user moderation will often lead to groupthink, in which any opinion that is in disagreement with the website's established principles (no matter how sound or well-phrased) will very likely be "modded down" and censored, leading to the perpetuation of the groupthink mentality. This is often confused with trolling.

Spontaneous moderation

Spontaneous moderation is what occurs when no official moderation scheme exists. Without any ability to moderate comments, users will spontaneously moderate their peers by posting their own comments about others' comments. Because spontaneous moderation exists, no system that allows users to submit their own content can ever be completely unmoderated.


References

  1. "Artificial intelligence will create new kinds of work". The Economist. Retrieved 2017-09-02.
  2. "Behind the Screen: Commercial Content Moderation (CCM)". Sarah T. Roberts | The Illusion of Volition. 2012-06-20. Retrieved 2017-02-03.
  3. Stone, Brad (July 18, 2010). "Concern for Those Who Screen the Web for Barbarity". NYTimes.com.
  4. Adrian Chen (23 October 2014). "The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed". WIRED. Archived from the original on 2015-09-13.
  5. "The Internet's Invisible Sin-Eaters". The Awl. Archived from the original on 2015-09-08.
  6. Department of Communications and Public Affairs, Western University (March 19, 2014). "Western News - Professor uncovers the Internet's hidden labour force". Western News.
  7. "Invisible Data Janitors Mop Up Top Websites - Al Jazeera America". aljazeera.com.
  8. "Should Facebook Block Offensive Videos Before They Post?". WIRED. 26 August 2015.
  9. Olivia Solon (2017-05-04). "Facebook is hiring moderators. But is the job too gruesome to handle?". The Guardian. Retrieved 2018-09-13.
  10. Olivia Solon (2017-05-25). "Underpaid and overburdened: the life of a Facebook moderator". The Guardian. Retrieved 2018-09-13.
  11. Gross, Terry. "For Facebook Content Moderators, Traumatizing Material Is A Job Hazard". NPR.org.