This article mentions sensitive topics that may be triggering to some readers, including racism, sexual violence, animal cruelty, drug use, and/or child abuse. Continue at your own discretion.
Content moderation helps a website function properly. The internet is full of dark (and vocal) corners where all kinds of ugly content and language exist. Virtually any website that hinges on user-contributed content has some kind of moderation practice -- and the bigger the website, the more difficult the job.
Take, for example, Facebook, the flagship social media site of Meta Platforms (META). Moderators have to sift daily through every kind of content that may not be suitable for all audiences.
Employees at Meta's third-party content moderation contractor Sama spend hours filtering videos and reading posts containing incredibly sensitive material, including racism, sexual violence, animal cruelty, drug use, child abuse, and more. The work is heavy, and doing it sustainably can require robust work-life balance measures, including self-harm reduction methods. Retaining employees would likely mean providing them with counseling, adequate time off for mental health rest, and a high rate of pay.
Union-Busting Suit Filed Against Sama & Meta
Nairobi-based subcontractor Sama and Meta have been in litigation for nearly a year over egregious labor disputes. According to a suit filed by a former employee, he was fired after trying to form an employee union that would advocate for mental health support for moderators. The suit also alleges that Sama used 'deceptive' means to trick applicants into taking the job, that both Meta and Sama created a workplace that discouraged any kind of negative feedback, and that Sama used Meta's software to track employee productivity.
This week, Sama laid off approximately 3% of its staff, mostly from Nairobi. In a statement, Sama said the decision was a response to the "current economic climate." This comes after the company announced it would be ending its partnership with Meta Platforms.
A Meta spokesperson confirmed the end of the contract in a statement. "We respect Sama’s decision to exit the content review services it provides to social media platforms. We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content."
Neither company has elaborated on what kind of services would be provided during the transition, or for how long.
OpenAI Update Used Meta's Content Monitoring System
Many of the moderators affected by the closure had just finished working on OpenAI's ChatGPT project, an artificial intelligence chatbot meant to mimic human conversation. The research company had relied on Meta's moderation resources to remove harmful content from its AI chatbots. Now that Meta and Sama have parted ways, moderation for future updates will have to be outsourced elsewhere -- and Meta may choose another partner just as likely to land it in hot water.
OpenAI has grown its technology thanks, in part, to some very high-profile investors, including Fidelity Investments, Andreessen Horowitz, and most recently Microsoft (MSFT). Other well-known investors include PayPal (PYPL) co-founder Peter Thiel and Tesla (TSLA) and Twitter mogul Elon Musk, who donated a whopping $10 million to the project.