Kenya court order puts Meta content moderation services in uncertainty


The Kenyan High Court has issued an order preventing Meta, the parent company of Facebook, from engaging third-party content moderators to review any content on the platform. The order extends to Majorel, Meta’s new content moderation partner.

According to a TechCrunch report, the Kenyan court has directed Meta’s former content moderation partner, Sama, to continue the contract until the case before the court is determined.

This case began last year when Meta was sued for $2 billion in Kenya’s High Court for allegedly encouraging hate speech, inciting ethnic conflict, and failing to moderate content in Eastern and Southern Africa. Meta and its subcontractor Sama have faced many legal challenges over their content moderation policies, workplace culture, and neglect of employees’ mental health.

However, Sama said earlier this year that it would discontinue its services for the tech giant and close its regional headquarters by March 31st. While interim court orders barred it from changing its employment terms or conducting layoffs, TechCrunch reported that it had already placed its content moderators on paid leave beginning April 1st.

This came after the firm shut down its content moderation division to focus on labelling work (computer vision data annotation), and after the court barred Sama from firing more than 200 moderators at its Kenyan hub.

With Meta’s content review process in Africa stalled, it is unclear who is currently helping the tech giant curb content that incites hate, misinformation, and violence on its platforms, particularly in African nations. It is, however, important to note that the corporation relies heavily on third-party content moderators.


Meta’s content moderation partners protest against judicial restrictions

This suspension of Facebook’s content moderation services leaves the platform vulnerable and exposes the safety and security of its users. It is a significant setback in the fight against harmful content on social media, particularly in Africa.

It also restricts both content moderation providers, Sama and Majorel, and endangers their revenue growth, especially as Q2 of 2023 has only just begun and it is unclear how long the court case will drag on.

Citing a legal document it had seen, TechCrunch reported that Majorel is protesting the court orders prohibiting it from providing content review services to Meta, arguing that they endanger both its business continuity and the livelihoods of the 200 moderators it hired after establishing a hub in Kenya late last year.

Sama, on the other hand, whose contract was supposed to expire on March 31st, will incur significant wage costs by keeping the moderators on with no work for as long as the court case continues, which will also affect the company’s financial operations and annual budget plans.

Complying with local laws and regulations

Photo Credit: Capacity Media

Africa faces a large and growing concern about the spread of misinformation and hate speech on social media platforms, partly because of the diversity of languages in the region, which makes proper regulation especially important.

Still, it is important that international companies, including social media platforms, abide by local laws and ordinances. The recent court ruling in Kenya is a reminder of the difficulties social media platforms face when navigating countries’ varied legal and regulatory frameworks.

While social media platforms must abide by local laws and ordinances, it is also crucial that these laws and ordinances do not infringe on users’ and content moderators’ fundamental rights.

Large global corporations like Facebook and other social media platforms must take full responsibility for the content that appears on their platforms and ensure that objectionable content is removed as quickly as possible.

However, the platforms cannot shoulder this responsibility alone. Governments and regulatory authorities must also play a part in holding social media networks accountable for the content on their platforms, while ensuring that their laws and regulations respect the fundamental rights of users and content moderators.


Technext Newsletter

Get the best of Africa’s daily tech to your inbox – first thing every morning.
Join the community now!

