Port Louis
Friday, April 26, 2024


Rights Groups Urge Meta To Improve Content Moderation In Africa


Rights groups have urged Meta Platforms Inc to improve its content moderation in Africa after its main third-party contractor on the continent announced it would no longer screen harmful posts for the social media giant.

Meta’s Kenya-based subcontractor Sama said on January 10 that it was shutting down its content moderation services for Meta’s platforms — Facebook, WhatsApp, Instagram, and Watch — in order to focus on restructuring its operations.

Sama also said it would lay off 3% of its staff, about 200 employees, to streamline operations and boost efficiency. It will, however, continue providing data labeling services to Meta.

The decision comes at a time when both Sama and Meta face legal action in Kenya over alleged labor abuses and the suppression of workers’ unions.

Meta also faces a separate lawsuit accusing it of allowing violent posts on Facebook that incited civil conflict in neighboring Ethiopia. Both companies have defended their records.

Digital rights activists said the steps Meta has taken to curb harmful content in African countries fall far short of those in wealthier nations, and urged the company to drastically improve its moderation processes.

“With the exit of Sama, now would be a good chance for Meta to put things right and ensure better labor conditions for African moderators in the region,” said Bridget Andere, Africa policy analyst at Access Now, a non-profit organization.

She added, “Meta should increase the number of moderators for the region to adequately cover local languages and dialects, and also be more transparent about their algorithms which are promoting harmful content.”

Meta said Sama’s withdrawal would have no impact on its operations, but did not disclose whether it had found a new third-party contractor in East Africa.

“We respect Sama’s decision to exit the content review services it provides to social media platforms,” said a Meta spokesperson.

“We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content,” the spokesperson added. 

Meta’s legal troubles

Meta’s legal troubles in East Africa began in May last year, when former moderator Daniel Motaung filed a lawsuit over poor working conditions in Kenya.

The petition, which also names Sama, alleges that workers moderating Facebook posts faced irregular pay, inadequate mental health support, anti-union activity, and violations of their privacy and dignity.

Sama has denied the allegations. Meta, for its part, said it requires “its partners to provide industry-leading pay, benefits and support to workers”.

A ruling on whether a Kenyan court can hear the complaint is expected on February 6. 

The Ethiopian lawsuit was filed last month by two Ethiopian researchers and Kenya’s Katiba Institute rights group. They argued that Facebook’s recommendation systems amplified violent and hateful posts in Ethiopia, including some posted before the father of one of the researchers was killed.

The complainants are asking Meta to take emergency steps to demote violent content, hire more moderation staff in Nairobi, and establish restitution funds of about $2 billion for global victims of violence incited on Facebook.

Meta said it has strict rules outlining what is and is not allowed on Facebook and Instagram.

“Hate speech and incitement to violence are against these rules and we invest heavily in teams and technology to help us find and remove this content,” the Meta spokesperson said.

“Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions.”

Globally, thousands of moderators review social media posts that can include violent, lewd, racist, or otherwise inappropriate content. Many of them work for third-party contractors rather than directly for the tech companies.

The working conditions of content moderators, and Meta’s efforts to stem hate speech and violent content, have drawn criticism before.

In July 2021, a California judge approved an $85 million settlement between Facebook and more than 10,000 moderators, after the company was accused of failing to protect its employees from psychological harm caused by exposure to violent and graphic content.

Sama will pay all moderators 15 days of wages for each year worked with the contractor, including those exposed to the worst content on Facebook, such as graphic violence, suicide, and child sexual abuse material.

The moderators will be repatriated from Nairobi to their home countries after March 31. The payout is far smaller than what Facebook offered the 11,000 permanent employees it laid off in 2022.

Those US-based employees received 16 weeks of salary, plus an additional two weeks for each year of service. The moderators were informed of the details of their dismissal at a meeting on Wednesday of last week.
