Meta Sued For US$1.6bn For Promoting Ethiopia’s Ethnic Violence

Amnesty International said today that Meta must reform its business practices to ensure that Facebook’s algorithms do not amplify hatred and fuel ethnic violence, following the filing of a landmark legal action against Meta in Kenya’s High Court.

According to the lawsuit, Meta promoted speech that led to ethnic violence and killings in Ethiopia by using an algorithm that prioritizes and amplifies hateful and violent content on Facebook.

The petitioners want Facebook’s algorithms to stop recommending such content to Facebook users, and they want Meta to establish a victims’ fund of 200 billion Kenyan shillings (US$1.6 billion). Amnesty International is one of six human rights and legal organizations involved in the lawsuit.

“The spread of dangerous content on Facebook lies at the heart of Meta’s pursuit of profit, as its systems are designed to keep people engaged. This legal action is a significant step in holding Meta to account for its harmful business model,” said Flavia Mwangovya, Amnesty International’s Deputy Regional Director of East Africa, Horn, and Great Lakes Region.

One of Amnesty’s staff members in the region was targeted as a result of posts on the social media platform.

“In Ethiopia, the people rely on social media for news and information. Because of the hate and disinformation on Facebook, human rights defenders have also become targets of threats and vitriol. I saw first-hand how the dynamics on Facebook harmed my own human rights work and hope this case will redress the imbalance,” said Fisseha Tekle, legal advisor at Amnesty International.

Fisseha Tekle is one of the petitioners, having brought the case after being subjected to a stream of hateful Facebook posts over his work exposing human rights violations in Ethiopia. An Ethiopian national now living in Kenya, he fears for his life and dares not return to Ethiopia to see his family because of the abuse he has received on Facebook.

The case has also been brought by Abraham Meareg, the son of Meareg Amare, a professor at Bahir Dar University in northern Ethiopia who was tracked down and killed in November 2021, weeks after posts inciting hatred and violence against him spread on Facebook. According to the lawsuit, Facebook did not remove the hateful posts until more than eight days after Professor Meareg’s death, three weeks after his family had first alerted the company.

Abraham Meareg has told the Court that he fears for his safety and is seeking asylum in the United States. His mother, who fled to Addis Ababa, is deeply traumatized and cries in her sleep every night after witnessing her husband’s murder. Regional police raided the family’s home in Bahir Dar.

The hateful posts targeting Meareg Amare and Fisseha Tekle were not isolated incidents. According to the lawsuit, Facebook is awash with hateful, inciting, and dangerous posts in the context of the war in Ethiopia.

Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendations, and groups features, shaping what is seen on the platform. Meta makes its money by selling targeted advertising, so its revenues grow the longer users stay on the platform.

Displaying inflammatory content, including content that advocates hatred and constitutes incitement to violence, hostility, and discrimination, is an effective way of keeping users on the platform longer. The promotion and amplification of this type of content is therefore central to Facebook’s surveillance-based business model.
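
To make the dynamic concrete, here is a minimal Python sketch of engagement-based ranking. Everything in it is hypothetical: the names, weights, and data are invented for illustration and do not represent Meta’s actual systems. The point is only that an objective optimized purely for engagement contains no term distinguishing inflammatory content from any other highly engaging content.

```python
# Hypothetical sketch of engagement-based ranking.
# All names, weights, and data are illustrative only;
# they do not reflect Meta's actual systems.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    """Score a post by engagement: interactions that keep users on
    the platform longer (comments, shares) are weighted more heavily
    than passive ones (likes)."""
    return 1.0 * post.likes + 5.0 * post.comments + 10.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed by engagement score alone. Nothing in this
    objective penalizes outrage-driven engagement."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("calm-news", likes=120, comments=4, shares=2),
        Post("inflammatory", likes=80, comments=60, shares=45),
    ]
    for post in rank_feed(feed):
        print(post.post_id, engagement_score(post))
```

In this toy example, the "inflammatory" post outranks the calm one simply because it provokes more comments and shares, which is the incentive structure the petitioners describe.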

Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

In September 2022, Amnesty International showed how Meta’s algorithms proactively amplified and promoted content that incited violence, hatred, and discrimination against the Rohingya in Myanmar, substantially increasing the risk of an outbreak of mass violence.

“From Ethiopia to Myanmar, Meta knew or should have known that its algorithmic systems were fuelling the spread of harmful content leading to serious real-world harms,” said Flavia Mwangovya.

“Meta has shown itself incapable of acting to stem this tsunami of hate. Governments need to step up and enforce effective legislation to rein in the surveillance-based business models of tech companies.”

The lawsuit also alleges that Meta’s approach to crisis situations in Africa differs from its approach elsewhere in the world, notably in North America.

During a crisis, the company can make special adjustments to its algorithms to quickly remove inflammatory content. But although these measures have been deployed elsewhere in the world, the petitioners say none of them were applied during the conflict in Ethiopia, allowing harmful content to continue to spread.

Internal Meta documents disclosed by whistleblower Frances Haugen, known as the Facebook Papers, showed that the US$300 billion company lacked sufficient content moderators who speak local languages. A study by Meta’s Oversight Board likewise found that Meta has not devoted enough resources to moderating content in languages other than English.

“Meta has failed to adequately invest in content moderation in the Global South, meaning that the spread of hate, violence and discrimination disproportionally impacts the most marginalized and oppressed communities across the world, and particularly in the Global South.”
