A new lawsuit accuses Meta of inflaming the civil war in Ethiopia


“I hold Facebook personally responsible for my father’s murder,” Abrham says.

Today, Abrham, together with Amnesty International researcher Fisseha Tekle, filed a lawsuit against Meta in Kenya, alleging that the company allowed hate speech to run rampant on the platform, leading to widespread violence. The suit demands that the company de-prioritize hateful content in the platform’s algorithm and expand its content moderation staff.

“Facebook can no longer be allowed to prioritize profit over our communities. Like radio in Rwanda, Facebook stoked the flames of war in Ethiopia,” says Rosa Curling, director of Foxglove, a UK-based non-profit that addresses human rights abuses by global tech giants. The organization supports the petition. “The company has clear tools in place: adapt their algorithms to degrade viral hate, hire more local staff and make sure they are well paid and their work is safe and fair, to prevent this from continuing.”

Since 2020, Ethiopia has been embroiled in civil war. Prime Minister Abiy Ahmed responded to attacks on federal military bases by sending troops to Tigray, a region in the country’s north that borders Eritrea. An April report released by Amnesty International and Human Rights Watch found substantial evidence of crimes against humanity and a campaign of ethnic cleansing against ethnic Tigrayans by Ethiopian government forces.

Fisseha Tekle, Amnesty International’s lead researcher on Ethiopia, further implicated Facebook in spreading offensive content that, according to the petition, endangered the lives of his family. Since 2021, Amnesty and Tekle have drawn widespread rebuke from supporters of Ethiopia’s campaign in Tigray, apparently for failing to lay the blame for wartime atrocities solely on Tigrayan separatists. Indeed, Tekle’s research into the conflict’s many crimes against humanity has pointed the finger at warring parties on all sides, finding both the separatists and the Ethiopian federal government guilty of systematic murders and rapes of civilians. Tekle told reporters during an October press conference, “There is no innocent party that hasn’t committed human rights violations in this conflict.”

In a Foxglove statement shared with WIRED, Tekle spoke of witnessing “firsthand” Facebook’s alleged role in obstructing research aimed at exposing government-sponsored massacres, describing social media platforms that perpetuate hate and disinformation as corrosive to the work of human rights defenders.

Facebook, which is used by more than 6 million people in Ethiopia, has been a key avenue through which narratives targeting and dehumanizing Tigrayans have spread. In a July 2021 Facebook post that remains on the platform, Prime Minister Ahmed called the Tigrayan rebels “weeds” that need to be pulled. Leaked Facebook documents, meanwhile, revealed that the company lacked the capacity to properly moderate content in most of the country’s more than 45 languages.

Leaked documents shared by Facebook whistleblower Frances Haugen show that parent company Meta’s leadership remained well aware of the platform’s potential to exacerbate political and ethnic violence during the Tigrayan war, with Ethiopia at times receiving special attention from the company’s core risk and response team. By 2021 at the latest, the documents show, the conflict in the country had raised enough alarms to warrant the launch of a war room-like operation known as IPOC, a process Facebook created in 2018 to respond quickly to political “moments of crisis.”

Compared to its usual content moderation processes, IPOC is viewed internally as a scalpel, deployed not only to anticipate emerging threats but also to assess cases of “overwhelming abuse” spurred by political flashpoints. This includes the use of so-called “break the glass” measures: dozens of “levers” IPOC teams can deploy during exceptionally incendiary events to quell spikes in hate speech on the platform. In the US, for example, these included the November 2020 election and the subsequent attack on the US Capitol.

In testimony before the US Senate last fall, Haugen compared the violence in Ethiopia to the genocide of more than 25,000 Rohingya Muslims in Myanmar, atrocities for which Facebook has been internationally condemned, by the United Nations Human Rights Council among others, for its instigating role. “What we saw in Myanmar and what we’re seeing now in Ethiopia are just the beginning chapters of a story so terrifying that nobody wants to read the end of it,” Haugen, a former Facebook product manager, told lawmakers.

As of December 2020, Meta did not have hate speech classifiers for Oromo and Amharic, two of the major languages spoken in Ethiopia. To compensate for understaffing and the absence of classifiers, the Meta team looked for other proxies that would allow them to identify dangerous content, a method known as network-based moderation. But the team struggled because, for reasons not immediately clear, Ethiopian users were far less likely to take the actions Facebook has long used to help detect hate speech, such as responding to posts with the “angry” reaction. An internal proposal suggested abandoning this model entirely, replacing it with one that gave more weight to other “negative” actions, such as users disliking pages or hiding posts. It is not clear from the documents whether the proposal was adopted.

In its 2021 roadmap, Meta designated Ethiopia as a country at “severe” risk of violence and, in an assessment of the company’s response to incitement and violent content, rated its capacity in Ethiopia as 0 out of 3. In another document, a Meta staffer acknowledged that the company lacked “human review capability” for Ethiopia in the run-up to the country’s elections.

The petitioners asked the High Court to issue a declaration holding Meta responsible for violating a list of fundamental rights guaranteed by Kenya’s 2010 Constitution: the right to freedom of expression and association; the right not to be subjected to violence or to have information about one’s family or private affairs unnecessarily disclosed; and the right to equal protection under the law, among others. In addition, the petitioners asked the court to order the establishment of a victim fund of more than $2 billion, with the court itself disbursing the funds on a case-by-case basis. Finally, they asked the court to compel Facebook to stop promoting inciting, hateful, and dangerous content through its algorithms and to demote such content wherever it is found, as well as to launch a new crisis mitigation protocol, “qualitatively equivalent to those implemented in the US,” for Kenya and all other countries whose content Meta moderates from Nairobi.

“Kenya is the content moderation hub for Oromo, Tigrinya and Amharic posts. These are the only three Ethiopian languages, out of the 85 spoken in the country, that current Facebook content moderators can even attempt to cover,” Curling says. “There are currently 25 Facebook content moderators working on Ethiopia-related content for a country of 117 million people. The decisions of these people, forced to work in appalling and unfair conditions, about what posts are removed and what remains online are made in Kenya, and it is the Kenyan courts, therefore, that must hear both men’s legal challenge.”

Meta spokeswoman Sally Aldous told WIRED that hate speech and incitement to violence are against company policies. “Our security and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions,” she says. “We hire people with local knowledge and expertise and continue to build our capabilities to detect infringing content in the country’s most widely spoken languages, including Amharic, Oromo, Somali and Tigrinya.”

Aldous did not address questions about whether the company employs more than 25 moderators focused on Ethiopia, or whether it maintains an IPOC team focused on the conflict in Tigray beyond the country’s election cycles.

Meanwhile, following the death of his father, Abrham and his family were forced to flee their home. He is currently awaiting the outcome of his asylum application in the United States.

“Every dream we had has collapsed,” he says.
