Meta faces £1.8bn lawsuit over claims it inflamed violence in Ethiopia

Son of murdered academic calls on Facebook owner to ‘radically change how it moderates dangerous content’

Meta faces a $2.4bn (£1.8bn) lawsuit accusing the Facebook owner of inflaming violence in Ethiopia after the Kenyan high court said a legal case against the US tech group could go ahead.

The case, brought by two Ethiopian nationals, calls on Facebook to alter its algorithm to stop promoting hateful material and incitement to violence, and to hire more content moderators in Africa. The claimants are also seeking a $2.4bn “restitution fund” for victims of hate and violence incited on Facebook.

One of the claimants is the son of Prof Meareg Amare Abrha, who was murdered at his home in Ethiopia after his address and threatening posts were published on Facebook in 2021 during a civil war in the country. Another claimant is Fisseha Tekle, a former researcher at Amnesty International who published reports on violence committed during the conflict in Tigray in northern Ethiopia and received death threats on Facebook.

Meta has argued that courts in Kenya, where Facebook’s Ethiopia moderators were based at the time, did not have jurisdiction over the case. The Kenyan high court in Nairobi ruled on Thursday that the case fell within the jurisdiction of the country’s courts.

Abrham Meareg, the son of Meareg, said: “I am grateful for the court’s decision today. It is disgraceful that Meta would argue that they should not be subject to the rule of law in Kenya. African lives matter.”

Tekle said he could not return home to Ethiopia because of Meta’s failure to make Facebook safe. “Meta cannot undo the damage it has done, but it can radically change how it moderates dangerous content across all its platforms to make sure no one else has to go through what I have,” he said. “I look forward to this matter now being heard by the court in full.”

In 2022 an analysis by the Bureau of Investigative Journalism and the Observer found that Facebook was letting users post content inciting violence through hate and misinformation, despite being aware that it was fuelling tensions in Tigray.

Meta rejected the claims at the time, saying it had “invested in safety and security measures” to tackle hate and inflammatory language along with “aggressive steps to stop the spread of misinformation” in Ethiopia.

In January the company said it was removing factcheckers and “dramatically” reducing the amount of censorship on its platforms, although it would continue to tackle illegal and high-severity violations.

Meta said it did not comment on ongoing legal matters.