As the company pursues growth on the continent, it stands accused of not putting enough money into moderating content
Facebook has been accused of failing to invest sufficiently to combat misinformation as it pursues rapid growth in Africa, where the Covid pandemic has highlighted the outsize role played by social media in online discourse.
Traditional media and governments have an increasingly limited ability to control information flows on the continent, as social media platforms including Facebook seek to expand rapidly, though largely without fanfare.
“Facebook are losing users left, right and centre in the global north, so where are the new users coming from? The global south,” said Anri van der Spuy, a senior researcher at Research ICT Africa, a thinktank.
Sub-Saharan Africa has a population of 1.1 billion, and internet use, averaging about 30%, is three times higher than a decade ago.
Toussaint Nothias, research director at the Digital Civil Society Lab of Stanford University, who has worked extensively on Facebook, said it was “generally accepted” that Facebook had launched an “aggressive expansion” in the global south to win new users following a decline in the developed world.
“Africa has a young growing population and so offers opportunities for Facebook to become an entry to the internet, via Facebook, WhatsApp, Instagram or whatever. That can be monetised down the line,” he said.
Many – but not all – academic studies have linked Covid vaccine hesitancy with misinformation circulating on social media in Africa, as elsewhere.
In some parts of the continent, such as South Africa, hesitancy was the biggest challenge facing vaccination campaigns.
Dr Matshidiso Moeti, WHO regional director for Africa, has talked of an “infodemic”, which she defines as “a glut of information with misinformation in the mix [which] makes it hard to know what is right and real”.
False information circulating on social media included claims that black people cannot contract Covid-19 or that it can be cured with steam or traditional remedies such as herbal tea. Conspiracy theories describing plots by western companies or governments to test vaccines in Africa or slow demographic growth have also spread widely.
“The regulation side is very problematic,” said van der Spuy. “It has not been resolved in the global north either but the risks are much bigger in the south … you don’t have the same safety net of literacy skills and ability to cross-check nor the safeguard of adequate policies or capable institutions … Facebook is investing in addressing some of these challenges, but not nearly enough.”
Facebook relies on an expanding network of hundreds of third-party factcheckers across Africa to initiate investigations and respond to complaints from users. If concerns are found to be justified, warnings are attached to posts, which are also downgraded in the algorithms that direct traffic. Some accounts are taken down.
A spokesperson for Meta, which owns Facebook, described misinformation as a complex and constantly evolving societal challenge for which there is no “silver bullet”.
But, they said, Facebook now employed a global team of 40,000 working on safety and security, including 15,000 people who review content in more than 70 languages – including Amharic, Somali, Swahili and Hausa, among others.
This helped the company “debunk false claims in local languages, including claims related to elections and vaccines”.
“We’ve also made changes to our policies and products to ensure fewer people see false information and are made aware of it when they do, and have been highlighting reliable vaccine information through our global Covid-19 information centre,” the spokesperson said.
However, posts are not usually removed unless seen as directly encouraging violence or hate, leading to concerns that some may be viewed by large audiences even after being flagged as false or misleading.
“They do take things down occasionally but it takes a long time,” said Stuart Jones, director of the Centre for Analytics and Behavioural Change in South Africa, which monitors social media in the country.
Facebook claims that more than 95% of the time when people see a factchecking label, they don’t go on to view the original content.
Other platforms are also struggling to contain misinformation.
“Social media [in South Africa], especially Twitter, is dominated by anti-vaccine voices,” said Jones.
“We’ve not identified organised networks, but we are dealing with people with very loud voices speaking often and very passionately. The pro-vaccine voices are more moderate, don’t provoke the same outrage and aren’t shared as much. So the algorithms kick in and it just all runs away.”
Frances Haugen, a former manager at Facebook turned whistleblower, has said that her concerns over an apparent lack of safety controls in non-English language markets, such as Africa and the Middle East, were a key factor in her decision to go public.
“I did what I thought was necessary to save the lives of people, especially in the global south, who I think are being endangered by Facebook’s prioritisation of profits over people,” Haugen told the Guardian last year.
Workers at factchecking organisations across Africa who spoke to the Guardian on condition of anonymity said they were confident their work made some difference but worried that the impact was very limited.
“What we do is important and does stop some people reading stuff that simply isn’t true. But I worry that it really is just a tiny fraction of what’s out there,” one said.
Some say it is difficult to judge to what extent Facebook’s downgrading of such posts in news feeds restricts exposure and worry that the company has not released a breakdown of figures for funding of factchecking operations in Africa.
“There seems to be as little as possible real investment on the continent in terms of engaging people directly or hiring people with real local knowledge,” said Grace Mutung’u, a policy researcher and lawyer based in Nairobi, Kenya.
“It is a matter of accountability. If you take up such a huge responsibility in society, you should equally invest in solving the problems that come out of it. They have the resources, what is lacking is the willpower.”
Officials at the WHO say they are concerned about encrypted private applications such as WhatsApp, which remain “invisible”, as it is impossible to know what is being said or shared, and very difficult to intervene to stem the flow of false information.
WhatsApp is also owned by Meta. The company said it was taking steps to address the problem.
Nothias said that there was no easy or obvious solution to the problem of content moderation, but “simple things” such as committing greater resources would help.
“Currently, in comparison to the wealth of the company and its social responsibility … it is pretty minimal,” he said.
“They are just not taking it seriously enough, or putting enough money into it. When you consider it really just is a question of their social responsibility against their duty to their investors, it’s not so hard to understand. They are just a corporation.”