If he were Afghan, Somali or Congolese, Mark Zuckerberg is what western media would call “a warlord.” After all, the Frankenstein monster that he created played no small role in a genocide in Myanmar in which tens of thousands have died. Zuckerberg is a white American and so he is called an entrepreneur par excellence. However, what Frances Haugen, a former Facebook Product Manager, said when she testified before a sub-committee of the US Senate last Tuesday left no doubt about what the man ranked the fifth richest person in the world actually is.
“In places like Ethiopia, it is literally fanning ethnic violence,” Haugen told the Sub-Committee on Consumer Protection, Product Safety, and Data Security.
The “it” refers to engagement-based ranking, a Facebook algorithm that ranks content on the basis of how much engagement it attracts from subscribers. Under this algorithm, the controversial rap song that, in transmuted form, refers to the Botswana government leadership as “dicks” is still ranked higher than Covid-19 PSAs from the Ministry of Health and Wellness. As a result, the former will pop up in the feeds of many more subscribers than the latter, and on a more regular basis. That is because engagement-based ranking prioritises provocative content that spreads misinformation and incites division, hate speech and violence.
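The mechanics Haugen describes can be reduced to a simple idea: rank by reaction, never by substance. The sketch below is a minimal illustration of that idea only; the field names, weights and example numbers are assumptions for the purpose of illustration, and Facebook's actual ranking system is vastly more complex and not public.

```python
# Minimal sketch of engagement-based ranking. All names, weights and
# figures are illustrative assumptions, not Facebook's actual system.

def engagement_score(post):
    """Score a post purely by how much interaction it attracts."""
    return (post["likes"]
            + 2 * post["comments"]   # assumed weight: comments count double
            + 3 * post["shares"])    # assumed weight: shares count triple

def rank_feed(posts):
    """Order a feed by engagement, highest first. Note that the
    content of a post is never inspected - only the reaction to it."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"name": "ministry_covid_psa",   "likes": 40,  "comments": 5,   "shares": 10},
    {"name": "provocative_rap_song", "likes": 900, "comments": 400, "shares": 250},
])
print([p["name"] for p in feed])
# → ['provocative_rap_song', 'ministry_covid_psa']
```

Because the scoring function sees only reactions, the provocative post always outranks the public-health one whenever it provokes more of them — which is precisely the dynamic the column describes.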
Ordinarily, US Senate public hearings shouldn’t concern us but what Haugen had to say should be of interest to everyone in Botswana – even those who are not on Facebook. An engineer and data scientist by training, Haugen is a specialist in algorithmic product management and, before Facebook, worked on ranking algorithms at Google, Pinterest and Yelp. While at Facebook, she played no small part in ensuring that the “dicks” song would get more engagement than public health PSAs. Odious as it is, the song pales in comparison to what the Facebook algorithm in question does with regard to bigger, more profound issues: national security and social cohesion.
At the turn of the 21st century, during President Festus Mogae’s first term, parliament debated a motion to introduce statutory provisions for the establishment of community radio stations. In a rare show of bipartisanship, MPs expressed apprehension that such stations could be used to fan the flames of ethnic divisions and hasten a Rwanda-like genocide – whose memory was still fresh in the minds of most. In opposing the motion, then Gaborone West MP, Robert Molefhabangwe, cited the example of Radio Télévision Libre des Mille Collines (RTLM), a Rwandan radio station which played a significant role in inciting the genocide that took place from April to July 1994. Top of the culprit list of RTLM announcers was Kantano Habimana, who called for “those who have guns [to] immediately go to these cockroaches [and] encircle them and kill them.” By “cockroaches”, Habimana, who was Hutu, was referring to the Tutsi.
Two decades later, Botswana has still not made any law for the establishment of community radio stations, but there has been an ironic development. Today, one can very easily establish a web radio station that can function as a community radio station. Thankfully, the danger that the latter type of station can pose to national security appears to have been overstated, because there has been no incident that should cause grave concern.
On the other hand, Facebook is becoming a national security threat in Botswana. Officially, town or village Facebook pages are platforms for exchanging information on business, culture and social issues. In reality though, some are transformed into vectors of hate speech and tribal division. In less than two months, one southern page has carried two posts attacking a particular northern tribe. For both posts, engagement, as judged by the number of likes and comments, was extremely high. In at least two other pages, those who don’t know the regional languages of the people those pages target, but want to participate in group discussions, are digitally shooed away simply because they are cultural outsiders. Imagine how someone who was expelled in this manner might want to treat speakers of those languages living in his or her own village, where the latter are themselves cultural outsiders. In a similar category is a “tribal-boyfriend” list from someone, possibly a woman, who certainly has to have been around. The list names Botswana tribes and alleged generic traits of boyfriends from each. On the face of it, it is innocent fun, but there is nothing funny about perpetuating tribal stereotypes.
Engagement with the aforementioned content is typically extremely high, which leads to Facebook’s engagement-based ranking algorithm automatically prioritising it. The result: the spread of misinformation, division and hate speech.
There has emerged a slew of Botswana-based Facebook publications which obviously don’t have as many layers of editorial gate-keeping as traditional media news operations. They typically carry one side of the story, angle practically all their stories provocatively and don’t moderate their comment boards. Some administrators prowl the wards of villages collecting what Batswana call tshele (snarky remarks) and publishing it as national news. This type of journalism – such as it is – pits neighbour against neighbour, relative against relative, teachers against students, residents against leaders. When such content is published, the Facebook algorithm that Haugen complains about ensures that this news gets a high ranking. When this happens, the fabric that holds family and society together frays.
Then there is the lone “influencer” who acts individually and not as part of a rudimentary news operation. In order to “influence”, you basically have to make outrageous and provocative statements that will drive up traffic to your website. Not too long ago and without mentioning names, the Botswana Police Service communicated to the mainstream media its fear that some Facebook influencers were hell-bent on destabilising the country.
Like other Third World countries, Botswana is particularly susceptible to the danger that Facebook poses because none of its indigenous languages, not even Setswana – which is the national language – is supported by artificial intelligence tools (“integrity systems”) that detect and take down hate speech. Giving Ethiopia as an example, Haugen said that while the East African country has 100 million people and six languages (the correct figure is actually 92), “Facebook only supports two of those languages for integrity systems.” Research by the Massachusetts Institute of Technology also found that “communities that speak languages not prioritised by Silicon Valley suffer the most hostile digital environments.”
Not supporting a language for integrity systems means that Facebook’s AI tools can’t detect and take down offensive content. Within a more precise Botswana context, it means that if someone anywhere in the country writes something really offensive about a particular tribe in their mother tongue, Facebook’s integrity systems won’t detect it. Such content would get a lot of likes and attract a lot of commentary and would automatically be prioritised by the engagement-based ranking algorithm. Facebook would actually be doing the exact same thing that the Rwandan radio station did back in 1994 but on a much larger scale.
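The screening gap described above can be pictured as a filter that only exists for languages it was built for. The sketch below is a deliberately crude illustration under stated assumptions: the supported-language set, keyword list and function names are invented for this example, and real integrity systems use large machine-learning classifiers rather than keyword matching.

```python
# Illustrative sketch of the "integrity system" language gap.
# The language set and keyword list are invented assumptions;
# real systems are ML classifiers, not keyword filters.

# Assumed: a hate-speech model exists for English ("en") only.
SUPPORTED_MODELS = {"en": {"cockroaches"}}

def should_take_down(text, lang):
    """Return True if the screening system flags the post."""
    model = SUPPORTED_MODELS.get(lang)
    if model is None:
        # No model for this language: the post passes entirely unchecked.
        return False
    return any(term in text.lower() for term in model)

print(should_take_down("they are cockroaches", "en"))  # → True
print(should_take_down("<the same slur, written in Setswana>", "tn"))  # → False
```

The second call returns False not because the content is acceptable but because no model exists for the language code `tn` (Setswana), so the engagement-based ranking algorithm is free to amplify it — which is the scenario the column warns about.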
Botswana is susceptible in another dimension. Haugen told the Committee that “Facebook is the Internet for lots of the world. If you go to Africa, the Internet is Facebook.” As a result, what is on Facebook represents all the knowledge that its habitués are exposed to. That knowledge is a toxic mix that incites misinformation, division and hate speech. In Ethiopia and Myanmar, that toxic mix has led to ongoing ethnic violence and there is no basis to suppose that in the long term, Botswana will be exempted from such repercussions.
Facebook and other Silicon Valley tech companies do something even more sinister, which some have called “brain hacking.” These companies use techniques that engineer addiction in users, prompting a US comedian to refer to Zuckerberg & Co as “little more than drug dealers.” One company, called Dopamine Labs, which was founded by a neuro-psychologist and a neuro-economist, has been very clear about the fact that it is intentionally exploiting people’s psychological vulnerabilities to keep them addicted. You don’t have to imagine what that is doing to labour productivity in Botswana because you can actually see it with your own eyes.
Before Haugen, a former Facebook executive called Chamath Palihapitiya told an audience at Stanford University’s Graduate School of Business that Facebook is having a harmful effect on society.
“The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse. No cooperation. Misinformation. Mistruth,” he said.