The 21st century colonisation of Africa does not involve armies, argues cognitive scientist Abeba Birhane, but the mass harvesting of valuable data.
What value do you put on all the data gathered about you in any given week, or on any given day? Technologies such as AI, facial recognition software and now contact-tracing apps are increasingly being deployed all over the world.
Meanwhile, data privacy advocates, backed by legislation such as GDPR in Europe, are trying to put at least some power back into the hands of the average person.
However, according to Abeba Birhane – currently working towards her PhD in cognitive science at University College Dublin (UCD) – these efforts are not being felt equally around the world.
In particular, the Ethiopian researcher argues that in her native continent of Africa, data harvesting by major companies – largely originating from the western world – has ushered in a new age of imperialism and colonial conquest.
‘The more privileged you are, the more agency you have to decide what you can avoid’
– ABEBA BIRHANE
The overlapping worlds of AI, ethics and data science are something of a passion for Birhane, particularly when it comes to how the technology we use every day shapes us as human beings.
Speaking with Siliconrepublic.com, Birhane said her “gateway” into the area of AI and cognitive science was the fact that ethics becomes “an unavoidable topic” when trying to discuss them.
“At the start of my PhD, I proposed that everybody comes to be who they are through their environment and the digital sphere plays a huge role, whether you’re aware of it or not,” she said.
“But not everybody is impacted equally. The more privileged you are, the more agency you have to decide how and in what way it impacts you and what you can avoid.”
The digital conquest of Africa
This is what led her to study the concept of the digital colonisation of Africa, which she laid out in a piece in Real Life last year. As she explained in the article, there appears to be a growing technology evangelism across the continent, with a willingness to adopt almost any practice or application that flies under the banner of being ‘disruptive’ or having something to do with AI.
“In the race to build the latest hiring app or state-of-the-art mobile banking system, start-ups and companies lose sight of the people behind each data point,” she wrote. “Data is treated as something that is up for grabs, something that uncontestedly belongs to tech companies and governments, completely erasing individuals.”
Speaking now, Birhane said that this mindset comes with the misperception that plenty of data and state-of-the-art algorithms brought by nations considered part of the ‘global north’ will solve all of Africa’s problems.
“To me it’s similar to what the European colonisers did [between the 15th and 20th centuries]. They go into Africa and they make-believe they are there to help people who can’t help themselves and they are the saviours, the solutions to our problems,” she said.
“It’s that frustration at trying to point out that when big tech companies enter Africa, they are really looking after their own businesses. They want to accumulate wealth and they want to bring in their own perspectives, their own way of doing things [that become the] norm. Then Africans have no choice but to follow.”
She’s quick to point out, however, that this is not to dismiss all technology outright. Some good can come not only from the technology these massive companies bring, but also from indigenous individuals and developers.
“At the end of the day, you want to find that balance where the perspective of the local expert is respected and it’s at the centre of data practices or AI technologies, and you want to see it benefit local communities.”
‘It’s just incompatible with human welfare’
One area where AI has had a clear and persistent issue for years is racial bias. Extensive work by Joy Buolamwini and other researchers has helped highlight the fact that facial recognition AI trained predominantly by white men is often unable to read the faces of black people.
Dealing with this issue can be seen as part of an effort to bring diversity to the forefront of the technology that is being designed to play such a crucial role in our daily lives.
For Birhane, however, a more important question to ask is whether our priority should be fixing systemic societal biases, rather than just fixing facial recognition technology.
“If you’re going to use [a] facial recognition system to flag black communities and punish vulnerable people, what point is there in making your face recognition accurate if you’re going to use it to punish people anyway?” she said.
“Tech companies won’t agree because that would mean less financial gain and less power, so that would not sit well. This is why you have to look at the capitalist system as a whole and what comes into existence from within that system. The deeper it goes, it’s just incompatible with human welfare.”
THERE IS NO SUCH THING AS A "CRIMINAL" FACE! This discussion ended with the declaration of phrenology as pseudoscience. The rise of ML tools claiming to identify criminality from faces is no reason to have that discussion & normalize it again.
— Abeba Birhane (@Abebab) May 7, 2020
To trace or not to trace?
One recent technology development that Birhane is adamant she won’t be partaking in is that of contact-tracing apps. As the coronavirus pandemic continues and governments across the world try to find ways of tracking the spread of the virus, technology – and particularly the phones in our pockets – has been put forward as a solution. In Ireland, the Department of Health has tasked Waterford-based NearForm with creating an app based on the API developed jointly by Apple and Google.
According to Birhane, such apps will normalise mass surveillance not only during the pandemic, but long after it has passed.
“If an Irish contact-tracing app comes into effect, I will be throwing my smartphone into the Liffey,” she said, half-jokingly.
“It’s just a way of hiding the real issues, which [are] a strong health system, much more robust physical support and communal support. The app gives you this false sense of security that you have things under control.”
With a year left to go in her PhD, Birhane admitted it has been quite an experience so far since she made the move from Ethiopia to Ireland.
“It was totally a culture shock and I’m still getting used to it, but I have family, friends and a huge community here so it’s been great.”