From climate crisis modelling to medtech, Sarah Robinson says it’s time we find out what the public thinks about the software shaping our lives.
As the home of so many multinational tech companies, Ireland has a unique role in setting the responsibility agenda for software, particularly as the artificial intelligence debate unfolds. While software impacts every aspect of our lives, from social change to climate change, little is known about what the public thinks about software and how it could be developed and implemented more responsibly.
Recently, OpenAI executives issued an open letter stating that: “Mitigating the risk of extinction from AI should be a global priority.” This letter came months after another, signed by more than 1,000 AI experts, calling for an immediate pause to AI research until we can ensure the “effects will be positive and risks manageable”.
As software’s impact on our lives grows, so does the debate about it. The public are central stakeholders in deciding what being responsible for software should mean, so researchers at Lero and University College Cork want to find out what the public thinks.
Software’s role in the climate crisis
For example, do you understand software’s role in the climate crisis? Perhaps you are interested in how software can help us reduce and mitigate climate-related issues or other societal challenges. Many people will be familiar with FoodCloud, an Irish social enterprise that matches businesses that have surplus food with communities that need it. It is just one instance of how software can help us to reduce food waste and, in turn, support the fight against the climate crisis.
On the other hand, if we are to meet our climate goals, the parts of our daily lives supported by software will also need to change. We often don’t see how our daily practices – saving our photos or backing up our messages to the cloud – might be creating harm. Greenpeace suggests that if the cloud were a country, it would be among the biggest polluters in the world and the sixth-largest electricity consumer globally. In Ireland, data centres consume more electricity than all rural homes combined, with an estimated 50-60pc of that data used for streaming services.
Fake news and social change
Software is not just changing our physical world and environment, it is also altering our social worlds, producing new social inequalities while tackling others. Polarising content, mis- and disinformation, and conspiracy theories can spread rapidly, with consequences for society and democracy.
Whistleblower Frances Haugen showed how Facebook’s lack of safety mechanisms contributed to the spread of fake news around the US Capitol insurrection on 6 January 2021. Closer to home, British teenager Molly Russell took her own life after months of being flooded with content that legitimised self-harm and suicide as a response to depression.
The Rohingya, a minority Muslim group in Myanmar, are suing Facebook for amplifying hate speech that the UN says contributed to genocide. Haugen noted that almost 90pc of Facebook’s moderation efforts focus on English, yet most of its users are non-English speakers.
Facial recognition software
Human rights abuses are also happening through AI and facial recognition software. Research by my Lero colleague Dr Abeba Birhane and others found that data used to train AI is contaminated with racist and misogynist language. As AI tools such as ChatGPT become widespread, the use of biased datasets is leading to harm and the further marginalisation of already marginalised groups.
For example, facial recognition software trained on biased data is widely used for law enforcement surveillance, and research suggests it produces false positives, particularly if you are a woman or Black.
In Ireland, the Oireachtas is deliberating on legislation to enable the Gardaí to use body cameras and is examining the introduction of facial recognition to target crime. Facial recognition has also been used by the Department of Social Protection to detect fraud. However, little is known about what the public thinks about this type of software and the data that trains it. We want to find out and share that information.
Breakthroughs and benefits
While there are certainly downsides to software, there is also a wealth of undeniable benefits, many of which we take for granted.
Being responsible about software means building software that enriches lived experience rather than diminishes it.
Software is now ubiquitous in medicine and healthcare and has undoubtedly saved many lives. Another Lero colleague, Prof Derek O’Keefe, is involved in a project on Clare Island offering virtual clinics and digital health monitoring. Meanwhile, Lero’s Prof Conor Ryan applies machine learning techniques to support improved medical diagnosis through semi-automated mammography.
Socially, software can support our relationships, enabling us to talk to and see loved ones all over the world. During the pandemic, it created a lifeline that brought some social connection to our locked-down lives.
While there is a lot in the media about AI and the kind of world it is creating, there needs to be more information about how the public perceives recent advancements in software, and software in general. You can have your say in this short survey.
Sarah Robinson is a senior postdoctoral researcher at Lero, the Science Foundation Ireland research centre for software, and at the School of Applied Psychology at University College Cork.