Irish DPC warns AI companies to be careful when using public data

21 Jul 2023

From left: Ultan O’Carroll and Robert Pisarczyk at the summit. Image: Lucy Nuzum/Oblivious AI

Speaking at the Eyes-Off Data conference, DPC deputy commissioner Ultan O’Carroll said data privacy investigations can ‘ruin organisations overnight’.

A deputy commissioner of the Irish Data Protection Commission (DPC) has warned companies developing AI technologies to be careful when launching their products, especially when the models are trained on public data.

“It might be too late for them to change their product when regulators get involved,” said Ultan O’Carroll, DPC deputy commissioner for technology and operational performance, at a privacy-focused conference in Dublin yesterday (20 July).

“It’s not a closed book at this stage, but I do think there will be more regulation. But at the same time, when it comes to personal data, GDPR is king at the moment.”

O’Carroll was addressing an audience at the inaugural Eyes-Off Data Summit 2023 organised by Irish start-up Oblivious AI and data security platform Antigranular.

Held at Epic, the Irish Emigration Museum in Dublin’s Docklands, the event saw speakers from across the world come together to discuss the rapid rise of AI and how to be responsible with data.

Speaking of incidents reported to the Irish DPC, O’Carroll said that human error is the leading cause of data breaches and that organisations are responsible for getting their own technologies under control.

“50pc of data breaches are based around misaddressing or mislabelling… and most of the time it boils down to governance and control over your technology,” he told the audience, adding that data privacy investigations “can ruin an organisation overnight”.

As per GDPR rules, organisations are required to report personal data breaches to the relevant supervisory authority “where the breach presents a risk to the affected individuals”. Organisations must do this within 72 hours of becoming aware of the breach.

On the recent EU adequacy decision for safe data transfers with the US, O’Carroll said that the move is likely to be challenged because federal law “hasn’t changed” and that national security laws in the US are “more extreme” compared to Europe.

He also warned against ‘privacy-washing’ GDPR certifications from commercial organisations, noting that there are no official GDPR certifications. “We see that and we laugh,” he joked.

Jack Fitzsimons, co-founder of Oblivious AI and host of the summit, said that the debate around data privacy has sparked the interest of many as AI continues to get more advanced.

He told the conference that “the current status quo poses significant challenges” to a range of professionals including data scientists, lawyers, compliance officers, businesses, as well as consumers. The solution to these problems, he argued, is privacy-enhancing technologies, or PETs.

“PETs allow us to unlock the power of the world’s most sensitive data without compromising privacy. We’re trying to build a world where data respects its boundaries, trust is brokered via reliable technologies and privacy is the default, not just an afterthought.”

Oblivious AI was co-founded in 2020 by Fitzsimons and Robert Pisarczyk. Based in Dublin, the start-up builds tools to allow data scientists and machine learning models to work on sensitive data while enforcing confidentiality constraints and brokering trust between businesses.


Vish Gain is a journalist with Silicon Republic
