Lea Kissner on user privacy, encryption and ‘tricky human implications’

22 Nov 2019


Lea Kissner. Image: Humu


Lea Kissner, chief privacy officer at HR software company Humu, reflects on her time as Google’s global privacy lead and how modern security culture needs to change.

Lea Kissner’s career in privacy began when she earned a PhD in computer science, with a focus on cryptography, from Carnegie Mellon University. “My educational background is probably best described as a lot of math and computer science with humanities on the side,” she explained.

Currently, she serves as chief privacy officer at Humu, an HR software company that builds behavioural change technology for use in organisations. Prior to this, however, she spent 12 years working at Google, starting out developing the company’s security infrastructure.

“I designed and built quite a bit of Google’s infrastructure security back in the day, and what we would later call privacy engineering, like working on logs anonymisation.

“Over time, I took more and more of what the other security folks thought were ‘weird’ problems – which often meant the ones with tricky human implications. Eventually, we started calling those fields ‘privacy engineering’ and ‘trust and safety’, the field of fighting abusive behaviour online.”

From these, Kissner built a multi-disciplinary team to work on the privacy of Google’s products, eventually becoming the overall technical lead for privacy at the firm.

‘We have a lot of work to do’

Of course, privacy in the digital age has become an increasingly contentious issue, and it increasingly seems as if it could be impossible to maintain given our reliance on modern technology. Addressing these issues with privacy is complex, but not impossible, according to Kissner.

“I think that we have a lot of work to do,” she said. “I’m not satisfied with any privacy settings set-ups I’ve ever seen, for instance … Some of that is taking what people on the forefront of the field have learned, systematising it and building it into infrastructure. We shouldn’t all have to learn the hard way. We still have a lot of research to do.”

Kissner noted that a major part of this work will be in motivating companies, non-profits and governments to take user privacy seriously, which can be achieved both through legislation and protest.

Lately, many US lawmakers have been turning to the EU’s GDPR as a framework upon which to form laws that may be applied in the US. However, Kissner said that some of the state-level laws in the US, such as the California Consumer Privacy Act (CCPA), “do not apply to non-profits or governments at all”.

‘Installing a backdoor in encryption technologies is like installing a backdoor in a balloon: poking one hole makes the whole thing fail’
– LEA KISSNER

Instead, she would prefer to see something that applies across the entirety of the US, as opposed to legislation for each individual state.

“GDPR-style legislation is the gold standard around the world and we should 100pc have a federal law in the US along those lines,” she said. “Though I’d love a few tweaks.”

“The level of privacy protections in the US should be higher, and uniform legal protections will lead to better protection in practice than a patchwork of laws.

“Speaking as an engineer, contradictory high-stakes requirements are prone to bugs in the system.”

On a personal level, Kissner advised some simple, common-sense measures to make it less likely that your privacy will be breached, such as doing due diligence research into what companies are handling your data.

“I try my level best to know what data I personally find sensitive and put that only in the hands of companies I trust. I secure my accounts by using strong passwords in a password store and two-factor authentication when it’s offered. I keep my software up to date. Hackers are a lot less respectful of my data than anyone else, so I try to keep it out of their hands.”
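Kissner’s advice about strong passwords and a password store can be illustrated with a short sketch. This is not from the interview: it is a minimal example, using Python’s standard-library `secrets` module, of how a password manager might generate the kind of high-entropy random password she describes.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits and punctuation.

    Uses the secrets module, which draws from the OS's cryptographically
    secure random source -- suitable for credentials, unlike random.choice.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

password = generate_password()
print(len(password))  # 20 characters, each independently random
```

In practice a password manager handles this for you; the point of the sketch is only that each character is drawn independently from a secure random source, so the password cannot be guessed from patterns the way human-chosen passwords can.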

The backdoor debate

There has been fervent debate about whether or not major tech firms should be required to install backdoors in messaging services with end-to-end encryption. The conversation reached a crescendo back in October, when the US Department of Justice asked Facebook to halt its grand plan to encrypt messaging across all of its platforms, including Facebook Messenger, Instagram and WhatsApp.

The argument from governments tends to be that law enforcement needs to have backdoor access in order to effectively police crimes such as the dissemination of child abuse images, election interference and international terrorism. Privacy advocates, on the other hand, tend to see this as a major violation of citizen privacy, arguing that the legislative framework already exists to subpoena that kind of information.

Kissner is concerned that, as it currently stands, there’s no way to create a backdoor without collapsing the encryption entirely and compromising all of the communications therein. “Installing a backdoor in encryption technologies is like installing a backdoor in a balloon: poking one hole makes the whole thing fail. That wouldn’t be such a big deal, except that encryption is one of the strongest protections we have in a world filled with hackers.
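Kissner’s balloon analogy can be made concrete with a toy sketch (not from the interview, and deliberately using an insecure XOR “cipher” purely for illustration): in end-to-end encryption, only the conversation partners hold the key, so escrowing a copy of that key for a backdoor means whoever obtains the copy can read every message.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream 'cipher' -- NOT secure, for illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# End-to-end: Alice and Bob share a session key; nobody else holds it.
session_key = secrets.token_bytes(32)
ciphertext = xor_cipher(session_key, b"meet at noon")

# With the key, the message round-trips; without it, the bytes are opaque.
assert xor_cipher(session_key, ciphertext) == b"meet at noon"

# A "backdoor" is a copy of the key (or a master key) held by a third party.
# Whoever obtains that copy -- lawfully or otherwise -- reads every message:
escrowed_key = session_key  # the escrowed copy is the hole in the balloon
assert xor_cipher(escrowed_key, ciphertext) == b"meet at noon"
```

The sketch shows why the hole cannot be made selective: the escrowed key is mathematically identical to the legitimate one, so any attacker who steals it gets exactly the same access as the intended authority.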

“We use encryption to protect our lives and our data from all the smart, malicious people out there who would like to break into everything from your banking credentials to compromising photos – that’s before we even get into the countries which either have hackers of their own or hire companies like the NSO Group to go after their political enemies.”

Kissner pointed out that the US government had previously tried to roll out a form of cryptographic backdoor in the 90s, when the Clipper chip debuted. However, it was, as Kissner put it, “fatally flawed” and “would have let anyone into your private data”, as detailed in a research paper published in 1994 by Matt Blaze. By 1996, the chip was essentially defunct, and the project was effectively shelved.

“Cryptographers have tried ever since. We literally do not understand how to build backdoors without serious risks,” Kissner said. “Nor does law enforcement know how to deal with the problems inherent in the fact that every single government wants access to data and none of them agree on when it’s appropriate.”

The modern approach to data privacy

Kissner said that the modern debate surrounding data privacy has one vital voice missing: that of engineers.

“One of the most important elements that many of these conversations are missing is the people who really understand how the rubber meets the road: engineers and people who study user experience.

“If you want privacy that works, you can’t just throw laws at it. I’ve been sent regulations to comment on that would do things like outlaw every effective form of email spam filtering. That wasn’t the intention, mind you, and they wouldn’t have known that without having engineering eyes in the mix.”


Eva Short is a Journalist at Siliconrepublic.com

editorial@siliconrepublic.com