The Friday Interview: Professor Fred Piper, University of London


18 Mar 2005

After more than 20 years in the information security industry, Professor Fred Piper could be forgiven for having a well-developed sense of paranoia, but in fact he is no hardbitten conspiracy theorist; instead he is a thoughtful and thought-provoking speaker, preferring to raise legitimate questions rather than offer controversial opinions to generate shock headlines.

Nor is he a ranter and raver. He has no axe to grind with ‘big government’ and he genuinely believes the police have better things to do with their time than go through his email inbox, for example. When he states “I have nothing to hide” he does so matter-of-factly rather than defensively. However, he respects the views of those more cynical or suspicious than he is and acknowledges that their concerns are equally valid.

Piper is director of the Information Security Group at Royal Holloway, University of London. He has worked in information security since 1979. He got into the field of cryptography in the late Seventies but “you soon learn that on its own it’s not much use to anyone — it’s a minor part of a major subject”. That led him into the wider realm of information security and now he can look back on more than 20 years advising private firms and governments on codes, security and management.

This experience gives him a perspective on security and technology developments over that time, and he has a keen eye for the trade-offs between the rights that individuals are prepared to give up in order to feel secure and the risk that those rights will be taken away.

This all speaks to the wider e-government project with its oft-touted ‘single view of the citizen’. Although he doesn’t subscribe to the view that governments pursue secret agendas in promoting technology, one of Piper’s firm beliefs is that they should do more to encourage debate around the subjects of security and privacy, especially now that IT is so closely connected to those issues. “It needs a mix of openness, accountability and trust. You’ve got to have a justification for introducing security technology,” he says. Modern IT puts features such as tracking or surveillance within easy reach and there is a case to be made that they can be useful or even necessary. However, these are all emotive subjects: it is that very fact that makes debate all the more necessary, Piper maintains.

He cites the congestion charge introduced in London (which is also an idea floated from time to time as a possible solution to Dublin’s traffic problems). “The people who use their cars very little might assume that road charging is acceptable, but if you have road charges then you need to be able to track where cars are going,” Piper says. “You need a tracking device; tracking is a form of surveillance so there is the potential for anybody to know where your car has been throughout the year. To many people that is undesirable.”

Another public sector initiative with even more obvious implications for privacy would be national identity cards, which have been on the agenda lately as Ireland’s Justice Minister Michael McDowell TD has acknowledged that such a system may need to be introduced here if one is adopted in the UK. It’s yet another issue that operates on two levels, encompassing public trust while underpinned by technology. “As an individual, I have no problem with carrying something that identifies me,” Piper says. “To some people, it’s a horrendous invasion of privacy – but that statement comes down to people’s perceptions.”

He uses a sombre example from history to illustrate the unintended consequences of giving up personal information: in the Thirties, residents of Holland had to specify their religion when registering for medical benefits. This seemed a reasonable request, but when the Nazis invaded a year later they used the medical records to identify Jews in the country. “The value of the information had changed,” Piper elaborates. “When they gave the information it was harmless.”

According to Piper, problems occur because there has been little discussion as to the merits and effects of such an ID card scheme and what uses there would be for the information the cards would contain. “Even politicians can’t decide because nobody’s telling us how they can be used. When you get an electronic online identification scheme, you need to know what it’s for, how it’s going to be used and what are the consequences of that.”

He gives the hypothetical case where ID cards become mandatory. What are the consequences, he asks, if a person is stopped by a policeman and the biometric identifier in the card fails? If all that happens is that the policeman tries another biometric authentication device, then the consequences aren’t all that grave. A worse alternative is that the person may be asked to come to the station because the data on the ID card and the individual don’t match, leading the officer to conclude that the person isn’t who they claim to be. The problem, says Piper, is where the system relies on faulty technology. “Use of biometric data is sensible; biometrics do attempt to recognise the individual, but the technology will not be 100pc accurate. There will be false acceptances and false rejections and those are unavoidable,” he points out.

All of this is not to decry technology out of hand. From his own background Piper has seen the development of cryptography — essentially a scrambling technique for encrypting information — to a point where it is widely used. “If you want to send anything confidential, you encrypt it. There are so many algorithms now; it’s so much better understood than it was 20 years ago. The ability to design your own algorithm is no longer required. You’re using it without knowing – when you shop on the internet and you send your credit card details, or when your GSM phone makes a link with a landline, it’s encrypted.”

He argues that instead of depriving citizens of their rights to privacy, encryption actually makes privacy easier. But it is not entirely a good thing, he acknowledges. “It’s two-sided, that’s the problem. What it does is prevent intelligence gathering by law enforcement in certain situations.” Where is the line drawn between a journalist protecting his sources and a paedophile hiding inappropriate content? The technology is the same in both cases but the goals are clearly different. To Piper, this raises the question of whether we all have the same rights to privacy. Should the police possess technology capable of breaking encryption so they can prove their case? Many might argue that they should in such obvious cases, where it helps catch and convict a paedophile or helps prevent a terrorist attack.

The prospect of that right being abused is an equally legitimate concern. Those who defend the authorities’ access to private information often put forward an argument along the lines of: “If you’ve got nothing to hide, why do you care?” Piper feels that this shouldn’t be the starting point for any discussion of security rights, because the very notion of security is not an objective one and depends to a large degree on people’s points of view. “It ought to be up for discussion,” he states. “There needs to be a debate where the Government makes the case for what it wants and tells us the consequences or the potential consequences of that. Surveillance and intelligence gathering are an important part of law enforcement; the police have a job to do and they are not interested in prying. I’m happy to give up some data to government; my personal view is that I have nothing to hide that’s worth making a fuss about. I do respect the fact that some people really do care and you have to accommodate that as much as you can.”

By Gordon Smith