How a peculiar Android game sent Jason Hong down a privacy rabbit hole


4 Apr 2018

Jason Hong, associate professor in the Human-Computer Interaction Institute at Carnegie Mellon University. Image: CMU

After discovering that a pre-installed game on Android asked for some peculiar permissions, Carnegie Mellon’s Jason Hong set out on a research mission.

The recent Cambridge Analytica scandal has made millions of people around the world hyper-sensitive to the data their phones are collecting, especially when it comes to the permissions that apps request.

This has led to questions such as: why does an innocent-looking mobile game need permission to access our contacts and microphone?

While the issue has only recently entered the public consciousness, Jason Hong, associate professor in the Human-Computer Interaction Institute at Carnegie Mellon University (CMU), was already looking into the problem in 2010, when Google approached him to analyse its Android operating system.

As an undergraduate, Hong studied physics at Georgia Tech before deciding to make the switch to computer science.

He went on to do a PhD at the University of California, Berkeley, in the early 2000s before joining the faculty at CMU, where he has continued to focus on digital privacy, security and the internet of things (IoT).

What inspired you to become a researcher?

I thought I was going to be a programmer in Silicon Valley, but instead I accidentally fell into research.

It was my sophomore year at Georgia Tech, and I was taking a software engineering course with Prof Gregory Abowd. I did well in the course, and he asked me afterwards if I wanted to help him out that summer on some research. I didn’t have any internships lined up at that point, and so I joined his research group.

At that point, most of the classes I had taken felt too structured, but here I was being asked to help figure out solutions to problems that no one else in the world was thinking of. It was a really fun experience; I felt like I could really make a positive difference in the world with my work.

Can you tell us about the research you’re currently working on?

One of our bigger projects is improving smartphone privacy. It started with Google sending some Android smartphones to my research group around 2010.

I was trying out some apps, and found it odd that the Blackjack game said it used GPS location data. It turned out that a lot of apps accessed location data for unclear reasons, with some aggressive apps accessing other sensitive data such as the contact list and microphone.

PrivacyGrade.org, one of our research projects, assigns a grade to Android apps based on a model we’ve created of people’s expectations. We had lots of people fill out online questionnaires about their expectations, created our privacy model and then applied that model to about a million smartphone apps.
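To make the idea concrete, here is a rough sketch in Python of how a grade could be derived by comparing what users expect an app in a given category to access against the permissions it actually requests. This is not the actual PrivacyGrade model; the expectation scores, permission names and grade thresholds below are invented purely for illustration.

```python
# Illustrative sketch only -- not the actual PrivacyGrade model.
# The idea: compare surveyed user expectations for each (app category,
# permission) pair against what the app actually requests, and map the
# average mismatch to a letter grade. All numbers here are invented.

# Hypothetical expectation scores (0 = nobody expects it, 1 = everyone does),
# as might be gathered from questionnaires about apps in the "game" category.
EXPECTATION = {
    ("game", "INTERNET"): 0.9,
    ("game", "ACCESS_FINE_LOCATION"): 0.2,
    ("game", "READ_CONTACTS"): 0.1,
    ("game", "RECORD_AUDIO"): 0.1,
}

def privacy_grade(category: str, requested_permissions: list[str]) -> str:
    """Map the average 'surprise' of an app's permission requests to a grade."""
    if not requested_permissions:
        return "A"
    # Surprise = how unexpected each requested permission is for this category.
    surprises = [
        1.0 - EXPECTATION.get((category, perm), 0.5)
        for perm in requested_permissions
    ]
    avg = sum(surprises) / len(surprises)
    # Arbitrary thresholds, chosen only for the example.
    if avg < 0.25:
        return "A"
    if avg < 0.5:
        return "B"
    if avg < 0.75:
        return "C"
    return "D"

print(privacy_grade("game", ["INTERNET"]))                       # A
print(privacy_grade("game", ["INTERNET", "ACCESS_FINE_LOCATION",
                             "READ_CONTACTS", "RECORD_AUDIO"]))  # C
```

The real project built its expectation model from large-scale questionnaires and applied it to roughly a million apps, but the basic shape of the comparison (expected versus actual data access) is what the sketch tries to show.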

Our ongoing research looks at what we call ‘purposes’. We want app developers to add a short purpose describing why they are requesting sensitive data, which can then be used to improve user interfaces that explain what the app is doing.

We are also developing a suite of techniques to check that those purposes are correct. For example, if an app says that it is using your location data for searching for nearby things, but we detect that it is also using your data for advertising, we can flag that and alert app stores and users.
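A minimal sketch of that checking idea, assuming a hypothetical declaration format, purpose labels and detection results (a real system would derive the detected uses from static or dynamic analysis of the app, which is not shown here):

```python
# Illustrative sketch of the purpose-checking idea described above.
# The declaration format, purpose labels and "detected" uses are all
# hypothetical; a real system would derive the detected uses from
# analysis of the app itself.

# Purposes the developer declares for each type of sensitive data.
declared = {
    "location": {"nearby_search"},
    "contacts": {"find_friends"},
}

# Uses of that data detected by (hypothetical) program analysis.
detected = {
    "location": {"nearby_search", "advertising"},
    "contacts": {"find_friends"},
}

def undeclared_uses(declared: dict[str, set[str]],
                    detected: dict[str, set[str]]) -> dict[str, set[str]]:
    """Return detected uses of each data type that were never declared."""
    flags = {}
    for data_type, uses in detected.items():
        extra = uses - declared.get(data_type, set())
        if extra:
            flags[data_type] = extra
    return flags

for data_type, uses in undeclared_uses(declared, detected).items():
    print(f"FLAG: {data_type} also used for {', '.join(sorted(uses))}")
# Prints: FLAG: location also used for advertising
```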

In your opinion, why is your research important?

There’s a growing debate about privacy around the world. Modern technologies make it possible to capture and model people’s behaviours at a fidelity and scale that has never before been possible.

On the one hand, this data can be used for a lot of good, such as healthcare, urban planning, sustainability and safety. On the other hand, this same data can be abused in many ways.

Unless we can find ways of addressing people’s legitimate privacy concerns, we will never be able to harness the full power of all of the amazing technologies that are being developed.

What commercial applications do you foresee for your research?

Right now, none directly. I’ve found that work in privacy is hard to commercialise.

Everyday end users haven’t proven willing to pay for privacy in the past. On the other hand, corporations are becoming more interested in privacy due to regulatory reasons, but it’s not clear that the market potential is large enough that a start-up could grow into a medium or large company.

I think the larger application for this kind of research is in influencing industry and public policy. Elements of our past work have influenced folks at the Federal Trade Commission and Google, and we know that folks at Apple and Amazon know about our work, too.

What are some of the biggest challenges you face as a researcher in your field?

Right now, I’m facing two big challenges.

The first is research funding. Government and industry funding is becoming harder to secure due to smaller pools of money, increasing competition and shorter time horizons. I tell our PhD students who are seeking to be faculty that funding really is the worst part of the job.

The second big challenge is time. I teach one course, manage a research team of about 10 people, serve on editorial boards for journals, review a few papers each month, help with steering the direction of my department, and travel to conferences and research group meetings several times a year, on top of being the father of a young daughter. It’s been really tough to strike a work-life balance.

Are there any common misconceptions about this area of research? How would you address them?

Privacy is more than just hiding information. Privacy is all about our relationships – with other people, with corporations and organisations, and with governments.

The exact same action might be deemed creepy in one context but fine in another. For example, imagine a stranger following you around and asking intrusive questions. Now imagine a close friend or a parent doing the same.

Another common misconception is that today’s youth don’t care about privacy; they do. It’s just that they have different notions of privacy and different ways of managing it.

What are some of the areas of research you’d like to see tackled in the years ahead?

I’m really fascinated by the emerging IoT. We can now weave computation, communication and sensing into our physical world.

But the early forms of these technologies are leading to a host of problems with privacy and security. If we’re not careful, we’re going to cause some serious damage to society.
