Intel, which employs more than 4,000 people in Ireland, relies on futurists like Brian David Johnson to deliver a vision of how we will be using technology in the future.
When he’s not writing 1,000-word articles for The Wall Street Journal on his smartphone in airport lounges, painting, or directing feature films, Intel’s Brian David Johnson is obsessed with the future. It’s his job.
A principal engineer in charge of futurecasting at the chip giant, Johnson is tasked with finding out as much as he can about people, society and technology, then developing the data models and a pragmatic vision that help Intel make the products of the future: not just chips for computers, smartphones and servers, but in-car computers and robots for the elderly.
‘You just made a lot of people very uncomfortable’
A case in point is his book Screen Future. Almost a decade ago, Johnson outraged senior engineers when he predicted a time when people would use energy-efficient screens of various sizes, rather than power-hungry PCs, to do their jobs and get their entertainment. In effect, he predicted the rise of devices like the iPad and the smartphone.
His boss, Intel director Genevieve Bell, told him: “You just made a lot of people very uncomfortable. Keep it up.”
Keeping engineers and scientists on their toes is the formula that drives Intel’s multi-billion-dollar revenues. As Intel co-founder Andy Grove famously said, only the paranoid survive.
Last week, Intel reported quarterly revenues of $13.5bn and, in its relentless pursuit of Moore’s Law (the observation that the number of transistors on integrated circuits doubles roughly every two years), the chip giant is about to begin volume manufacturing of products based on 22-nanometre tri-gate technology.
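To put the doubling cadence of Moore’s Law in concrete terms, here is a toy projection in Python. The baseline transistor count is an assumption chosen purely for illustration, not an Intel-published figure:

```python
# Toy illustration of Moore's Law: transistor count doubling every two years.
# The starting count (1.4bn, roughly a 2012-era desktop chip) is an assumed
# figure for illustration only.

def projected_transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward, assuming one doubling per period."""
    return start_count * 2 ** (years / doubling_period)

start = 1.4e9  # assumed baseline transistor count
for years in (2, 4, 8):
    print(f"+{years} years: ~{projected_transistors(start, years):.2e} transistors")
```

On this simple model, a chip eight years out would carry sixteen times as many transistors as today’s, which is why each process shrink matters so much to Intel’s roadmap.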
I ask Johnson how a futurist works. “The process is based on social science first of all, and we go all over the world and study people on every continent except Antarctica. Then we work with the computer science. We try to understand what people want and then figure out what they are likely to do with the technology to make their lives better.
“Most futurists start with the math of the future – GDP, population growth, etc – but I prefer to be constantly in motion, talking to people and creating experience models to figure out what it will be like to live in 2025.”
UX didn’t exist a decade ago
To give an idea of how much has changed in the last decade alone, Johnson notes that the term ‘user experience’ (UX), now a mantra for tech leaders worldwide when it comes to smartphones, branding, social media and digital media, barely existed a decade ago.
“I was one of the guys at Intel who said that the future of computational devices was going to be entertainment. It was hard for guys in a world where you made PCs for the internet and TVs for TV to realise that the future was going to be about a set of screens that did everything – watch TV on, read comics on, talk via video with friends on. And the size of that screen – be it a tablet, a smartphone or an interactive TV – would be dictated by what works for you. The idea wasn’t that we should get obsessed about specific devices, but concentrate on the experience people would want.”
Such predictions of a multiscreen future helped Intel devise system-on-chip (SoC) technology and low-power chips like the Atom processor.
Strangely, for a man whose job is the future, Johnson is reluctant to make predictions.
“Making specific predictions does me no good, in fact I consider it to be intellectually dishonest. If someone makes a prediction, they are usually trying to tell you something.
“I’m judged on creating models and getting the technology right, and that has manifested in things such as SoCs inside set-top boxes in the living room or inside mobile phones.”
Johnson says arguments over whether the PC will kill the TV, or whether the tablet computer will kill the PC, are pointless in today’s multiscreen world.
“There is no one form factor that will win anymore. I travel a lot and I could use an ultrabook or a tablet computer, but most of the time I use my smartphone.
“If I am coming up with a model for what TV will look like in 2015, it is crucial that I talk to different broadcasters and regulators and understand the intricacies of the business, such as copyright regulations.”
He lauds Netflix CEO Reed Hastings for identifying a model for how people will consume content.
“Netflix radically changed the delivery of optical media, it was a massive innovation and completely disrupted broadcasting and broadband.”
Journalism’s golden age
In terms of the media, Johnson believes we are entering a golden age for journalism.
“I’m a big fan of news and journalism. News isn’t going to go away, neither is media, but how it is delivered is changing. Mashable has figured it out. What is changing is the maturation of the business model and how it translates to a future where people have all these different screens to get their content on. Media businesses are still trying to figure it out. But we also need to remember how new all of this is.”
He has a point. The world’s population has surpassed 7bn and, according to the World Bank, some three-quarters of it has access to a mobile phone, with around 6bn mobile subscriptions worldwide. Fourteen years ago, most people didn’t have mobile devices; today, some smartphones are more powerful than the desktop computers of only a few years ago.
Eventually Johnson sees the devices getting smaller, thinner and almost invisible.
“When you talk about silicon architecture, right now we are at 22 nanometres, which is extremely tiny. When you look to 2020, the size of meaningful computational devices could reach almost zero. Moore’s Law will keep going until we get to virtually zero.
“The next big focus for Intel is seven nanometres. When we get to that level, chips will be so small that they can be powered by friction, the heat of your body or the movement of your hand.
“Once you have computation moving to almost zero, it means we can make anything into a computer.
“What I try to do as a futurist is make sense of it, understand how we will interact and what it will feel like to be a human. When we enter the world of almost zero-sized computers I think computing will become more human and instinctive, it won’t be command and control like it is today.
“People, when surrounded by computational intelligence, will have a personal relationship with their computers. Your devices will know you as you move through your life.
“Imagine if your device knows when you go into a pub and will not allow you to post onto Facebook.”
The best from tech
But rather than fearing a dystopian future of machines controlling our lives, Johnson says the key is to get the best out of technology.
“Fear sells, fear is part of our brains. But there is so much we can do with computational intelligence as we age in our homes, for example.
“For me, a robot is a laptop with legs. But what if we put that technology to good use and make it a part of a healthcare system that allows people to stay in their homes and age without having to go into a nursing home?
“I’m as big a fan of science fiction as anyone, but if we want to get down to the business of knowing the future we need a different kind of narrative.
“We need to look at how technology can transform our futures so we can lead productive lives. What I tell people who are a little frightened by the future and by technology is: do not be passive. You cannot just let the future happen to you. The future is made by the everyday actions of people.
“For example, when people ask me what they can do for their future in this time of accelerated innovation, I just tell them: ‘Learn how to code.’ I tell them this because it gives them an understanding of how things work; it doesn’t mean they have to become programmers.”