Computing the future


28 Nov 2002


Computing and futurology have been bed partners for years, with every gadget prompting a new round of anticipation of the possibilities it might open up.

One of the basic facts of IT business life is that even where there is consensus, common practice and standards, the speed of adoption varies widely and, more importantly, the timing is spread over years.

Some organisations today have moved on to Windows XP, while many small – and indeed not so small – organisations are still happily running Windows 98 and the second-to-last edition of Office and other applications.

PC speeds are up to 3GHz on an Intel Pentium 4 chip, but plenty of four-year-old Pentium II PCs are still in service. That newish and probably now permanent cliche ‘the device of choice’ ranges from desktop PCs to sleek titanium-shell notebooks to PDAs (personal digital assistants) to the new tablet PCs. Unix and its talented sibling Linux are making strides in the corporate world for all sorts of reasons, but principally because they are equal players in the internet world.

The internet is the one great leveller in this whole heterogeneous mix of operating systems, applications, ‘legacy systems’, gadgets and gizmos with go-faster stripes – anything that does not speak passable IP (internet protocol), HTML (hypertext markup language) or XML (extensible markup language) is definitely out.

But it is all going further than that: the entire computing and telecommunications universe is moving towards open systems, in which every element, hardware or software, must potentially interoperate with everything else.

Down at our human-user level, we really should not need to know or care how all of this is accomplished by the technology. We will simply work through our applications, choosing and entering what we see onscreen. Because our interface will be what we currently call a browser, the front end of almost all applications will look and feel the same, probably tailored to personal preferences, so minimal training will be required.

The phrase ‘commodity computing’ has been loosely used to describe the next phase, when it will all be so powerful and widespread that we will take it for granted. ‘Pervasive computing’ is similarly used to suggest the ways in which all sorts of things will be to some degree computerised and internet-linked. The new reality of mobile users and seamless access by wireless in locations like airports, convention centres and hotels is growing rapidly because of its obvious value.

Liz Dempsey of Hewlett-Packard’s personal systems group points out that all of the company’s products are wireless enabled. “Users will choose what style of device they want or need, which may be different at different times, to access the resources they require,” she says.

But portability and sexy new appliances are in some ways a distraction from the core value: business will come to rely on what we will call ‘any and always’ computing. That means access to applications or data through a service that is always on, always available, any time, anywhere.

At the organisation’s end of things, this will increasingly mean centralising data, according to Colm McVeigh, sales director of Oracle in Ireland. “A database today is a living, pulsating thing with transactions and interactions going on all the time. We are also trying to make the consolidated knowledge in an organisation permanent and coherent and available to the next user who comes along. The types of data include email, documents, voice and video as well as transactions. The complexity and the power required now come in managing all of that. So, there are powerful technical and cost arguments for putting all of that on a single platform in one place, especially for reliability, security and fault-tolerant availability,” says McVeigh.

His argument is that the software architecture is what ensures the ‘any and always’, regardless of the access device.

‘Device agnostic’ is the descriptive phrase used by Mru Patel, head of desktop solutions for Sun Microsystems in Britain and Ireland. “Just as your always-on dial tone means you can phone anyone, so a web tone will mean that you can access any device with an IP address anywhere in the world. The complexities above that are something the carrier networks can fight over. With open systems, it will not matter what applications you are using any more than it matters now what operating system or email client your device runs or whether your connection is wired or wireless,” says Patel.

He also points out that it will not matter if your device is way behind the current smart model. “Most users, say 85pc, need only the same set of quite orthodox tools such as email, word processing, access to screens of information and so on. Those things can now be web-based from any appropriate device and the savings can, in turn, be used to give more specialist users the powerful or special devices they need,” he adds.

The thinking is moving inexorably towards distributing the computing, not the computers. For the actual number-crunching processing power required, IBM is putting its bet on peer-to-peer or distributed grid computing that harnesses the sum of all the hardware capacity.

“For years now, we have all been looking at individual PCs with their processors totally under-utilised and their hard disks half empty,” says John Scully, head of international services at IBM Ireland. “The next step has to be a computing utility that can respond to local surges in demand wherever and whenever they occur. Traditionally, organisations built up their own IT infrastructure with specially written application software or tailored versions of packages. But with the on-demand utility model they can just invoke and use what they need, when they need it. Most importantly, they can pay accordingly and avoid capital expenditure by going to a usage-based and outsourced model,” he adds.
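The grid idea Scully describes can be sketched in a few lines: one big job is carved into chunks and farmed out to whatever workers are idle, then the partial results are combined. This is a toy illustration only (threads stand in for remote peers; the names and figures are invented, not IBM's actual grid software):

```python
# Toy sketch of grid computing: split a job into chunks, farm them out
# to a pool of workers, and combine the partial results. Threads stand
# in for remote peers; a real grid would schedule across machines.
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    # Stand-in for a CPU-heavy unit of work handled by one peer.
    return sum(x * x for x in chunk)

def run_on_grid(data, workers=4, chunk_size=1000):
    # Carve the data into chunks; whichever workers are idle pick
    # them up, much as a grid scheduler would across the network.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(crunch, chunks))
```

The answer is the same whether the chunks run on one PC or a thousand; the utility model simply meters how much work you submitted.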

Scully uses the telling example of payroll: “It is actually run 12 times a year or maybe 52. It involves complicated rules and annual changes. In the new model, we can retain our employee and corporate data securely, but process it as required through a utility ‘Irish payroll’ software application that will always be up to the minute. Paying for each run will always be cheaper than investing in our own copy with annual upgrades.”
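Scully's cost argument can be made concrete with a back-of-envelope comparison. All figures below are invented for illustration; they are not IBM or utility pricing:

```python
def yearly_cost_per_run(price_per_run, runs_per_year):
    # Utility model: pay only for the runs you actually make.
    return price_per_run * runs_per_year

def yearly_cost_owned(licence, annual_upgrade, years=1):
    # Ownership model: licence cost amortised over its life,
    # plus the annual upgrade fee.
    return licence / max(years, 1) + annual_upgrade

# Hypothetical figures: 150 per monthly payroll run, versus a
# 10,000 licence amortised over five years with 2,000-a-year upgrades.
utility = yearly_cost_per_run(150, 12)       # 1,800 a year
owned = yearly_cost_owned(10_000, 2_000, 5)  # 4,000 a year
```

Under these made-up numbers the pay-per-run model wins; the real point is that the comparison is simple arithmetic once usage is metered.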

One key to all of this, of course, is the telecommunications infrastructure to pull it together. That ranges from broadband to portable wireless devices, whether connected by 3G or WiFi services. “The kernel will be the final convergence of voice and data – in a few years we won’t even think about it any more – and live, instantaneous communications,” says Garry O’Callaghan, enterprise business manager of Siemens Ireland. “To a fair degree, movement towards the vision has been held back because of the crisis in the telecoms industry worldwide and developers will be slow to go to market until the network is there. But, making data available anywhere, any time is what we are building devices and systems for and there is plenty of scope for different designs and solutions for different needs and users,” he says.

O’Callaghan insists that the specific tools are irrelevant: “We use what is available. Videoconferencing is clearly a potential advance on the phone, because we humans communicate so much better with face and gesture combined with speech. Even as reasonable videoconferencing becomes a reality, and we look forward to videophones, we are already thinking ahead to 3D video holograms. But the phone and the mobile phone are still the champions because they work. They are here and part of our culture.”

Teleworking is another example that is here now because it is possible and practical, but when the full broadband communications and computing tools are available and ubiquitous, we can foresee a new work culture. “I might work for a local company in the morning and a Californian outfit in the afternoon or evening,” O’Callaghan says, “and the idea of loyalty to an ongoing permanent employer might begin to be old-fashioned. In such a labour market, employers might have to compete harder and pay more for skills, knowledge and availability.”

PricewaterhouseCoopers has been in the business of looking ahead for well over a decade. Bob Semple, partner in charge of global risk management solutions, has one final word of warning. “In this more open electronic world, accountability and trust must be managed at all times and the systems must be always-on and instant like their host environment. Audit trails, action logging and ‘non-repudiation’ are essential elements and digital certificates will be needed for a range of transactions. All in all, we will have to mature greatly into a new age of electronic security,” he says.
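The ‘audit trails and action logging’ Semple calls for can be sketched as a hash-chained log, in which each entry commits to the one before it, so any later alteration breaks the chain and is detectable on audit. This is a simplified, standard-library-only illustration, not a product he describes; full non-repudiation would additionally need the digital signatures and certificates he mentions:

```python
import hashlib
import json

def append_entry(log, action, user):
    # Each entry records the hash of the previous entry, chaining
    # the whole log together for tamper-evidence.
    prev = log[-1]["hash"] if log else "genesis"
    body = {"action": action, "user": user, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    log.append(body)
    return log

def verify(log):
    # Walk the chain, recomputing each hash; any edited entry or
    # broken link makes verification fail.
    prev = "genesis"
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

Editing any logged action after the fact changes its hash, which no longer matches what the next entry recorded, so the audit fails.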

By Leslie Faughnan