Most of us know at least one person who gets great pleasure from winding up friends and colleagues by making a statement or professing an opinion primed with shock value and uttered purely in the hope of getting a reaction. Needless to say, it usually works.
Nicholas Carr, editor of the Harvard Business Review, has managed to pull off this trick, and much of the technology community in the US has taken the bait. Earlier this summer his prestigious publication published an article of his carrying the provocative title 'IT Doesn't Matter'.
According to Carr, information technology has become a commodity in the same way as electricity or railways did in preceding centuries: he argues that it’s getting harder and harder to use technology — which he defines as the processing, storage and transmission of data — to gain any kind of edge over competitors. As IT has progressed, so has it become standardised. The core functions of IT “are becoming costs of doing business that must be paid by all but provide distinction to none”, he says. Carr draws up a new creed for IT management based on three main planks: spend less; wait for products to become cheaper before making an IT purchase; and focus on risks to the system rather than on opportunities to improve it.
At a certain level, what he has to say is true: there are elements of the computing infrastructure that are now commodities, PCs being the obvious and most widespread example. Methinks Carr has his tongue wedged firmly in his cheek, though, when he questions the need to write your own supply chain management software when ready-made, off-the-shelf alternatives are available. While it's true that one bean counter is just as likely as the next to be using an Excel spreadsheet to sketch out a rough picture of his business, the further up the software food chain you go, the less things resemble the commodities Carr would have us believe they are.
Why else do all of the major business software vendors employ armies of consultants and partners to help implement their products? Because no two organisations are alike and one size does not fit all. The notion that this end of the market is commoditised is stretching a point.
And as for web services? Pull the other one. Yes, it’s true, these software components are based on agreed industry standards (and are therefore a commodity in Carr’s eyes), but they’re nowhere near as ubiquitous as Carr would have us believe. To Carr’s way of thinking, IT is a cost to the business — as much as 50pc of capital expenditure, he says — and one that now needs to be tightly controlled and even reduced. “There is good reason to believe that companies’ existing IT capabilities are largely sufficient for their needs,” he writes. I can think of a few IT managers laughing up their sleeves at that one.
Most people responsible for major information systems could reel off a long list of features they would love to add to their applications, but which will forever remain in the ‘nice to have’ column. At the coalface, it’s make do and mend.
The fact is, most organisations are already spending less — industry analyst figures support this. True, it’s a sign of the times, but three years ago, when tech spending went skyward, this part of Carr’s argument might have been visionary; now it’s just trite. More often than not, IT is actually used by organisations as a way to reduce costs where possible, as well as allowing them to function more efficiently.
It's true that in a lot of organisations, some servers are doing much of the work while others lie idle. Many businesses have more printers than they really need. Wherever there is an opportunity to cut down on the number of physical machines, there is a compelling case for doing just that. But simply spending less is a miserly way of going about it.
There is a view abroad in the industry that the best way of consolidating systems is to port software applications onto entirely new hardware: fewer servers than before, each capable of taking on more of the work than its older counterparts could. As part of such an upgrade, organisations may opt to standardise on a single operating system, further reducing complexity and, so the argument goes, simplifying management. But it first requires outlay on new hardware that is up to the task.
While we're on the subject of snappy statements, "you've got to spend money to save money" is one that seems more appropriate to the discussion at hand. To put things into a local perspective, in Ireland we're not as far along the technology curve as many of the US organisations Carr talks about; few would deny that many of our public services are long overdue the kind of shot in the arm that IT is best placed to supply.
The worst outcome of Carr's spot of kite-flying would be if mandarins in some government department filled their heads with foolish notions that technology need not be at the forefront of Ireland's development. On the contrary, IT has never mattered more than it does right now. Whether or not you agree with Carr's thesis, it's fair to say he has sparked a debate worth engaging in, and maybe that was the idea all along. Sometimes those annoying friends have their uses after all.
By Gordon Smith