Virtually speaking


29 Dec 2008

With virtualisation use growing, is it time it became part of your organisation’s IT strategy? We take soundings from the industry

The IT industry is used to big corporations setting the trends and expecting the rest of the sector to take their hand and be pulled through into the next phase, iteration or development. With servers and server management, the reverse may well be the case, as small to medium-sized enterprises are taking a leading role in demanding that their servers do more, run more efficiently, consume less power and so on. With SMEs being the predominant business culture in Ireland, the industry can expect to see shifts in the types of servers that become available, how software is licensed and how cooling and power are managed, as vendors seek to satisfy this most important and lucrative client base.

Of course, it is not just SMEs looking for greater efficiencies; large corporates are also endeavouring to cut their IT costs, but it is an interesting change that smaller companies – on cursory evidence at least – are pulling the strings on this one. One of the key technologies in the ascendancy for the past few years is virtualisation. It has been hyped and hoorayed, not only by vendors and implementers, but also by researchers, analysts and other industry experts.

And with good reason. According to recent research from IDC, some 54pc of businesses not using virtualisation expect to do so in the next 18 months. The study also observed that while both large companies and smaller organisations see virtualisation as key to their data centre strategy, their next challenge will be to make more effective use of this capability. “Virtualisation use has exploded since our last survey of the European market,” says Chris Ingle, consulting and research director with IDC’s Systems Group. “Both large organisations and smaller businesses are using the technology for a wider range of applications and for business-critical projects. As use of virtualisation grows, the challenges around managing complexity, finding skills and software licensing become more apparent.”

IDC’s study also revealed that organisations are increasing their virtualisation of x86 systems for core business applications, although the majority of virtualisation is still for test and development and for network server applications. Growth of virtualisation as a strategy is still strong, rising from 46pc of the base to 54pc.

What is interesting is that virtualisation is growing as a data centre strategy in itself, rather than as part of other projects. This supports the view that virtualisation is increasingly seen as a standard for a wide range of workloads. And it’s something that Fran McNally of A&O Systems & Services agrees with. He says businesses need to recognise that virtualisation technologies have uses outside of the traditional server and test environments. According to McNally, they can also be implemented in the desktop environment to centralise and therefore reduce desktop administration, and to further reduce costs by using dumb terminals or open source clients to access centrally located processing power. VMware’s VDI and ACE products are good examples of this, he says.

As a result of the reduced physical footprint and lower power consumption of a virtualised environment, it can often make financial sense for companies to have their virtualised systems located in third-party hosted facilities, rather than maintaining their own data centres. McNally points out that locating a company’s virtual environment in a hosted data centre can provide savings because the company no longer has to physically support its own server environment or the equipment associated with it, which typically includes air conditioning units, fire suppression systems, UPS and comms equipment.

“Remotely hosted virtual environments can also be used as part of a disaster recovery solution, where the remote virtual environment can be used as the failover system. This process can even be automated by using tools such as CA’s WAN Sync or a Microsoft geographically dispersed cluster,” says McNally.
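
Neither of the named products is required to grasp the principle. Purely as a hedged illustration of automated failover – not a sketch of CA’s or Microsoft’s tooling – the open source libvirt API can be scripted to watch a primary site and power on a standby copy of a virtual machine at a recovery site when the primary stops responding. The host names and VM name below are invented for the example.

```python
# Illustrative failover sketch only - not CA WAN Sync or a Microsoft
# geographically dispersed cluster. Polls a primary host and, if it
# stops answering, starts a replicated standby VM on a recovery host.
import socket
import time

import libvirt  # open source virtualisation API; assumed available

PRIMARY_HOST = "primary.example.com"                    # hypothetical name
RECOVERY_URI = "qemu+ssh://dr-site.example.com/system"  # hypothetical DR host
VM_NAME = "finance-app"                                 # hypothetical standby VM

def primary_is_up(host, port=22, timeout=5):
    """Crude health check: can we open a TCP connection to the primary?"""
    try:
        with socket.create_connection((host, port), timeout):
            return True
    except OSError:
        return False

while True:
    if not primary_is_up(PRIMARY_HOST):
        conn = libvirt.open(RECOVERY_URI)
        dom = conn.lookupByName(VM_NAME)   # standby copy, already defined
        if not dom.isActive():
            dom.create()                   # power on the failover VM
            print("Primary unreachable - failover VM started at DR site")
        conn.close()
        break
    time.sleep(60)                         # check once a minute
```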

Virtualisation can also free companies from dependency on legacy hardware, whereby a particular application may be tied to a particular legacy OS (operating system) that is in turn dependent on legacy hardware. “With virtualisation, you can break this underlying dependency,” McNally states.

To implement a virtualisation strategy, McNally recommends that companies acquire a unified, end-to-end solution, based on a powerful, reliable and scalable IT infrastructure. “Provisions need to be made for future growth of the virtual environment, both for individual systems’ growth and for adding additional systems. This growth could consist of additional storage, CPU power or bandwidth,” he says. As for boosting the power of an organisation’s IT setup, by separating the layers of the computing stack a virtualised environment makes it possible to deploy new capabilities quickly without having to configure individual components. “In a virtualised environment, testing requirements and application compatibility issues are reduced, processes are easier to automate and disaster recovery is easier to implement. In addition, new software releases can be tested and rolled out with greater confidence than ever before.”

Fran Ryan, enterprise solutions manager for Commtech, says server virtualisation will enhance the way companies look at their computer infrastructure and will also reduce the server footprint, which will lead to better use of the remaining machines. “It’s a fundamental shift for a lot of people on how they think about key application deployment and also in their purchasing habits with their server vendor.”

According to Ryan, server provisioning and deployment is now as simple as “a couple of mouse clicks” to create a new machine. This, he says, removes much of the case for the traditional server acquisition process – seeking quotes, arranging finance, ordering and implementation. “This level of dynamism was unheard of before virtualisation. The server now becomes a more dynamic asset that can be moved around, backed up and copied far more quickly and vigorously than in a traditional hardware environment.”
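
Those “couple of mouse clicks” can just as easily be a short script. Purely as a hedged sketch – using the open source libvirt API rather than any particular vendor’s console – a new virtual machine can be defined and powered on programmatically. The machine name, sizing and disk image path below are assumptions made for the example.

```python
# Minimal sketch of scripted VM provisioning via libvirt (QEMU/KVM).
# The VM name, sizing and disk path are illustrative assumptions.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>web-test-01</name>
  <memory unit='MiB'>2048</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/web-test-01.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
    <interface type='network'>
      <source network='default'/>
    </interface>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)        # register the new machine
dom.create()                            # power it on
print("Provisioned and started:", dom.name())
conn.close()
```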

While blade servers have been the belle du jour of the server hardware world for some time, Ryan feels even they will feel the pinch from virtualised machines. Pointing out that most virtualisation projects are based around reducing server numbers, Ryan adds: “Suffice to say, virtualisation has superseded the rackable-to-blade migration because – with respect to the hardware vendors – they would love to see you do a one-for-one refresh of all the servers forever.”

So has server-based computing beaten the heavy client model? Ryan believes it ultimately depends on the applications. He admits that in many cases heavy clients were an absolute necessity, due to workload or the type of job the IT infrastructure was being asked to do. “Traditionally speaking, the average office worker is probably using a machine that draws more resources than necessary. The typical client would be a laptop with 2-3GB of memory running XP or Vista, and there is a debate that if you’re running three or four key applications, why can’t that also be run on virtual machines on a thin client? A desktop can be virtualised in the same way as a server and the user can then log on and request his virtual PC.”

So does this development suggest a return to dumb terminals? “Let’s stay away from that phrase, it upsets too many people,” Ryan jokes. “But there is a cultural change to go through here for users to rebalance their requirements, based on what they do on their machine with what it is capable of. For instance, a standard PC runs on a 350-400 watt power supply. Typically, only 25 watts would be needed for the apps they run.”

John Casey, sales manager at Datapac, says a virtualisation strategy all depends on the organisation and what it wants to achieve. In his experience, some companies are using virtualisation for legacy systems, while others are using it to reduce the number of servers in the organisation. He also believes virtualisation has kicked demand for blade servers into gear, as companies move away from ‘brick’ servers in the tower and rack formats. “HP and IBM have been talking about the amount of blades shipped, which they say is now above 50-50 … I think it’s around 60-40 in favour of the blades. So that is a big increase,” says Casey.

According to Kevin Reid, business development manager at Sureskills, the move to server-based computing has always posed classic questions – does the company go client/server or three-tier? Where does the application sit between where the processing takes place and where the client is, and so on? “We are really moving back to a mainframe-like model where you’re using the client purely as a presentation layer, and whether that’s web-based or a thin client, it doesn’t matter because it’s much more presentation focused than processing focused. Whether you use a two-tier model at the backend for app processing and data storage or a single tier combining the two, it doesn’t matter, but there is a very strong move towards this setup.”

Reid has an interesting take on the different players in the server market, seeing little difference among them. He believes the problem with vendors trying to distinguish themselves is that all the manufacturers are developing products that are essentially very similar. “They are trying to hit a density point, performance point and cost point, so the differentiators are very slim. Even the management products that wrap around them – HP and Dell license the same technology from Altiris, so they’re not even able to differentiate in that area. Therefore, a lot of it comes down to their supply chains – how do they manufacture? How do they provision? How do they deliver? And then of course, cost. The main players would be HP and Dell. There’s also IBM, but I think they’ve slipped back a bit recently.”

Reid also believes outsourcing will play a key role in future server management paradigms. “I’m a strong believer in outsourcing management tasks or a lot of skills that are difficult to maintain in small and medium organisations. Once you get into the larger companies, it’s easier to cover the tech requirements. Most organisations should be looking at outsourcing specific skilled tasks where they are not going to be able to maintain the skill set or the knowledge base.”

Rowan O’Donoghue, director of innovation and development at Origina – formerly known as Unitech – says the main virtualisation player from his perspective would be IBM on the Unix platform. According to O’Donoghue, the dominant player in the Intel environment would be VMware, with Microsoft snapping at its heels, along with Citrix and its XenServer portfolio. “Microsoft will become a bigger player eventually … maybe the functionality of its software is not as developed as VMware’s. For instance, one of the benefits of VMware is live migration from one server to the other and it can do that in a few seconds, whereas with Microsoft’s products, it’s a couple of minutes. But it is very early days for Microsoft,” he suggests.
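
For readers curious what live migration looks like outside a vendor console, the following is a hedged sketch using the open source libvirt API rather than VMware’s or Microsoft’s own tooling. The host names and VM name are assumptions, and shared storage between the two hosts is presumed.

```python
# Illustrative live migration of a running VM between two hosts using
# libvirt - not a walkthrough of any specific vendor's product.
import libvirt

SOURCE_URI = "qemu+ssh://host-a.example.com/system"   # hypothetical hosts
DEST_URI = "qemu+ssh://host-b.example.com/system"
VM_NAME = "erp-frontend"                              # hypothetical VM

src = libvirt.open(SOURCE_URI)
dst = libvirt.open(DEST_URI)

dom = src.lookupByName(VM_NAME)
# VIR_MIGRATE_LIVE keeps the guest running while its memory is copied
# across; only a brief pause occurs at the final switchover.
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
print(VM_NAME, "is now running on", DEST_URI)

src.close()
dst.close()
```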

Origina provides outsourcing services to its client base and O’Donoghue sees that trend continuing, though he adds that it would depend on how much of a pain point it is for customers. “Virtualisation should limit the amount of admin and overheads you have, but depending on the environment, the customer may wish for their staff to be more focused on projects rather than day-to-day administration. We see it as a continuing trend in that space where they are either looking for data-hosting companies to manage their servers in data centres or smaller niche companies to provide the service remotely. And with that they can remain focused on the true value to the business’s bottom line,” he concludes.

By Eamon McGrane