Ten years on: a virtualised world.
In January 2000, the world let out a collective sigh of relief as we were delivered from the frenzy of activity that came to be known as Y2K (and the bonanza it created for COBOL programmers). As the non-event became apparent, Microsoft released a whole range of products to the industry, including the Windows 2000 family, also known as NT 5.0, among them Windows 2000 Server.
Now, 10 years on, the server industry has evolved significantly, and as we approach the end of support for Windows 2000 Server on 13 July 2010, it is interesting to look back at the lessons learned and how servers have evolved to meet the growing demands of business needs, regulations and industry standards.
Factors such as 64-bit support, the emergence of dual- and then multi-core processors and explosive data growth have all shaped the evolution of server platforms over the last decade. More crucially, we have seen a growing emphasis on integrated server infrastructure software, accompanied by a drive to lower operating costs; the latter in particular has pushed virtualisation to the fore.
Virtualisation comes of age
It is no surprise that the pace of virtualisation has been rapid in the downturn.
The 2009 IDC IT Trends report for Ireland cited cost saving as the key driver for virtualisation in 65pc of cases. Yet while virtualisation is absolutely about enabling cost savings on one level, its drivers are broad and diverse, ranging from power consumption and server utilisation concerns to enabling disaster recovery and business continuity.
The transformative effect of virtualisation on IT and business goes beyond cost savings: today’s vision for the data centre is a shared pool of computing resources that can ebb and flow between applications as dictated by usage demands and business need.
Key to enabling this vision is virtualisation mobility – the ability to migrate running virtual machines from one physical host to another with downtime measured in milliseconds and no perceptible service interruption.
By combining virtualisation mobility with a comprehensive management solution, all sorts of possibilities open up, including consolidation of workloads onto fewer servers as demand allows and, ultimately, automated management, monitoring and failover of workloads.
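To illustrate the consolidation idea, here is a minimal first-fit-decreasing sketch in Python that packs virtual machine workloads onto as few hosts as capacity allows. The VM names and capacity figures are hypothetical illustration values; real placement engines also weigh memory, storage, affinity rules and headroom for failover.

```python
# Minimal first-fit-decreasing sketch: pack VMs onto as few hosts as possible.
# VM names and the capacity figure below are hypothetical illustration values.

def consolidate(vm_loads, host_capacity):
    """Assign each VM (name -> CPU demand) to a host index, opening a
    new host only when no existing host has room for the workload."""
    hosts = []       # remaining free capacity per host
    placement = {}   # VM name -> host index
    # Place the largest workloads first; this usually needs fewer hosts.
    for name, load in sorted(vm_loads.items(), key=lambda kv: -kv[1]):
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] -= load
                placement[name] = i
                break
        else:
            hosts.append(host_capacity - load)
            placement[name] = len(hosts) - 1
    return placement, len(hosts)

vms = {"web1": 30, "web2": 25, "db1": 60, "batch": 40, "mail": 20}
placement, host_count = consolidate(vms, host_capacity=100)
print(f"{len(vms)} VMs consolidated onto {host_count} hosts")  # prints "5 VMs consolidated onto 2 hosts"
```

Combined with live migration, the same calculation can be re-run as demand changes, with under-used hosts drained and powered down.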
While concerns about utilisation, hardware maintenance and upgrades don’t disappear, it is a first step towards utility computing for the enterprise, much as public cloud computing promises to deliver at the other end of the platform continuum.
So what now?
So what of Windows 2000 Server – or indeed ageing server operating systems in general – in an increasingly virtualised world?
One of the benefits of virtualisation is that it can remove your exposure to ageing hardware, with backup and restore reducing, for example, to management of a VHD file (a simplified view of the reality, but more straightforward than backing up the physical machine).
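As a simplified illustration of that point, the sketch below copies a virtual disk file to a timestamped backup using only the Python standard library. The file names are hypothetical, and a production approach would first quiesce or snapshot the virtual machine so the VHD is in a consistent state.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_vhd(vhd_path, backup_dir):
    """Copy a virtual disk file to a timestamped backup location.
    Simplified sketch: a real backup would first quiesce or snapshot
    the running VM so the VHD contents are consistent."""
    src = Path(vhd_path)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"{src.stem}-{stamp}{src.suffix}"
    shutil.copy2(src, dest)  # copies contents and preserves file metadata
    return dest

# Example usage (paths are illustrative):
# backup_vhd("servers/legacy-w2k.vhd", "backups")
```

The same file-level simplicity applies to restore: the VHD is copied back and attached to any compatible host, regardless of the original hardware.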
For those servers still running Windows 2000 Server – due to the perceived challenges of upgrading, custom applications or a dependency on now-unsupported third-party applications – it is perhaps no surprise that many organisations turn to virtualisation to prolong the lifespan of the server OS beyond its normal support life cycle. The Windows 2000 Server end-of-life event thus becomes an interesting milestone, raising concerns of security, compliance, cost and indeed performance.
What happens after 13 July 2010?
Beyond 13 July 2010, customers who remain on Windows 2000 Server will be running their organisations on a server OS no longer supported with security hotfixes, patches or service packs.
Aside from the obvious security angle, from an IT governance and regulatory perspective this is a risk to the business, and it poses an interesting question for organisations subject to compliance regimes, many of which (such as PCI DSS 6.1(b), ISO 9000, ISO 27002 and EU Data Protection Directive 95/46/EC) require patched and up-to-date operating systems and applications.
Aside from assessing the immediate impact to the business if there is a problem with the unsupported environment, what about the potentially further-reaching impact from a compliance perspective?
In organisations subject to SOX for example, would a failure in an unsupported environment constitute a violation of control processes? As part of a risk-management strategy to prevent loss of data (and with a view to passing SOX audits) out-of-support systems, be they virtualised or not, be they operating systems or applications, demand attention.
The vulnerabilities of an out-of-support server OS make migration a top priority for 2010, and Microsoft recommends that its customers plan a migration off Windows 2000 Server before it reaches end of life on 13 July 2010.
With the rapid pace of change in the business landscape and the current trend towards a mobile workforce, the ease of provisioning and increased agility that virtualisation brings – backed by a mature management and security strategy – are likely to figure prominently in the coming decade.
Beyond the initial ‘hard’ savings of server consolidation and the associated reduction in capital expenditure, it is this agility that will allow IT to deliver greater value to the business in the longer term.
How Microsoft can help
To assist in planning this migration, Microsoft has made free tools and technical guidance available on the Windows Server product page and the Windows 2000 Server End of Support Solution Center.
The first step is to plan a risk assessment – either through a partner or the customer’s own resources – to catalogue any Windows 2000 Server instances (using the Microsoft Assessment and Planning Toolkit) and to use the output of that assessment as the basis of a plan to migrate those servers and their workloads to a supported platform.
By Ronan Geraghty, business manager, Server and Tools, Microsoft Ireland