Server withdrawal

CIOs and technology providers share their experiences of recent virtualisation projects.

With the onward march of virtualisation into new sectors, server virtualisation remains top of IT managers’ agendas, with Gartner reporting such projects can reduce total cost of ownership by 25 per cent. These were considerations for CIOs Quinton Hall of Tourism Holdings Ltd (THL) and Aubrey Christmas of the Employers and Manufacturers Association (EMA) Northern.

For both of them, virtualisation formed a major part of their recent IT system upgrades.

“The purpose was to reduce the number of servers and provide faster, more cost-effective environments, provide a more cost-effective DR solution, as well as improving the ease and management time of the environment,” says Hall.

The work, carried out by systems integrator Gen-i, also aimed to boost the performance of IT systems so online travel products could be brought to market faster and improve the integration of the various companies within THL.

Work on the project began in August 2007 and was completed in June of this year. Around 80 of THL’s 120 servers were virtualised, once Gen-i and THL decided which ones were most suited to virtualisation. New hardware and software were bought to run the system, including an IBM SAN, IBM server hardware such as blades, and VMware.

The project went smoothly, Hall says, but there were challenges around ensuring the ‘virtual instances’ were built according to the application requirements. A lot of configuration was required in some instances.

Some configuration issues remain and, with hindsight, he would have preferred more time in the testing and specification process.

IT bosses, Hall advises, need to define the project well, complete a comprehensive evaluation of the environment and test well. “Consider the product roadmap and evaluate against your needs,” he continues.

The company plans to rationalise further, consolidating server roles and monitoring and improving performance, with Hall adding there is potential for virtual desktop delivery and virtualised applications.

Aubrey Christmas of the EMA reports a similar story with his organisation’s IT upgrade. Christmas says virtualisation was a byproduct of his main project of upgrading IT systems, which sought a scalable, versatile and stable IT infrastructure.

“The business case was based on a blend of cost savings, DR enablement, power usage and reduction in the amount of technology needed to support future IT needs,” he says.

The EMA consolidated 14 physical servers into three, creating more room for itself while retaining the ability to add more servers when needed.

Dell supplied the hardware, Microsoft the software and VMware the virtualisation software.

Work began last October, with support from Dell, which also helped train EMA IT staff to manage the project and handle any configuration or other issues. “The project was implemented in stages with the first stage being the complete virtualisation of the existing environment, [migrating from physical servers to virtual], which was completed in October last year. Initially, there were a few performance issues with regards to disk interoperability, but Dell has been a tremendous help with providing better storage solutions to overcome the issue,” he says.

“The second part of the project was completed in mid-July this year, with a new set of Dell servers hosting new virtual servers for a new ‘greenfield’ environment, which we will be moving into. This completed the overall project; to have a completely virtualised environment spanning multiple sites.”

Now, the EMA plans to use virtualisation to consolidate systems and storage with partner organisations across New Zealand.

Christmas says working with the right partner is essential.

“Don’t be afraid to ask questions. Make sure you feel comfortable with your partners. The technology does work, but like all IT projects, know what you are getting into. Planning the virtual environment beforehand is also crucial to ensuring that possible bottlenecks, such as disk interoperability contention and RAM usage, do not cause grief to the users later on.

“The three Cs: communication before, communication during, and communication after are important to reducing and mitigating risk and ensuring a successful project,” says Christmas.
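Christmas’s planning advice reduces to arithmetic that can be checked before any virtual machine is built. The sketch below is a hypothetical Python example; the host sizes, VM names and headroom figure are invented for illustration and are not drawn from the EMA project.

    # Hypothetical capacity check for planning a virtual environment.
    # All figures are illustrative, not from the EMA or THL projects.

    HOST_RAM_GB = 64          # physical RAM on the candidate host
    HOST_DISK_IOPS = 5000     # rough I/O budget for the host's storage
    HEADROOM = 0.8            # leave 20 per cent spare for spikes

    # Planned virtual machines: (name, RAM in GB, estimated disk IOPS)
    planned_vms = [
        ("mail", 8, 900),
        ("file", 4, 1200),
        ("crm", 16, 1500),
        ("web", 4, 600),
    ]

    total_ram = sum(ram for _, ram, _ in planned_vms)
    total_iops = sum(iops for _, _, iops in planned_vms)

    if total_ram > HOST_RAM_GB * HEADROOM:
        print(f"RAM bottleneck: {total_ram} GB planned vs "
              f"{HOST_RAM_GB * HEADROOM:.0f} GB usable")
    if total_iops > HOST_DISK_IOPS * HEADROOM:
        print(f"Disk bottleneck: {total_iops} IOPS planned vs "
              f"{HOST_DISK_IOPS * HEADROOM:.0f} IOPS usable")

With these invented numbers the RAM budget fits comfortably but the disk budget does not, which is exactly the kind of contention Christmas says should be caught at planning time rather than by users later on.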

His observations are echoed in the vendor community active in the virtualisation space.

Unisys Asia-Pacific VP of systems and technology, Geoffrey Dickson, says virtualisation originally aimed to reduce server footprint and energy costs, though now the real benefit is to standardise environments and make processes more consistent and suitable for automation.

“Automating processes like capacity management and patch updates frees up the IT team’s time, so they can focus on those decisions that must be made by humans. By automating certain processes, the concept of a real-time infrastructure — one which can readily respond to the changing demands of the business — comes closer to reality,” says Dickson.

“In 2009 organisations will have the tools to implement a truly dynamic, real-time infrastructure based on the foundations of a virtualised environment. In particular, automation tools that allow IT managers to set rules around standard processes, such as capacity management and patch updates, are now readily available. But the key step in the evolution of virtualisation is not a new technology or tool; rather, it is a fundamental change to the way that IT is used to support business goals and operations, not just cut costs,” says Dickson.
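The rules Dickson describes amount to threshold logic that a monitoring loop evaluates. A minimal Python sketch follows; the thresholds, host names and utilisation figures are assumptions made for illustration and do not represent Unisys tooling.

    # Minimal rule engine for capacity management, in the spirit of
    # Dickson's "rules around standard processes". Thresholds and
    # host data are invented for illustration.

    def check_capacity(host):
        """Return the action a rule dictates for one host, or None."""
        if host["cpu_pct"] > 85:
            return f"{host['name']}: migrate a VM off this host"
        if host["ram_pct"] > 90:
            return f"{host['name']}: add RAM or rebalance VMs"
        return None

    hosts = [
        {"name": "blade-01", "cpu_pct": 92, "ram_pct": 70},
        {"name": "blade-02", "cpu_pct": 40, "ram_pct": 95},
        {"name": "blade-03", "cpu_pct": 55, "ram_pct": 60},
    ]

    # In a real-time infrastructure this loop would feed an
    # orchestration tool; here it just prints the decisions.
    for host in hosts:
        action = check_capacity(host)
        if action:
            print(action)

The point is not the loop itself but that the decisions are encoded as rules, so routine capacity moves no longer need a human in the loop.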

David Spratt, Gen-i manager for technology strategy and capability development, agrees. He says virtualisation is moving from servers to applications and networks, with the major vendors all offering products that deliver better functionality at lower prices. Open source alternatives also exist.

“If you can virtualise servers, storage and applications, why own any of them? Why not go to the service provider and buy as a utility?” he asks.

US firms are now buying virtual servers from their data centre providers and never see the hardware. It is computing delivered as a service, working much like the storage individuals can obtain from Google.

Spratt says Gen-i will probably offer such services one day, but it leads to issues of “who owns your data, who is responsible for security and what happens if your storage vendor goes under?”

In the meantime, however, organisations must deal with licensing issues from creating all those extra servers. But Microsoft and other vendors are working on new licensing models to prevent ‘bill shock’ in a virtualised environment, with Gen-i also having licensing experts to address such issues.

Symantec, on the other hand, raises issues of security and business continuity for organisations looking into virtualisation. “Symantec’s strategy is to leverage virtualisation technology to secure, manage and protect the information that matters. Symantec helps customers use virtualisation to separate out valuable information and manage it easily, protect it completely and control it automatically,” says Paul Lancaster, director, systems engineering ANZ for Symantec.

While data centre virtualisation is about virtualising hardware (server and storage optimisation, network management), high availability is still a requirement in many data centres today.

He says Symantec helps this happen by providing a single application to secure, manage and provide business continuity — regardless of architecture — making it easier for customers to select and change hardware and platform vendors.

Symantec is working with virtualisation providers on solutions based around clustering technologies, storage, security and recovery solutions, similar to existing products serving physical environments, Lancaster adds.

Now that data centres sit on both sides of the user-supplier fence, how do they manage?

VMware says Revera is its largest deployment in New Zealand, though the centre also uses Microsoft Hyper-V for small to mid-tier hosting customers.

Roger Cockayne, managing director of Revera, says IT managers need to consider what might happen should there be a power outage, especially during the 12-second gap between the power going out and the UPS kicking in. IT managers and utility providers must also look at redundancy, which for virtualised environments means clustering.

“You must have a hot spare, so if one physical server fails all the virtual machines running there will automatically restart on another physical server. Virtual servers must run in a blade chassis connected to redundant switches for both storage and networks, so if any one component fails, server availability remains stable,” he says.
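The failover rule Cockayne describes can be modelled in a few lines. The Python sketch below is a toy illustration of the hot-spare idea, with invented host and VM sizes; it is not how VMware HA or any Revera system is actually implemented.

    # Toy model of the hot-spare rule Cockayne describes: when a
    # physical server fails, its virtual machines restart on another
    # host with spare RAM. Purely illustrative; real HA products
    # also weigh CPU, storage paths and admission control.

    hosts = {
        "blade-a": {"ram_free_gb": 8,  "vms": ["mail", "crm"]},
        "blade-b": {"ram_free_gb": 32, "vms": ["web"]},
        "blade-c": {"ram_free_gb": 16, "vms": []},
    }
    vm_ram_gb = {"mail": 8, "crm": 16, "web": 4}

    def fail_over(failed):
        """Restart each VM from the failed host on a host with room."""
        orphans = hosts.pop(failed)["vms"]
        for vm in orphans:
            need = vm_ram_gb[vm]
            target = next((h for h, spec in hosts.items()
                           if spec["ram_free_gb"] >= need), None)
            if target is None:
                print(f"{vm}: no host has {need} GB free, stays down")
                continue
            hosts[target]["vms"].append(vm)
            hosts[target]["ram_free_gb"] -= need
            print(f"{vm} restarted on {target}")

    fail_over("blade-a")  # mail and crm both restart on blade-b

The sketch also shows why clusters need genuine spare capacity: if no surviving host has room, a virtual machine simply stays down.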

However, Cockayne warns the biggest issue associated with virtualisation and consolidated computing is cooling.

Most cooling systems have limitations, and organisations need to consider reasonably commoditised solutions that can withstand a range of operating conditions.

Revera recently devised what it calls ‘eco-pods’ for cooling, to better manage the massive volumes of superheated air generated by blade servers at its Albany and Wellington hosting platforms.

Virtualisation, he continues, is central to his business model of offering utility infrastructure to users that lack the scale, budget or expertise to buy rich hardware and get value out of it. But offering such utility services also presents challenges for Revera and other data centres.

“You are now running multiple customers in one environment, and that’s a completely different ballgame to running the infrastructure for just one organisation. There are a lot of security concerns and it really changes, and in some cases limits, design choices. In many respects virtualisation is a gold rush, with IT service companies reinventing themselves as providers of virtualised services.”
