Common wisdom has it that virtualization is the best way to improve scalability, continuity, performance, and resource efficiency on everything from desktops to servers to storage. In practice, however, that is easier said than done. Part of the difficulty lies in knowing where to start, how to contain the risk, and where to stay physical.
The typical response to the pressure to cut costs and increase efficiency is to create virtualized environments in order to appear proactive in leveraging the new trend. Such an approach, however, can be more harmful than helpful.
“One of the main problems with virtualization today is that many organizations use server virtualization to reduce server sprawl,” said Jeff Holland, a technical architect for Systems Alliance. A haphazard approach, without the benefit of a strategic virtualization plan, merely leads to more sprawl, this time in virtual machines (VMs).
“When there is no procedure for deploying these virtual machines, IT deploys a new VM to support a lightweight application that could have been easily deployed on an existing server,” explains Holland. VM sprawl also stems from “some applications and services that do not work nicely with others on the same machine.”
So where are the best places to begin your company’s migration to virtualization? Here are the top five places, in no particular order, along with the corresponding experiences and advice of IT leaders who have been there and done that:
New application “virtualize first” approach – “Every new application should be virtualized unless there is a good reason not to do it,” advises Renata Budko, co-founder of HyTrust. “Another area that fits virtualization perfectly is ‘cookie cutter’ infrastructure pods, such as retail branch environments. Once such environments are virtualized and consolidated, it’s easy to replicate them into similar installations elsewhere.”
Test and development – These environments “are ideal for sandboxes since they can be set up quickly and managed outside of production IT systems,” explains Swastik Lahiri, lead principal at Technisource, and they avoid the long procurement lead times that often come with physical hardware. The same reasons that make virtualization a good fit for test and development, he said, also make it “attractive for proof of concepts and rapid prototyping.”
QA and engineering departments – “We manage about 50 virtual servers for our QA environment alone, and are able to ramp up all the servers required for a new project and recycle the servers from old projects in a matter of minutes rather than days,” said Jenson Crawford, director of Engineering at Fetch Technologies. IT’s goal, he says, is to virtualize 100% of the company’s servers. “This has paid off fabulously, as the QA and engineering staff are able to deploy and manage servers as needed, reducing our time to deliver software.”
Low risk services – Move the easy stuff — Web servers, print servers, file servers, single-system applications, etc. — first. “Co-locating these environments on virtual machines delivers quick wins in business continuity, agility, resource efficiency, and of course cost savings — both cap-ex and op-ex,” explains Andi Mann, vice president of Product Marketing at CA Technologies Virtualization and Service Automation Business Unit. Moving low-risk services such as HR systems — file servers and Intranet applications, for example, but not payroll or e-mail — onto virtual machines is “a great next step into production virtualization.”
Systems with low predicted requirements – Systems with low, predictable requirements are obvious candidates for virtualization, said Dave Sobel, CEO of Evolve Technologies and Microsoft’s 2010 MVP for virtualization. Here is his partial checklist to get you started:
- Systems with minimal processor utilization
- Systems with minimal RAM requirements
- Systems that do not require large quantities of drive storage
- Redundant or warm-spare servers
- Occasional- or limited-use servers
- Systems where many partially trusted people need console access
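A checklist like Sobel’s can be turned into a simple screening pass over a server inventory. The sketch below is only an illustration: the utilization and capacity thresholds are assumed values, not figures from Sobel or the article, and any real cutoffs would come from your own capacity data.

```python
# Illustrative screening of virtualization candidates based on a
# Sobel-style checklist. All thresholds here are assumptions.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    cpu_util_pct: float        # average processor utilization
    ram_gb: int                # RAM requirement
    storage_gb: int            # provisioned drive storage
    warm_spare: bool = False   # redundant / warm-spare role
    occasional_use: bool = False

def virtualization_candidate(s: Server) -> bool:
    # Hypothetical cutoffs: low CPU (< 15%), modest RAM (<= 8 GB),
    # modest storage (<= 500 GB). Warm spares and occasional-use
    # servers qualify regardless of footprint.
    low_footprint = (s.cpu_util_pct < 15
                     and s.ram_gb <= 8
                     and s.storage_gb <= 500)
    return low_footprint or s.warm_spare or s.occasional_use

fleet = [
    Server("print-01", cpu_util_pct=3, ram_gb=2, storage_gb=80),
    Server("db-prod", cpu_util_pct=70, ram_gb=64, storage_gb=4000),
    Server("dr-spare", cpu_util_pct=1, ram_gb=32, storage_gb=2000,
           warm_spare=True),
]

candidates = [s.name for s in fleet if virtualization_candidate(s)]
print(candidates)  # the print server and the warm spare qualify
```

In this hypothetical fleet, the lightly loaded print server and the warm spare are flagged, while the heavily utilized database server stays physical, which mirrors the spirit of the checklist above.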
While these top five uses of virtualization will help you move forward in relatively pain-free fashion, there are areas likely to incite nightmares should you tread there unprepared.
“The worst targets are the least standardized, since they are subject to potential architectural changes that require a lot of re-work after each modification,” warns Kelly Herrell, CEO of Vyatta. “Re-worked designs are one thing; re-worked virtual designs compound the challenge.”
A prolific and versatile writer, Pam Baker has published numerous articles in leading publications including, but not limited to: Institutional Investor magazine, CIO.com, NetworkWorld, ComputerWorld, IT World, Linux World, Internet News, E-Commerce Times, LinuxInsider, CIO Today Magazine, NPTech News (nonprofits), MedTech Journal, I Six Sigma magazine, Computer Sweden, NY Times, and Knight-Ridder/McClatchy newspapers. She has also authored eight books and several analytical studies on technology, and wrote and produced an award-winning documentary on paper-making.