While dynamic computing offers significant benefits, a prudent company will take care to avoid several pitfalls along the way.
1. Drinking Too Much Kool-Aid: All too often, CIOs become enthralled with the promise of dynamic computing and commit too many resources to implementing it where there is no concrete business case. Business needs should drive technology decisions – not the other way around. Dynamic computing takes advantage of economies of scale and requires upfront investments that become a fixed cost for management, monitoring and detection.
If a company is only making a handful of changes to its environment each year, dynamic computing may not be the best approach. For example, there is no point in creating a multi-tier automated provisioning system if the business plans do not show a forecast of multiple changes and new services requiring it. On the other hand, if your environment requires a high level of flexibility and scalability because your technology needs are continually changing, a dynamic computing strategy is worth investigating.
2. Can’t See the Forest for the Trees: Often companies get so bogged down with technology and logistical details that during the time they spend developing a large-scale dynamic computing environment, they fail to see that their solution no longer addresses changing business needs.
Companies should be wary of becoming too granular and prescriptive in their dynamic computing solutions. The purpose of an efficient computing environment is to minimize change and the costs that change incurs.
Although automation allows you to provision and de-provision, or move virtual machines from one host to another, do not erode the cost and time benefits you have achieved by making constant minor tweaks that deliver only marginal additional benefit. If you are constantly adjusting your capacity to follow demand exactly, you will lose sight of the larger issues. Worse, you will divert important resources that should be focused on managing and adapting to change. For example, sometimes the highest level of availability is achieved by simply finding a configuration that works and leaving it alone for as long as possible.
3. Time is of the Essence: Projects as sweeping and significant as dynamic computing can collapse under their own weight, layered with dependencies and with benefits and payback many years out. By the time a monolithic IT project is implemented, the business has often changed so much that the benefits of the original project are no longer relevant.
We saw this in the 1990s, when many customers implemented business processes that grew so large and unwieldy they became unmanageable. Companies need to keep in mind that timeliness is key to successful dynamic computing initiatives, and that solutions can become dated if they take too long to deliver. Companies today need to see results within the first year, so planning the implementation carefully is key.
4. Going it Alone: Implementing a dynamic computing solution impacts many areas of IT. Many companies feel they can go it alone and begin implementation based solely on their own internal assessment. However, talented people with real-world dynamic computing experience are rare, and even a company that does have the internal expertise often lacks the objectivity needed to evaluate and change its own environment. Third-party experts who can provide unbiased guidance can be a crucial factor in planning and implementing innovative dynamic computing solutions.
Gavin Williams is director of Infrastructure and Security Solutions at IT consultancy Avanade.