What goes around comes around. Do you know how old virtualization really is? Although it’s a buzzword today thanks to the rapid growth of cloud computing and vendors like Amazon with its Elastic Compute Cloud (EC2) offering, virtualization is over 30 years old. It was invented as a datacenter capacity planning solution for sharing lumpy (and expensive) mainframe resources between different applications. Then the PC arrived, hardware prices plummeted, and the virtualization problem was no longer relevant – until the need to address reliability, availability and security across multiple systems brought it back.
The promise of virtualization and the need to plan
Virtualization also offers great potential for cutting energy consumption. This is a factor of growing importance in data centers, not only because of the cost it represents, but also because of the ecological impact. Even if you build solar-powered data centers, the fewer solar panels you have to install, the better. However, the move from programs running in native mode on separate physical servers to virtual mode on one or several shared, networked servers needs proper planning, not just for energy consumption but also for total processing power and storage. Planning can be significantly improved by modeling virtualization to predict outcomes or requirements under different scenarios (computing use cases); a simple sketch of such a scenario model follows.
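As a rough illustration of what this kind of scenario model can look like, here is a minimal sketch in Python. Every figure in it (workload size, server capacity, utilization levels, power draw, PUE) is an illustrative assumption rather than data from any particular installation; a real model would use measured values.

```python
# A minimal sketch of a virtualization scenario model. All figures
# (workload size, server capacity, utilization, power draw, PUE) are
# illustrative assumptions, not data from the article.
import math

def servers_needed(workload_units: float, capacity_per_server: float,
                   target_utilization: float) -> int:
    """Servers required to host a workload at a target average utilization."""
    return math.ceil(workload_units / (capacity_per_server * target_utilization))

def annual_energy_kwh(servers: int, avg_power_watts: float, pue: float = 1.6) -> float:
    """Annual energy for a server fleet; PUE adds cooling and facility overhead."""
    hours_per_year = 24 * 365
    return servers * avg_power_watts * pue * hours_per_year / 1000.0

# Scenario 1: native mode -- one lightly used physical server per application.
native = servers_needed(workload_units=200, capacity_per_server=10,
                        target_utilization=0.15)

# Scenario 2: virtualized -- the same workload consolidated at higher utilization.
virtual = servers_needed(workload_units=200, capacity_per_server=10,
                         target_utilization=0.60)

print(f"Native:      {native} servers, {annual_energy_kwh(native, 350):,.0f} kWh/yr")
print(f"Virtualized: {virtual} servers, {annual_energy_kwh(virtual, 400):,.0f} kWh/yr")
```

Even with made-up numbers, the structure of the calculation shows why consolidation matters: raising average utilization from around 15% to 60% cuts the required server count, and with it the energy bill, by roughly a factor of four.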
Not such plain sailing for datacenter capacity planning
Cloud computing is here, but at least one survey says that data centers are lagging badly behind. Planning and modeling seem to have been forgotten in a large number of centers that are already 20 years old. In fact, Gartner says that if your data center is more than seven years old, it’s already obsolete. As a result, data centers and their networks also score low in terms of user confidence, something that realistic models of replacement installations could perhaps address. However, those involved in planning will need to get their models right in the first place: data center planning is rich in opportunities to adopt the wrong models or to make capacity planning mistakes.
Getting the models right
Modeling virtualization for datacenter capacity planning can be done properly with a common-sense approach and by comparison with existing resources for modeling virtual environments. With data centers interlinked via intranets or extranets, information is also available for modeling networking and network virtualization. Models will likely need to be constructed over timelines of, say, five years to take account of amortization periods and the evolution of data-processing needs. Uncertainty in the growth of the number of users and the number of applications will also need to be handled explicitly, for example with ranges or probability distributions rather than single-point forecasts, as in the sketch below.
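Here is a minimal sketch of one way to handle that uncertainty: a Monte Carlo simulation of workload growth over a five-year planning horizon. The starting load and the growth range are illustrative assumptions, not figures from the article; a tool such as Analytica handles this kind of probabilistic modeling natively, but the plain-Python version shows the idea.

```python
# A minimal sketch of handling growth uncertainty over a five-year planning
# horizon with Monte Carlo sampling. The starting load and growth range are
# illustrative assumptions, not figures from the article.
import random

def simulate_demand(start_load: float, years: int = 5, trials: int = 10_000):
    """Return simulated end-of-horizon workload levels, one per trial."""
    outcomes = []
    for _ in range(trials):
        load = start_load
        for _ in range(years):
            # Annual growth drawn from a triangular distribution:
            # worst case 5%, most likely 20%, best case 50%.
            load *= 1 + random.triangular(0.05, 0.50, 0.20)
        outcomes.append(load)
    return outcomes

outcomes = sorted(simulate_demand(start_load=100.0))
p10, p50, p90 = (outcomes[int(len(outcomes) * q)] for q in (0.10, 0.50, 0.90))
print(f"Year-5 workload: P10={p10:.0f}, median={p50:.0f}, P90={p90:.0f}")
```

The spread between the P10 and P90 outcomes is exactly the kind of information that turns a single capacity figure into a planning range that can absorb amortization decisions and demand growth.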
Basic building blocks for visualizing virtualization
While the concept of virtualization is broadly the same everywhere (sharing computing resources without being tied to particular physical hardware), the practical manifestations differ. There are a number of architectures that organizations can use, each with its own impact on computing power and energy consumption. Possibilities range from multiple logical universes all running on one base operating system to users accessing databases distributed across a variety of different machines and IT environments. Virtualization may also cost less in staffing terms if systems automation technologies are used.
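To make the architectural comparison concrete, here is a small sketch contrasting a few such options. The overhead fractions, consolidation ratios and host power draw are hypothetical placeholders chosen only to show the shape of the comparison, not benchmark results.

```python
# A minimal sketch comparing virtualization architectures. Overhead factors,
# consolidation ratios and host power are hypothetical assumptions for
# illustration only.
import math

ARCHITECTURES = {
    # name: (CPU overhead fraction, workloads hosted per physical server)
    "Logical partitions on one base OS": (0.03, 12),
    "Full hypervisor with guest VMs": (0.10, 8),
    "Workloads spread across mixed IT environments": (0.15, 4),
}

WORKLOADS = 96            # assumed number of applications to host
HOST_POWER_WATTS = 400    # assumed average power draw per physical host

for name, (overhead, per_host) in ARCHITECTURES.items():
    hosts = math.ceil(WORKLOADS / per_host)
    print(f"{name}: {hosts} hosts, "
          f"{1 - overhead:.0%} of each host's capacity usable, "
          f"~{hosts * HOST_POWER_WATTS / 1000:.1f} kW IT load")
```

Even a toy comparison like this makes the trade-off visible: architectures that consolidate more workloads per host need fewer machines and less power, but leave less headroom per host once their overhead is accounted for.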
If you’d like to know how Analytica, the modeling software from Lumina, can help you to model computing installations, then try the free edition of Analytica to see what it can do for you.