
Data center capacity planning & modeling

What goes around comes around. Do you know how old virtualization really is? Although it’s a buzzword today thanks to the rapid growth of cloud computing and vendors like Amazon with its Elastic Compute Cloud (EC2) offering, virtualization is over 30 years old. It was invented as a data center capacity planning solution to share lumpy (and expensive) mainframe resources between different applications. Then the PC arrived, hardware prices plummeted, and the virtualization problem was no longer relevant – until the need to address reliability, availability and security across multiple systems brought it back again.

A simple model of how virtualization works

Image source: vmware.com

The promise of virtualization and the need to plan

Virtualization also offers great potential for cutting energy consumption. This is a factor of growing importance in data centers, not only because of the cost it represents, but also because of the ecological impact. Even if you build solar-powered data centers, the fewer solar panels you have to install, the better. However, the move from computer programs running in native mode on separate physical servers to virtual mode sharing one or several networked servers needs proper planning, not just for energy consumption but also for the total amount of processing power and storage. Planning can be significantly improved by modeling virtualization to predict outcomes or requirements under different scenarios (computing use cases).
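To make that concrete, here is a minimal back-of-the-envelope sketch, written in Python rather than Analytica, with made-up workload counts, utilization levels and wattage figures. It compares server count and annual energy for native versus virtualized hosting under an assumed consolidation ratio:

```python
# Illustrative sketch only (not Lumina's model): estimate server count and
# energy before and after virtualization for an assumed consolidation ratio.
# All input figures below are made-up examples, not vendor data.

workloads = 120                     # applications to host
avg_utilization_native = 0.10       # typical utilization of a dedicated server
target_utilization_virtual = 0.60   # planned utilization per virtualized host
watts_per_server = 400              # assumed average draw per physical server
hours_per_year = 24 * 365

# Native mode: one physical server per workload.
native_servers = workloads
native_kwh = native_servers * watts_per_server * hours_per_year / 1000

# Virtual mode: pack workloads until hosts reach the target utilization.
consolidation_ratio = target_utilization_virtual / avg_utilization_native
virtual_servers = -(-workloads // int(consolidation_ratio))  # ceiling division
virtual_kwh = virtual_servers * watts_per_server * hours_per_year / 1000

print(f"Physical servers: {native_servers} -> {virtual_servers}")
print(f"Annual energy: {native_kwh:,.0f} kWh -> {virtual_kwh:,.0f} kWh")
```

Even this crude calculation makes the trade-off visible; a real capacity model adds storage, network and workload-growth dimensions on top of it.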

When modeling virtualization goes awry

Image source: technet.com

Not such plain sailing for datacenter capacity planning

Cloud computing is here, but at least one survey says that data centers are lagging badly behind. Planning and modeling seem to have been forgotten in a large number of centers that are already 20 years old. In fact, the Gartner Group says that if your data center is more than seven years old, it’s already obsolete. As a result, data centers and their networks also score low in terms of user confidence, something that realistic models of replacement installations could perhaps address. However, those involved in planning will need to get their models right in the first place. Data center planning is dangerously rich in opportunities to adopt the wrong models or to make datacenter capacity planning mistakes.

Getting the models right

Modeling virtualization for datacenter capacity planning can be done properly with a common-sense approach and by comparing your work against existing resources for modeling virtual environments. With data centers interlinked via intranets or extranets, information is also available for networking and network virtualization modeling. Models will likely need to be constructed over timelines of, say, five years to take account of amortization periods and the evolution of data-processing needs. Uncertainty in the growth of the number of users and the quantity of applications will need to be handled appropriately too.
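As a rough illustration of handling that uncertainty, the sketch below runs a simple Monte Carlo simulation of user growth over a five-year timeline. The growth distribution and per-host capacity are assumed values chosen only for the example; an Analytica model would typically express the same idea with built-in probability distributions rather than explicit loops.

```python
# A minimal sketch of propagating growth uncertainty over a five-year timeline.
# Growth rate, initial users and per-host capacity are assumptions for
# illustration, not recommendations.

import random

years = 5
trials = 10_000
initial_users = 5_000
capacity_per_host = 800   # assumed concurrent users one virtualized host can serve

hosts_needed_final_year = []
for _ in range(trials):
    users = initial_users
    for _ in range(years):
        growth = random.normalvariate(0.20, 0.08)  # uncertain annual growth rate
        users *= 1 + growth
    hosts_needed_final_year.append(users / capacity_per_host)

hosts_needed_final_year.sort()
median = hosts_needed_final_year[trials // 2]
p90 = hosts_needed_final_year[int(trials * 0.9)]
print(f"Year-5 hosts needed: median {median:.0f}, 90th percentile {p90:.0f}")
```

Planning to the 90th percentile rather than the median is one way a model can turn uncertainty into a concrete provisioning decision.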

A more complex model of virtualization

Image source: commons.wikimedia.org

Basic building blocks for visualizing virtualization

While the concept of virtualization is generally the same (share computing resources without physical constraints), the practical manifestations differ. There are a number of architectures that organizations can use, each one having an impact on computing power and energy consumption. Possibilities range from multiple logical universes all running on one base operating system to users accessing databases distributed across a variety of different machines and IT environments. Virtualization may also cost less in terms of staffing if systems automation technologies are used.
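As a purely hypothetical illustration, the sketch below tabulates a few architecture options with placeholder overhead and staffing figures, to show how such choices could be compared side by side inside a capacity model. The category names and numbers are assumptions, not benchmark results.

```python
# Hypothetical comparison of virtualization architectures; overhead and
# staffing figures are placeholders chosen for illustration only.

architectures = {
    "OS-level containers on one base OS":      {"cpu_overhead": 0.03, "admin_hours": 2},
    "Hypervisor with full virtual machines":   {"cpu_overhead": 0.10, "admin_hours": 4},
    "Databases distributed across mixed environments": {"cpu_overhead": 0.15, "admin_hours": 6},
}

raw_capacity = 1000  # arbitrary units of compute per host

for name, arch in architectures.items():
    usable = raw_capacity * (1 - arch["cpu_overhead"])
    print(f"{name}: usable capacity {usable:.0f}, "
          f"admin effort {arch['admin_hours']} h/host/month")
```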

If you’d like to know how Analytica, the modeling software from Lumina, can help you to model computing installations, then try the free edition of Analytica to see what it can do for you.


See also

Building electrification: heat pump technology

Lumina set out to build a useful tool to assess the benefits of heat pumps. Learn more about heat pumps and their impact.

More…

Decision making when there is little historic precedent

Learn how to make decisions and strategic plans in uncertain situations, where historical data is not available. See how to model this in Analytica with clarity and insight.

More…

Does GPT-4 pass the Turing test?

UCSD researchers conducted an online Turing test of GPT-4 with 652 human participants. Humans were not fooled ~60% of the time.

More…

What is Analytica software?

Analytica is a decision analysis tool that helps you generate clearer and more justified results through modeling.

More…

Download the free edition of Analytica

The free version of Analytica lets you create and edit models with up to 101 variables, which is pretty substantial since each variable can be a multidimensional array. It also lets you run larger models in ‘browse mode.’ Learn more about the free edition.

While Analytica doesn’t run on macOS, it does work through Windows under Parallels or VMware.
