Friday, September 17, 2010

3 keys to Capacity Planning for Virtualization

Fact:  Everyone’s going virtual to reduce cost and improve utilization. Why not?  Sharing resources and paying for them on demand should be beneficial.  But that just makes the capacity planner's job even more difficult!

3 big challenges in virtualization that do not exist in traditional capacity planning:

1. Capacity on each box is dynamically allocated, so how much of the box's resources did each virtual machine (VM) actually get?

2. Every VM on a box adds hypervisor overhead that reduces performance, so what is the critical number of VMs to configure on each box?

3. Cost models are complex, with options for on-demand, dedicated, and burst capacity, so which option should be chosen?

Solutions:

1. We use two key performance metrics, Transaction Rate and Response Time, that are uniform across all layers and applications.  This tells us how much capacity each application effectively used (see the first sketch after this list).

2. We use a slowdown factor that discounts available resources to account for hypervisor utilization.  As a result, we can derive the optimal number of VMs on each box (second sketch below).

3. Because of solutions 1 & 2, we can accurately forecast capacity requirements and compare them against the different cost models.  If capacity requirements are high but stable, dedicated capacity is cheaper; on-demand is better when requirements fluctuate (third sketch below).
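One plausible way to turn those two metrics into an effective-capacity number is Little's law (N = X * R). The post does not spell out its formula, so the function and figures below are illustrative assumptions, not the authors' actual method.

    # A minimal sketch, assuming per-application samples of Transaction Rate
    # (tx/sec) and Response Time (sec); the metric-to-capacity formula below
    # (Little's law) is an assumption, and the numbers are illustrative.

    def effective_concurrency(tx_rate, resp_time):
        """Little's law: in-flight transactions N = X * R.

        Approximates how much capacity an application effectively holds,
        regardless of which VM or physical box served it.
        """
        return tx_rate * resp_time

    apps = {
        "billing":   {"tx_rate": 120.0, "resp_time": 0.25},  # hypothetical app
        "reporting": {"tx_rate":  15.0, "resp_time": 2.00},  # hypothetical app
    }

    for name, m in apps.items():
        n = effective_concurrency(m["tx_rate"], m["resp_time"])
        print(f"{name}: ~{n:.1f} transactions in flight")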
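The second solution names a slowdown factor but does not define it, so this sketch assumes the simplest version: a fixed per-VM capacity tax on the box. The units and overhead value are made up; a real model would calibrate them from measurements.

    import math

    # Sketch only: hypervisor cost is modeled as a fixed per-VM tax on the
    # box's capacity; real hypervisors scale less linearly than this.
    def max_vms(box_capacity, vm_demand, overhead_per_vm):
        """Largest n with n * (vm_demand + overhead_per_vm) <= box_capacity."""
        return math.floor(box_capacity / (vm_demand + overhead_per_vm))

    # Hypothetical units: a 32-unit box, VMs needing 3 units each, and 0.5
    # units of hypervisor overhead per VM -> floor(32 / 3.5) = 9 VMs.
    print(max_vms(32, 3.0, 0.5))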
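For the third solution, here is a hedged illustration of the dedicated vs. on-demand comparison: dedicated capacity must be provisioned for the peak and paid for continuously, while on-demand pays a higher unit rate only for what is used. The rates and demand series are invented.

    # Invented rates and demand series: on-demand priced at 3x the dedicated
    # unit rate; the series kept short so the arithmetic is easy to follow.
    def dedicated_cost(peak_capacity, unit_rate, hours):
        # Dedicated: provision for the peak and pay for it around the clock.
        return peak_capacity * unit_rate * hours

    def on_demand_cost(hourly_demand, unit_rate):
        # On-demand: pay only for what each hour actually uses.
        return sum(hourly_demand) * unit_rate

    stable = [90, 95, 92, 94] * 6   # high but steady demand
    bursty = [5, 5, 5, 95] * 6      # mostly idle with sharp bursts

    for label, demand in [("stable", stable), ("bursty", bursty)]:
        ded = dedicated_cost(max(demand), 1.0, len(demand))
        ond = on_demand_cost(demand, 3.0)
        print(f"{label}: dedicated={ded:.0f}, on-demand={ond:.0f}")

With these made-up numbers, the stable profile favors dedicated (2280 vs 6678) and the bursty one favors on-demand (1980 vs 2280), matching the rule of thumb above.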

Saturday, September 11, 2010

Decision Support for Public Healthcare Administration in Indiana

Decision support using analytics sounds great!  But where do we begin?  There's so much data being collected and stored and secured to the nth degree, but now what?

The main issue with all the data we are collecting is that it is usually inconsistent.  This is because a number of 'events', on both the demand and the supply ends, distort the picture.  So, is low throughput due to fewer resources or to lower demand?

Therefore, the CIO's decision support group would first need to "cleanse" the data and wrap a structure around it. Then decision support is a matter of applying one of the many analytical tools out there in the right context.  This raises the question: how much time and effort would that take?
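As a hedged sketch of what "cleansing" the data and wrapping a structure around it could look like in practice (every column name, threshold, and number below is an invented stand-in, not the actual FSSA data model):

    import pandas as pd

    # Hypothetical weekly extract: columns and figures are illustrative.
    raw = pd.DataFrame({
        "week":         [1, 2, 3, 4],
        "applications": [500, 520, 150, 510],  # demand side
        "caseworkers":  [40, 40, 15, 40],      # supply side
        "processed":    [480, 500, 140, 495],  # throughput
    })

    # "Cleanse": drop impossible records (more processed than was demanded).
    clean = raw[raw["processed"] <= raw["applications"]].copy()

    # "Wrap a structure around it": tag demand and supply events so a dip in
    # throughput can be attributed to the right cause.
    clean["demand_event"] = clean["applications"] < 0.5 * clean["applications"].median()
    clean["supply_event"] = clean["caseworkers"] < 0.5 * clean["caseworkers"].median()

    # Week 3's low throughput now reads as a demand *and* supply event,
    # not a performance problem.
    print(clean)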

Well, it took Gravitant's professional services group only two months to get the FSSA of Indiana up and running.  They used Gravitant's BusinessMatrix platform with its AdvancedAnalytics modules to provide visibility into throughput and timeliness, followed by decision support for bottleneck identification and optimal resolution options.