Friday, July 27, 2012

Cloud Technology Spectrum

Doesn't it make you cringe when people use the term cloud brokerage when they really mean cloud marketplace?  Or when they say they provide virtualization management when they really provide cloud management services?  Many such cloud terms are used interchangeably every day; the challenge is that cloud terminology has yet to reach a steady state.

In this article, we hope to clarify a few cloud technology terms using Level of Integration as the criterion.

Note that each layer builds upon the layer below it, leading to a spectrum of cloud technologies, aka The Cloud Technology Spectrum.  The table below shows some of the key providers in each layer of the spectrum.

There are a number of other providers as well, and some providers in fact span multiple layers of the spectrum.  The key point is that for any technology to claim a place in a specific layer, it should effectively integrate at least one item from each of the layers below it, as the sketch below illustrates.
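Here is a minimal Python sketch of that rule.  The layer ordering is a placeholder borrowed from the terms above, not the actual layers or providers from the table:

```python
# Hypothetical bottom-to-top ordering of layers; placeholder names only,
# not the actual layers or providers from the table above.
LAYERS = ["virtualization management", "cloud management",
          "cloud brokerage", "cloud marketplace"]

def can_claim_layer(integrations, claimed_layer):
    """integrations maps a layer name to the items a technology integrates
    from that layer; a claim is valid only if every layer below the claimed
    one contributes at least one integrated item."""
    below = LAYERS[:LAYERS.index(claimed_layer)]
    return all(len(integrations.get(layer, [])) >= 1 for layer in below)

# A would-be brokerage that integrates one item from each lower layer:
broker = {"virtualization management": ["hypervisor console"],
          "cloud management": ["provisioning API"]}
print(can_claim_layer(broker, "cloud brokerage"))    # True
print(can_claim_layer(broker, "cloud marketplace"))  # False: no brokerage item
```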

When deciding to migrate to the cloud, it is important for consumers to know where in the spectrum they would land if they purchased a given piece of cloud technology, and how much additional effort would be required on their part.  The Cloud Technology Spectrum helps with this step of the process.

Go to Cloud Deployment Tree for a view of the different cloud deployment options...

Monday, April 30, 2012

Secure Environment for Federal Government Cloud Pilot

How is the Federal government hoping to achieve the projected $12 Billion in annual savings?  The projection, quoted by the MeriTalk Cloud Computing Exchange and published today by Forbes.com, does not seem overly optimistic given that the Federal government is already saving approximately $5.5 Billion per year.

These savings have been achieved by individual agencies adopting cloud solutions, but such organic growth will only go so far.  To expand it in a generic and scalable manner, the Federal government would need a secure environment in which to test the cloud and run pilot programs.

A Fire-fort?

Key features of such an environment:


1. Multi-provider provisioning and compliance
Agencies should be able to provision resources across cloud providers without having to worry about vendor lock-in.  This requires a brokerage platform that enables automated provisioning across providers.  Monitoring would also be necessary to ensure that providers maintain SLA compliance; those that fail would be quarantined.
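A minimal sketch of that monitoring-and-quarantine step, assuming invented provider names and a simple uptime SLA:

```python
# Hypothetical SLA check: providers falling below the committed uptime
# are quarantined, i.e. removed from the provisioning pool.
SLA_UPTIME = 0.995  # assumed SLA threshold, for illustration only

observed_uptime = {"provider_a": 0.999, "provider_b": 0.987}  # invented data
active, quarantined = {}, {}

for name, uptime in observed_uptime.items():
    (active if uptime >= SLA_UPTIME else quarantined)[name] = uptime

print("eligible for provisioning:", sorted(active))        # ['provider_a']
print("quarantined pending review:", sorted(quarantined))  # ['provider_b']
```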

2. Fed certified cloud providers
The list of cloud providers should include those that are FedRAMP certified, or at least FISMA compliant.  Agencies should be able to compare providers side by side and pick the best-fit provider.  This requires standardization of cloud offerings and pricing models.
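A side-by-side comparison of that kind might look like the sketch below; the catalog entries, certification flags, and prices are all invented:

```python
# Hypothetical standardized catalog; every value here is invented.
catalog = [
    {"name": "provider_a", "fedramp": True,  "fisma": True,  "usd_per_vm_hr": 0.12},
    {"name": "provider_b", "fedramp": False, "fisma": True,  "usd_per_vm_hr": 0.09},
    {"name": "provider_c", "fedramp": False, "fisma": False, "usd_per_vm_hr": 0.07},
]

# Keep FedRAMP-certified (or at least FISMA-compliant) providers, then rank
# the survivors by a standardized unit price for a side-by-side comparison.
eligible = [p for p in catalog if p["fedramp"] or p["fisma"]]
for p in sorted(eligible, key=lambda p: p["usd_per_vm_hr"]):
    print(p["name"], f"${p['usd_per_vm_hr']}/VM-hr")
```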

3. Integration with existing data centers and private / hybrid clouds
Agencies should be able to interoperate between the cloud and their existing data centers and private clouds.  This provides a backup plan in case the cloud solution does not succeed.  For this feature, the test environment would need to be agnostic across VMware, Xen, Hyper-V, vCloud Director, etc.
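One common way to get that agnosticism is a driver (adapter) layer, so the environment never codes directly against a single hypervisor.  The classes below are illustrative stubs, not real vendor APIs:

```python
# Hypervisor-agnostic facade: each platform-specific driver exposes the same
# operation, so callers never depend on one vendor.  Stubs only; the real
# vSphere / Xen / Hyper-V calls would go inside each method.
class VMwareDriver:
    def start_vm(self, vm_id): print(f"[vSphere stub] powering on {vm_id}")

class XenDriver:
    def start_vm(self, vm_id): print(f"[Xen stub] powering on {vm_id}")

class HyperVDriver:
    def start_vm(self, vm_id): print(f"[Hyper-V stub] powering on {vm_id}")

DRIVERS = {"vmware": VMwareDriver(), "xen": XenDriver(), "hyperv": HyperVDriver()}

def start_vm(platform, vm_id):
    DRIVERS[platform].start_vm(vm_id)  # identical call on every platform

start_vm("vmware", "vm-42")
start_vm("hyperv", "vm-43")
```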

4. Connectivity to existing security frameworks
The test environment should be integrated with the security frameworks currently used by the Federal government.  In this way, valuable resources need not be wasted in re-designing a security framework that is already very efficient.  Instead, resources can be assigned to enhance the existing framework with intrusion detection and intrusion prevention features.

5. Complete cost transparency
First, agencies should not be required to sign multi-year contracts with cloud providers.  Second, the cost of cloud services should be visible at the highest level so that budgets can be allocated based on resource requirements.  This also enables complete auditability.
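As a sketch of that visibility, costs tagged per agency and project can be rolled up to whatever level budgets are set at.  The records and agency names below are invented:

```python
from collections import defaultdict

# Hypothetical usage records tagged by agency and project; figures invented.
records = [
    {"agency": "Agency-A", "project": "pilot-1", "usd": 1200.00},
    {"agency": "Agency-A", "project": "pilot-2", "usd": 300.00},
    {"agency": "Agency-B", "project": "portal",  "usd": 800.00},
]

# Roll costs up to the agency level so budgets can be allocated (and
# audited) against actual resource consumption.
by_agency = defaultdict(float)
for r in records:
    by_agency[r["agency"]] += r["usd"]

for agency, total in sorted(by_agency.items()):
    print(f"{agency}: ${total:,.2f}")
```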

6. Recalibration based on historical data
Cloud usage data should be continuously correlated with cost to ensure that cost is minimized without impacting mission goals.  This requires the test environment to be powered by advanced analytics engines that enable continuous recalibration through command and control.
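In miniature, such a recalibration rule could correlate each VM's utilization history with its recurring cost and flag waste.  The thresholds and data here are assumptions, not a real analytics engine:

```python
# Toy recalibration rule: flag chronically idle (but paid-for) VMs for
# downsizing.  All figures below are invented for illustration.
history = {
    "vm-01": {"avg_cpu": 0.82, "usd_per_month": 150},
    "vm-02": {"avg_cpu": 0.11, "usd_per_month": 150},
}

IDLE_THRESHOLD = 0.20  # assumed: below this, the VM is a downsizing candidate

for vm, stats in history.items():
    if stats["avg_cpu"] < IDLE_THRESHOLD:
        print(f"{vm}: ~${stats['usd_per_month']}/mo at "
              f"{stats['avg_cpu']:.0%} CPU, recommend downsizing")
```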

All of the above features would need to be tested by the Federal government through a pilot program before executing any major cloud migration initiative.  If the pilot succeeds, the test environment can then be established as the official government cloud portal, built on NIST standards and governed through strict monitoring and compliance.

Friday, January 27, 2012

Can Clouds Plug the Ozone Hole? (pun intended…)


Environmental protection has been a major concern over the past few years... and if it hasn't been an issue for us, it probably should be.  In any case, as IT analysts it is important that we know where we fit in and scrutinize our contribution to the environment from an analytical perspective, leaving all subjectivity aside.


For those of us who are not EPA experts, let us say we can help conserve the environment by:
1. Protecting the environment from pollution and habitat degradation
Cloud computing does not do much when it comes to habitat degradation or water pollution, but it does play a part in controlling air pollution.  This is because physical servers are consolidated into more efficient blades and chassis in the cloud.  Consolidation of resources reduces power and cooling requirements, which in turn reduces air pollution (a back-of-envelope sketch of the savings follows this list).  Moreover, cloud data centers can be placed in colder parts of the world to further save on power for cooling.
2. Sustaining the environment by avoiding depletion of natural resources
In the same way that cloud data centers can be placed in cold parts of the world, they can also be placed in remote areas with high wind (to harness wind power) or with more direct sunlight (for solar power).  As a result, alternative sources of energy can be used to power cloud data centers.  Placing cloud data centers far from consumers is feasible because data and compute results are not lost in transit over networks (unlike the power lost when transferring electricity from wind farms on the West Coast to consumers in the rest of the country).
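Here is the back-of-envelope consolidation arithmetic promised in point 1.  Every figure is assumed purely for illustration, and the model crudely assumes a consolidated host draws the same power as a standalone server:

```python
# Back-of-envelope power arithmetic; every number here is an assumption.
physical_servers = 100
watts_per_server = 400     # assumed draw per server (standalone or cloud host)
consolidation_ratio = 10   # assumed: 10 physical workloads per cloud host
cooling_overhead = 0.5     # assumed: cooling adds ~50% on top of IT load

before_kw = physical_servers * watts_per_server * (1 + cooling_overhead) / 1000
after_kw = ((physical_servers / consolidation_ratio)
            * watts_per_server * (1 + cooling_overhead) / 1000)

print(f"before: {before_kw:.0f} kW, after: {after_kw:.0f} kW "
      f"({1 - after_kw / before_kw:.0%} reduction)")
# before: 60 kW, after: 6 kW (90% reduction)
```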


However, there are a number of underlying assumptions that need to be satisfied for cloud to successfully deliver Green-IT...
Assumption 1: Utilization of cloud resources is high and efficient.
Underutilization greatly reduces the consolidation ratio from physical to cloud resources, and the power savings become minimal.  Efficiency in the cloud can be boosted by turning VMs on/off based on demand (i.e. autoscaling) and by load balancing between VMs.
Gravitant's CloudMatrix technology specializes in "optimizing" the cloud for consumers through a SaaS console across multiple providers.
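A toy version of that demand-driven on/off rule; the capacity and demand figures are invented:

```python
# Minimal autoscaling sketch: run only as many VMs as current demand needs.
VM_CAPACITY = 100  # requests/sec one VM can serve (assumed figure)

def vms_needed(demand_rps):
    """Ceiling of demand / capacity, with a floor of one VM to stay online."""
    return max(1, -(-demand_rps // VM_CAPACITY))

for demand in [380, 120, 910]:
    print(f"demand {demand} req/s -> run {vms_needed(demand)} VM(s)")
# demand 380 req/s -> run 4 VM(s); 120 -> 2; 910 -> 10
```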
Assumption 2: Data being collected is summarized and compressed before storage.
Otherwise, the constant collection and storage of data will lead to data obesity, which brings into question "how much duplication there is and more importantly how much integrity does the data have?" (CloudVisions).
EXAR's hifn technology provides data deduplication and data compression services.
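In miniature, deduplication plus compression before storage can be done with nothing but the standard library.  This toy sketch is not EXAR's actual technology:

```python
import hashlib, zlib

# Toy dedup-then-compress pipeline: each unique chunk is stored once,
# compressed; duplicate writes only return a reference.
store = {}  # content digest -> compressed bytes

def ingest(chunk: bytes) -> str:
    digest = hashlib.sha256(chunk).hexdigest()
    if digest not in store:            # duplicate chunks are skipped entirely
        store[digest] = zlib.compress(chunk)
    return digest                      # caller keeps the digest as a reference

refs = [ingest(b"sensor reading 42"),
        ingest(b"sensor reading 42"),  # duplicate: stored only once
        ingest(b"sensor reading 43")]
print(len(refs), "writes,", len(store), "unique chunks stored")  # 3 writes, 2 chunks
```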
Assumption 3: Virtualization and storage caching technology is continuously improving.
Otherwise, ever-increasing processing and data needs will catch up and diminish the relative benefit of the cloud.
Cisco and EMC are constantly improving their virtualization and thin provisioning technology respectively.


Therefore, it is safe to say that cloud computing can deliver Green-IT, provided that the right tools are used and innovation continues unabated.