Monday, January 28, 2013

Who Controls the Cloud Market – Providers or Consumers?

We first went from reserving cloud capacity to securing capacity on-demand, and then we even started to bid for unused capacity in the spot market – all in an effort to decrease cost in the cloud.  Can we take this one step further?  Instead of us bidding for capacity, wouldn’t it be interesting if we could get providers to bid for our demand?

Retail Supply Chain Market Analogy

In fact, this is a common phenomenon in the retail supply chain industry.  For example, Walmart has a large amount of freight that needs to be shipped between different cities over the course of the year.  So, every year an auction is conducted in which Walmart lists all its shipments, and carriers such as JB Hunt, Schneider, and Yellow bid for the opportunity to carry these shipments using their fleets of trucks.  Carriers bid for retailer demand because, in general, capacity exceeds demand in the retail industry.

Cloud Computing Market

Keeping this in mind, let us now take a look at the Cloud Computing Market.  Does capacity exceed demand or is it the other way around?  A quick way to find out is by observing spot prices in the cloud market.  In today’s market, Amazon’s Spot Instances are 86% cheaper than their on-demand instances, and Enomaly’s SpotCloud also shows lower spot prices across the board.  This leads us to believe that capacity exceeds demand in the cloud market as well.  A related indicator is the predominance of data center consolidation initiatives in both the commercial and government marketplaces.
Since capacity exceeds demand, consumers have the upper hand and are in control of the cloud market at the moment.  Moreover, they should be able to replicate what is being done in the retail supply chain industry.  In other words, cloud consumers should be able to auction off their demand to the best-fit, lowest-price cloud provider.
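The primary-market idea can be illustrated with a toy reverse auction.  All provider names, prices, and requirements below are hypothetical, not real market data or any actual brokerage API:

```python
# Hypothetical reverse auction: providers bid to serve a consumer's demand,
# and the consumer awards the workload to the lowest qualifying bid.

def run_reverse_auction(demand, bids):
    """demand: dict of requirements; bids: list of (provider, price, meets_reqs)."""
    qualifying = [(provider, price) for provider, price, meets_reqs in bids if meets_reqs]
    if not qualifying:
        return None  # no provider can satisfy the demand
    return min(qualifying, key=lambda bid: bid[1])  # award to lowest price

demand = {"vcpus": 64, "storage_tb": 10, "region": "us-east"}
bids = [
    ("ProviderA", 0.42, True),   # $/vCPU-hour, meets requirements
    ("ProviderB", 0.35, True),
    ("ProviderC", 0.30, False),  # cheapest, but fails a requirement
]
winner = run_reverse_auction(demand, bids)
print(winner)  # ('ProviderB', 0.35)
```

Note that the cheapest bid does not automatically win: a bid must first qualify against the consumer's requirements, which is what makes this "best-fit, lowest-price" rather than just lowest-price.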

So, …

Consumers should seize the opportunity and control the market while the odds are in their favor, i.e., demand < capacity.  At the same time, Service Integrators and Value Added Resellers can help Enterprise IT consumers in this process by conducting Primary-Market auctions using Cloud Service Brokerage technology.

Friday, July 27, 2012

Cloud Technology Spectrum

Doesn't it make you cringe when people use the term cloud brokerage when they really mean cloud marketplace?  Or, when they say they provide virtualization management whereas they really provide cloud management services?  A number of such cloud terms are used interchangeably every day, but the challenge is that cloud terminology has yet to reach steady state.

In this article, we hope to clarify a few cloud technology terms using Level of Integration as the criterion.

Note that each layer builds upon the layer below it, thus leading to a spectrum of cloud technologies - aka - The Cloud Technology Spectrum.  The table below shows some of the key providers in each layer of the spectrum.

There are a number of other providers as well, and some in fact span multiple layers of the spectrum.  But the point here is that for any technology to claim a specific layer, it should effectively integrate at least one item from each of the layers below.

When deciding to migrate to the cloud, it is important for consumers to know where in the spectrum they would end up if they purchased some piece of cloud technology, and how much additional effort would be required on their part.  The Cloud Technology Spectrum helps in this step of the process.

Go to Cloud Deployment Tree for a view of the different cloud deployment options...

Monday, April 30, 2012

Secure Environment for Federal Government Cloud Pilot

How is the Federal government hoping to achieve the $12 Billion in projected annual savings?  This projection, quoted by the MeriTalk Cloud Computing Exchange and published today, doesn't seem too optimistic given that the Federal government is already saving approximately $5.5 Billion per year.

These savings have been achieved by individual agencies adopting cloud solutions, but such organic growth will only go so far.  In order to expand this in a generic and scalable manner, the Federal government would need a secure environment to test the cloud and run pilot programs.

A Fire-fort?

Key features of such an environment:

1. Multi-provider provisioning and compliance
Agencies should be able to provision resources across cloud providers without having to worry about vendor lock-in.  This would require the use of a brokerage platform that enables auto provisioning across providers.  Monitoring would also be necessary to ensure the providers maintain SLA compliance, failing which they would be quarantined.

2. Fed certified cloud providers
The list of cloud providers should include those that are FedRAMP certified, or at least FISMA compliant.  Agencies should be able to compare providers side by side and pick the best-fit provider.  This requires standardization of cloud offerings and pricing models.

3. Integration with existing data centers and private/hybrid clouds
Agencies should be able to interoperate between the cloud and their existing data centers and private clouds.  This provides a backup plan in case the cloud solution does not succeed.  For this feature, the test environment would need to be agnostic across VMware, Xen, Hyper-V, vCloud Director, etc.

4. Connectivity to existing security frameworks
The test environment should be integrated with the security frameworks currently used by the Federal government.  In this way, valuable resources need not be wasted in re-designing a security framework that is already very efficient.  Instead, resources can be assigned to enhance the existing framework with intrusion detection and intrusion prevention features.

5. Complete cost transparency
First of all, agencies should not be required to sign multi-year contracts with cloud providers.  Secondly, the cost of cloud services should be visible at the highest level so that budgets may be allocated based on resource requirement.  This allows complete auditability as well.

6. Recalibration based on historical data
Cloud usage data should be constantly correlated with cost to ensure that cost is minimized without impacting mission goals.  This requires the test environment to be powered by advanced analytics engines for continuous recalibration through command and control.
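Features 1 and 2 above boil down to a filter-and-rank step: quarantine providers that breach their SLA, keep only certified ones, and pick the best-fit remainder.  A minimal sketch, using hypothetical provider data rather than any real brokerage platform:

```python
# Sketch of features 1-2: drop providers that breach their SLA (quarantine),
# keep only Fed-certified providers, then pick the cheapest of the rest.

def pick_provider(providers, min_uptime=99.9):
    eligible = [
        p for p in providers
        if p["certified"] and p["measured_uptime"] >= min_uptime  # quarantine SLA breaches
    ]
    return min(eligible, key=lambda p: p["price"], default=None)

providers = [
    {"name": "ProvA", "certified": True,  "measured_uptime": 99.95, "price": 0.12},
    {"name": "ProvB", "certified": True,  "measured_uptime": 99.5,  "price": 0.08},  # quarantined
    {"name": "ProvC", "certified": False, "measured_uptime": 99.99, "price": 0.07},  # not certified
]
best = pick_provider(providers)
print(best["name"])  # ProvA
```

In a real brokerage the "certified" flag would come from FedRAMP/FISMA status and the uptime figure from continuous monitoring, but the selection logic stays the same shape.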

All the above features would need to be tested by the Federal government through a pilot program before any major cloud migration initiatives are executed.  If successful, the test environment could then be established as the official government cloud portal, one well positioned to succeed because it is built on NIST standards and governed through strict monitoring and compliance.

Friday, January 27, 2012

Can Clouds Plug the Ozone Hole? (pun intended…)

Environmental protection has been a major concern over the past few years... and if it hasn't been an issue for us, it probably should be.  In any case, as IT analysts it is important for us to know where we fit in and to scrutinize our contribution to the environment from an analytical perspective, leaving all subjectivity aside.

For those of us who are not EPA experts, let us say we can help conserve the environment by:
1. Protecting the environment from pollution and habitat degradation
Cloud computing does not do much when it comes to habitat degradation or water pollution, but it does play a part in controlling air pollution.  This is because physical servers are consolidated onto more efficient blades and chassis in the cloud.  Consolidation of resources results in lower power and cooling requirements, which in turn reduces air pollution.  Moreover, cloud data centers can be placed in colder parts of the world to further save on power for cooling.
2. Sustaining the environment by avoiding depletion of natural resources
In the same way that cloud data centers can be placed in cold parts of the world, they can also be placed in remote areas with high wind (to harness wind power) or with more direct sunlight (for solar power).  As a result, alternative sources of energy can be used to power cloud data centers.  Placing cloud data centers away from consumers is feasible because data and compute results can be transmitted over long-distance networks with negligible loss (unlike the power lost when transferring electricity from wind farms on the West Coast to consumers in the rest of the country).

However, there are a number of underlying assumptions that need to be satisfied for cloud to successfully deliver Green-IT...
Assumption 1: Utilization of cloud resources is high and efficient.
Underutilization greatly reduces the consolidation ratio from physical to cloud resources, and the power savings become minimal.  Efficiency in the cloud can be boosted by turning VMs on/off based on demand (i.e., autoscaling) and by load balancing between VMs.
Gravitant's CloudMatrix technology specializes in "optimizing" the cloud for consumers through a SaaS console across multiple providers.
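The autoscaling behavior in Assumption 1 can be sketched as a simple threshold rule.  The thresholds and VM bounds below are illustrative, not taken from any particular product:

```python
# Toy threshold autoscaler: add a VM when average load runs hot, remove one
# when capacity sits idle, always staying within fixed bounds.

def autoscale(vm_count, avg_cpu, low=0.30, high=0.70, min_vms=1, max_vms=20):
    if avg_cpu > high and vm_count < max_vms:
        return vm_count + 1   # scale out under heavy load
    if avg_cpu < low and vm_count > min_vms:
        return vm_count - 1   # scale in when capacity is idle
    return vm_count

print(autoscale(4, 0.85))  # 5
print(autoscale(4, 0.10))  # 3
print(autoscale(1, 0.10))  # 1  (respects the floor)
```

Keeping idle VMs off is precisely what preserves the consolidation ratio, and with it the power savings, that the Green-IT argument depends on.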
Assumption 2: Data being collected is summarized and compressed before storage.
Otherwise, the constant collection and storage of data will lead to data obesity which brings into question "how much duplication there is and more importantly how much integrity does the data have?" (CloudVisions).
EXAR's hifn technology provides data deduplication and data compression services.
Assumption 3: Virtualization and storage caching technology is continuously improving.
Otherwise, the ever increasing processing and data needs will catch up and diminish the relative benefit of the cloud.
Cisco and EMC are constantly improving their virtualization and thin provisioning technology respectively.
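The deduplicate-then-compress pipeline of Assumption 2 can be sketched with Python's standard library (this is a minimal illustration, not EXAR's actual hardware-accelerated technology):

```python
import hashlib
import zlib

# Sketch of Assumption 2: drop duplicate blocks by content hash, then
# compress the unique blocks before storage.

def dedupe_and_compress(blocks):
    store = {}  # content hash -> compressed bytes
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:            # skip blocks we have already stored
            store[digest] = zlib.compress(block)
    return store

blocks = [b"log line A" * 100, b"log line B" * 100, b"log line A" * 100]  # one duplicate
store = dedupe_and_compress(blocks)
raw_bytes = sum(len(b) for b in blocks)
stored_bytes = sum(len(c) for c in store.values())
print(len(store), raw_bytes, stored_bytes)  # 2 unique blocks, far fewer stored bytes
```

Deduplication answers the "how much duplication" question directly, while hashing the content gives a cheap integrity check for the stored data as a side effect.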

Therefore, it is safe to say that Cloud computing can deliver Green-IT provided that the right tools are used and innovation continues unabated.

Tuesday, December 20, 2011

What Do We Mean by Cloud?

From Gravitant's blog.

“In all the ambiguity of what adds value to the Cloud or what facilitates the Cloud, Gravitant sits at the intersection of both, which makes it a pure Cloud company with all the experience, expertise, and solutions built around the Cloud.”

I’ve been writing mostly about what we’ve been developing for and around the Cloud at Gravitant recently. Now is the time to elaborate a bit on what’s being said and done about the Cloud outside of Gravitant. I do not intend to analyze specific articles, but rather to present an overall picture of the impression I get of what is out there and where Gravitant stands in this picture.
As the Cloud gains hype and shapes the next generation of IT and what the Internet consists of, it is getting a whole lot of attention from the actors of the sector and beyond. While the Cloud defined itself during its construction with a bottom-up approach, the new actors of the Cloud are now trying to define or redefine it with a top-down view.
The concept of IT resource sharing can be dated back as far as the mainframe, the Internet, VMware, or EC2, depending on your perception. However, the name “Cloud” -which is cleverly chosen, by the way- definitely came after the commoditization of IT resources, which is very recent. Before the Cloud became the “Cloud”, the standards of traditional IT gave direction to all innovative efforts toward it. These efforts were very technical and mostly motivated by infrastructure-oriented improvements. Later on, the commoditization of IT resources required the business model to be well defined. Although many technical and infrastructural advancements have been noted, most of the focus now is probably on defining the business of the Cloud.
I have read many blog articles, white papers, and research papers about the Cloud, in addition to the web content of cloud companies. If there is one thing common to all of them, it is that what exactly can be labeled as Cloud is not very clear. As an Analytics professional, I see the same kind of confusion among my colleagues: most of the time, the boundaries of the field of Analytics are not very clear either. This makes sense in both cases, because the definitions of both businesses are still works in progress. However, I believe certain examples can draw a more indicative line around what could be called a pure Cloud effort.
Most of the work branded as a Cloud effort is actually the conversion of existing desktop software to SaaS. In particular, if you search for the keywords “Cloud” and “Analytics”, you will see many analytics tools offered as SaaS. Although I believe every type of Cloud effort is a brick in the wall of constructing a whole Cloud environment, I think we should start distinguishing Cloud efforts made “for” the Cloud from Cloud efforts made by “facilitating” the Cloud. To give an example: if you convert management software to a SaaS application, then you are “facilitating” the Cloud. If that management software is used to manage your Cloud resources, then the effort is made “for” the Cloud. Although there is a considerable gray area at the intersection of the two, I hope the example makes the distinction clear.
Where does Gravitant stand at this intersection? First of all, Gravitant is an established Cloud brokerage company, listed in Gartner’s recent report on Cloud brokerage companies. NIST defines a cloud broker as “…an entity that manages the use, performance and delivery of cloud services and negotiates relationships between cloud providers and cloud consumers.” In light of this definition, Gravitant’s CloudMatrix and CloudWiz tools manage traditional IT resources and Cloud resources end to end, from sourcing to provisioning and monitoring. They include powerful, intelligent capacity planning, advanced monitoring, and advanced analytics tools that enable enterprises to strategically and tactically plan the capacity of their IT resources on the Cloud and in-house, in addition to efficiently analyzing the huge amounts of data collected from those resources and proposing the most effective Cloud Analytics solutions. All these efforts are made for the Cloud, to make it a more manageable and less costly environment for the IT needs of enterprises.
On the other side, Gravitant’s major Cloud brokerage and management tools, CloudMatrix and CloudWiz, are user-friendly, fast, and smart SaaS applications. They naturally run on the Cloud efficiently, reliably, and securely. Gravitant also runs all its other applications and internal IT resources on the Cloud. So Gravitant facilitates the Cloud and has first-hand experience as a Cloud user.

Gravitant both adds value to the Cloud and uses it for its own benefit. All these Cloud-centric activities make Gravitant a pure Cloud company. Gravitant’s Cloud network grows fast day by day, including Amazon, Terremark, Savvis, Rackspace, IBM, etc. There is a lot to learn from Gravitant’s cloud experience. If you have any ideas, thoughts, or questions to add to this discussion of what is “for” the cloud and what is “facilitating” the cloud, please respond to this post or contact us so that we can share the intellectual part of the Cloud experience together.

Monday, December 12, 2011

Cloud Deployment Tree

Cloud deployment models are many, and every organization ends up with a unique combination. Follow this cloud deployment tree to identify the combination that best suits your requirements.

We have intentionally avoided industry terminology in the tree due to lack of standardization. However, the legend can be used to map each combination to commonly used industry terms (as of today). The legend also shows industry leaders for each combination.

This is the very first step in Cloud Assessment.  The next step is to determine if your application would even be feasible in the cloud.  Click here to see if your application would be a good fit in the cloud...

Wednesday, December 7, 2011

Gravitant published in latest Gartner Report

What makes a Cloud Services Broker (CSB)?
Gartner identifies three primary roles that qualify a company to be a CSB:

  • Aggregation (across VARs, IT distributors, etc.)

  • Integration (with SIs, etc.)

  • Customization (for SIs, PS, etc.)

"As both an enabler and a cloud brokerage, Gravitant pulls together a number of the capabilities that IT organizations, VARs and SIs, and public cloud providers can use to extend the value of their offerings." - Daryl Plummer (Gartner Analyst)

Full report here...