Updating Gartner’s cloud IaaS evaluation criteria

In February of this year, we revised the Evaluation Criteria for Cloud IaaS (Gartner paywall). The evaluation criteria (now rebranded Solution Criteria) are essentially the sort of criteria that prospective customers typically include in RFPs. They are highly detailed technical criteria, along with some objectively verifiable business capabilities (such as elements of a technical support program, enterprise ISV partnerships, the ability to support particular compliance requirements, etc.).

The Solution Criteria are intended to help cloud architects evaluate cloud IaaS providers (and integrated IaaS+PaaS providers such as the hyperscale cloud providers), whether public or private, or assess their own internal private cloud. We are about to publish Solution Scorecards (formerly branded In-Depth Assessments) for multiple providers; Gartner analysts assess these solutions hands-on and determine whether they have capabilities that meet the requirements of each criterion.

The TL;DR version

In summary, we revised the Solution Criteria extensively in 2019, and the results were as follows:

  • The criteria have been updated to reflect the current IaaS+PaaS market.
  • Expectations are significantly higher than in previous years.
  • Expectations have been aligned to other Gartner research, taking into account customer wants and needs in the relevant market, not just in a cloud-specific context.
  • Many capabilities have been consolidated and are now required.
  • Most vendor scores in the Solution Scorecards have dropped dramatically since last year, and there is a much broader spread of vendor scores.

The Evolution of Customer Demands

The Evaluation Criteria (EC) for Cloud IaaS was first published in 2012. It received a significant update every other year (each even-numbered year) thereafter. When first written, the EC reflected the concerns of our clients at the time, many of whom were infrastructure and operations (I&O) professionals with VMware backgrounds. With each iteration, the EC evolved significantly, yet incrementally.

In the meantime, the market moved extremely quickly. The market evolution towards cloud integrated IaaS and PaaS (IaaS+PaaS) providers, and the market exit (or strategic de-investment) of many of the “commodity” providers, radically changed the structure and nature of the market over time. Cloud IaaS providers weren’t just expected to provide “hardware infrastructure”, but also “software infrastructure”, including all of the necessary management and automation. This essentially forced these providers into introducing services that compete in many IT markets and in an extraordinary number of software niches.

Furthermore, as the market matured, the roles and expectations of our clients also evolved significantly. The focus shifted to enterprise-wide initiatives, rather than project-based adoption. Digital business transformation elevated the importance of cloud-native workloads, while IT transformation emphasized the need for high-quality cloud migration of existing workloads. The notion that a cloud IaaS provider could successfully run all, or almost all, of a customer’s IT became part of the assumptions that needed to underpin the provider evaluation process. 

Today’s cloud IaaS customers have high expectations. Experienced customers are becoming more sophisticated, and even late adopters have high expectations that a provider must meet in order to help them overcome barriers to adoption.

For 2019, we decided to take a look at the EC “from scratch”, in order to construct a list of the criteria most relevant to customers’ initiatives today. In many cases, our clients are trying to pick a primary strategic IaaS provider. In other cases, our clients already have a primary provider, but are trying to pick a strategic secondary provider as they implement a multicloud strategy. Finally, some of our clients are choosing a provider for a tactical need, but still need to understand that provider’s capabilities in detail.

Constructing the Revision

The revision needed to keep a similar number of criteria (in order to keep the assessment time manageable and the assessment itself at a readable length) — we ended up with 265 for 2019.

In order to keep the total number of criteria down, we needed to consolidate closely-related criteria into a single criterion. Many criteria became multi-part as a result. We tried to consolidate the “table stakes” functionality that could be assumed to be a part of all (or almost all) cloud IaaS offerings, in order to make room for more differentiated capabilities. 
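To make the consolidation concrete, here is a minimal sketch, in Python, of how a multi-part criterion might be represented and checked. The field names and the rule that every sub-part must pass are illustrative assumptions for this post, not a description of Gartner’s actual scoring model.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """One solution criterion, possibly consolidated from several older ones."""
    name: str
    required: bool                      # "table stakes" vs. differentiating
    sub_parts: list[str] = field(default_factory=list)  # multi-part requirements

def meets(criterion: Criterion, passed: set[str]) -> bool:
    """Assumed rule: a multi-part criterion is met only if every sub-part
    is satisfied; a single-part criterion stands or falls on its own."""
    if criterion.sub_parts:
        return all(part in passed for part in criterion.sub_parts)
    return criterion.name in passed

# Hypothetical example: three formerly separate "table stakes" capabilities
# folded into the sub-parts of a single required criterion.
block_storage = Criterion(
    name="block-storage",
    required=True,
    sub_parts=["volume-snapshots", "volume-encryption", "volume-resize"],
)
print(meets(block_storage, {"volume-snapshots", "volume-encryption"}))  # False
```

The point of the structure is simply that consolidation raises the bar: a provider that previously passed two of three related criteria now fails the single combined criterion.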

We tried to be as vendor-neutral as possible. The evaluation criteria have evolved since the initial 2012 introduction; when we introduced new criteria in the past, we often ended up with criteria requirements that closely mirrored the feature set of the first provider to offer a capability, since that provider shaped customer expectations. In this 2019 revision, we tried to go back to the core customer requirements, without concern as to whether cloud provider implementations fully aligned with those requirements; the criteria are intended to reflect what customers want, not what vendors offer. There are requirements that no vendor meets, but which we often hear our clients ask for; in such cases, we tried to phrase those requirements in ways that are reasonable and implementable at scale, as it is okay for the criteria to be somewhat aspirational for the market.

We tried to make sure that the criteria were worded using standard Gartner terms or general market terminology, avoiding vendor-specific terms. (Note that because vendors not infrequently adopt Gartner terms, there were cases where providers had adopted terminology from earlier versions of the EC; we made no attempt to alter such terms.)

We tried to keep to requirements, without dictating implementation, where possible. However, we had to keep in mind that some cloud IaaS customers want fine-grained visibility into, and control over, the infrastructure, so some criteria must still be implementation-specific where the customer explicitly wants those elements exposed.

Defining the Criteria

During the process of determining the criteria, we sought input broadly within Gartner, both by discussing the criteria with other analysts and by incorporating existing written Gartner research. (The criteria also reflect, as much as possible, the discussions we’ve had with clients about what they’re looking for and what they’re putting into their RFPs.)

In some cases, we needed input from specialists in a topic. In some areas of technology, clients who need deep-dive discussions of features may talk almost exclusively to analysts specialized in those areas. Those analysts are familiar with current requirements as well as the future direction of those technology areas, and are thus the best source for determining those needs. For example, areas such as machine learning and IoT are primarily covered by analysts with those specializations, even when the customers are implementing cloud solutions. There are also areas, such as security, where we have detailed cloud recommendations from the relevant teams. So we extensively incorporated their input.

We also looked at non-cloud capabilities when there were market gaps relative to customer desires. There are areas where cloud providers either do not currently have capabilities, or where those capabilities are relatively nascent. Thus, we needed to identify where customers are using on-premises solutions but want cloud solutions, and to determine what the “minimum viable product” should be for the purposes of constructing a criterion around it.

Feedback from non-cloud analysts was also important because it identified areas where clients were not using a cloud solution because something was missing. In many cases, these were not technology features, but issues of transparency, or the lack of solutions acceptable on a global basis.

Finally, the way that customers source solutions, build applications, and manage their data is changing. We tried to ensure that the new criteria aligned with these trends.

Because more and more of our clients are deploying cloud solutions globally, every criterion also includes requirements regarding its global availability. These are used only for advisory purposes and are not part of scoring.
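As a rough illustration of that separation (again, an assumed bookkeeping scheme, not a description of Gartner’s internal tooling), an advisory availability note can travel with each criterion’s result while remaining excluded from the score:

```python
# Hypothetical results: each criterion carries a pass/fail result plus an
# advisory availability note that is reported but never counted.
criteria = {
    "block-storage": {"met": True,  "availability_note": "all regions"},
    "bare-metal":    {"met": False, "availability_note": "two regions only"},
}

def score(results: dict) -> float:
    """Percentage of criteria met; availability notes are excluded by design."""
    met = sum(1 for r in results.values() if r["met"])
    return 100.0 * met / len(results)

print(f"score: {score(criteria):.0f}%")   # 50%
for name, r in criteria.items():          # advisory output only
    print(f"{name}: {r['availability_note']}")
```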

The vendors were allowed to give feedback on the criteria prior to publication. We wanted to check that the criteria were reasonable and seemed fair, and we incorporated feedback that constituted good, vendor-neutral suggestions aligned to customer requirements.

The End Results

When you see the Solution Scorecards, you may be surprised by lower scores on the part of many of the providers. We’re being transparent about the Evaluation Criteria (Solution Criteria) revision in order to help you understand why the scores are lower.

The lower scores were an unintentional side effect of the revision, but they reflect, to some degree, the state of the market relative to the very high expectations of customers. Note that this year’s lower scores do not indicate that providers have “gone backwards” or removed capabilities; they simply reflect each provider’s status against a raised bar of customer expectations.

We expect that when we update the scorecards in the second half of this year, scores will increase, as many of the vendors have since introduced missing capabilities, or will do so by the next update. We remain confident that the Solution Criteria are a good reflection of a broad range of current customer expectations. Because many vendors are doing a good job of listening to what customers and prospects want, and planning accordingly, we think the Solution Criteria will also be reflected in future vendor roadmaps and market development.

We discuss the Solution Scorecards and scores in a separate blog post.

 
