The value, or not, of hands-on testing

At Gartner, we generally do not do hands-on testing of products, with the exception of some folks who cover consumer devices and the like. And even then, it’s an informal thing. Unlike the trade rags, for instance, we don’t have test labs or other facilities for doing formal testing.

There are a lot of reasons for that, but from my perspective, the analyst role has been primarily to advise IT management. Engineers can and will want to do testing themselves (and better than we can). Also, for the mid-size to enterprise market that we target, any hands-on testing that we might do is relatively meaningless vis-à-vis the complexity of the typical enterprise implementation.

Yet, the self-service nature of cloud computing makes it trivially cheap to do testing of these services, and without needing to involve the vendor. (I find that if I’m paying for a service, I feel free to bother the customer support guys, and find out what that’s really like, without being a nuisance to analyst relations or getting a false impression.) So for me, testing things myself has a kind of magnetic draw; call it the siren song of my inner geek. The question I’m asking myself, given the time consumed, is, “To what end?”

I think the reason I’m trying to do at least a little bit of hands-on with each major cloud is that I feel like I’m grounding hype in reality. I know that in superficially dinking around with these clouds, I’m only lightly skimming the surface of what it’s like to deploy in these environments. But it gives me an idea of how turnkey something is, or not, as well as the level of polish in these initial efforts.

This is a market that is full of incredible hype, and going through the mental exercise of “how would I use this in production?” helps me to advise my clients on what is and isn’t ready for prime time. An acquaintance once memorably wrote, when he was disputing some research, that analysts sit at the apex of the hype food-chain, consuming pure hype and excreting little pellets of hype as dense as neutronium. I remember being both amused and deeply offended when I first read that. Of course, I think he was very wrong: whatever we’re fed in marketing tends to be more than overcome by the overwhelming volume of IT buyer inquiry we do, which is full of the ugly reality of actual implementation. But the comment has stuck in my memory as a dark reminder that an analyst needs to be vigilant about not feeding at the hype-trough. Keeping in touch by being at least a little hands-on helps to inoculate against that.

However, I realized, after talking to a cloud vendor client the other day, that I probably should not waste their phone inquiry time talking about hands-on impressions. That’s better left to this blog, or email, or their customers and the geek blogosphere. My direct impressions are only meaningfully relevant to the extent that what I experience hands-on contradicts marketing messages, or indicates a misalignment between strategy and implementation, or otherwise touches something higher-level. My job, as an analyst, is to not get lost in the weeds.

Nevertheless, there’s simply nothing like gaining a direct feel for something, and I am, unfortunately, way behind in my testing; I’ve got more demos accumulating than I’ve had time to try out, and the longer it takes to set something up, the more it lags in my mental queue.


Posted on January 11, 2009, in Analyst Life.

  1. Lydia, I understand why you and Gartner have generally not tested the enterprise software products you’ve traditionally covered. Having been personally involved in a number of product evaluations, including “proof-of-concept” or “lab trial” exercises that have been quite painful and time-consuming, as a consultant I’ve tried to steer clear of these unless my client has truly dedicated the time and resources to conduct a meaningful test. I’ve seen vendors come onsite and still take two weeks to install their own product, and a lab trial drag on for six months.

    With the cloud products I’ve looked at recently, the situation seems very different. The installation is now gone (for the most part), and much of the setup and configuration is often unnecessary before you can test-drive the product in a somewhat meaningful way.

    Alas, there is still not enough time to demo the hundreds, even thousands, of SaaS solutions out there, with more launching every day. Of course there are tons of people blogging about what they find, and no need to personally test everything. It’s an effort to sift through these, though, to identify the main points that would affect a buy decision and not “get lost in the weeds,” as you say.

    What I’d love to find to help select products for clients is a “SaaS Industry Blueprint” – an analysis of the best products for each industry. As in, take the top-right SaaS products from your Magic Quadrants, and map them out per industry along functional lines; some horizontally applicable (accounting, CRM, ERP, BI) and some industry-specific (medical imaging systems, legal systems). Throw in some good infrastructure products (integration platforms, e-commerce apps) and we’d have a good map for companies. I know there are many complexities (e.g. company size, country, level of bias, multi-product fit) and this is a monumental task – even for a single industry – but I haven’t seen any attempts yet.


  2. Well, here’s a SaaS Industry Roadmap just posted. Not exactly what I’m looking for (functional business apps) but a start.



