The value, or not, of hands-on testing
At Gartner, we generally do not do hands-on testing of products, with the exception of some folks who cover consumer devices and the like. And even then, it’s an informal thing. Unlike the trade rags, for instance, we don’t have test labs or other facilities for doing formal testing.
There are a lot of reasons for that, but from my perspective, the analyst role has primarily been to advise IT management. Engineers can and will do testing themselves (and better than we can). Also, for the mid-size to enterprise market that we target, any hands-on testing we might do is relatively meaningless vis-à-vis the complexity of a typical enterprise implementation.
Yet, the self-service nature of cloud computing makes it trivially cheap to do testing of these services, and without needing to involve the vendor. (I find that if I’m paying for a service, I feel free to bother the customer support guys, and find out what that’s really like, without being a nuisance to analyst relations or getting a false impression.) So for me, testing things myself has a kind of magnetic draw; call it the siren song of my inner geek. The question I’m asking myself, given the time consumed, is, “To what end?”
I think the reason I’m trying to do at least a little bit of hands-on with each major cloud is that I feel like I’m grounding hype in reality. I know that in superficially dinking around with these clouds, I’m only lightly skimming the surface of what it’s like to deploy in the environments. But it gives me an idea of how turnkey something is, or not, as well as the level of polish in these initial efforts.
This is a market that is full of incredible hype, and going through the mental exercise of “how would I use this in production?” helps me advise my clients on what is and isn’t ready for prime time. An acquaintance once memorably wrote, when he was disputing some research, that analysts sit at the apex of the hype food-chain, consuming pure hype and excreting little pellets of hype as dense as neutronium. I remember being both amused and deeply offended when I first read that. Of course, I think he was very wrong — whatever we’re fed in marketing tends to be more than overcome by the overwhelming volume of IT buyer inquiry we do, which is full of the ugly reality of actual implementation. But the comment has stuck in my memory as a dark reminder that an analyst needs to be vigilant about not feeding at the hype-trough. Keeping in touch by being at least a little hands-on helps to inoculate against that.
However, I realized, after talking to a cloud vendor client the other day, that I probably should not waste their phone inquiry time talking about hands-on impressions. That’s better left to this blog, or email, or their customers and the geek blogosphere. My direct impressions are only meaningfully relevant to the extent that what I experience hands-on contradicts marketing messages, indicates a misalignment between strategy and implementation, or otherwise touches something higher-level. My job, as an analyst, is not to get lost in the weeds.
Nevertheless, there’s simply nothing like gaining a direct feel for something, and I am, unfortunately, way behind in my testing; I’ve got more demos accumulating than I’ve had time to try out, and the longer it takes to set something up, the more it lags in my mental queue.