Ease of use
Microsoft's dearly departed Courier is a cautionary tale for enterprises: there's no such thing as one-click success - particularly in cloud computing.
Last month, Microsoft quietly put a gun to the head of one of the most interesting projects I've ever seen being developed at the Redmond-based firm and, without a hint of remorse, pulled the trigger. And I'm glad that it did.
For those who have no clue what I'm talking about, I'm referring to its now-departed Courier, a dual-screen tablet designed for creative, artistic types. Rendered videos of the proposed device's UI are still available on YouTube (no one ever laid eyes on a finished product) and, truth be told, are very impressive indeed. Microsoft had envisioned a book-shaped device far beyond even what its famous Cupertino rival could deliver; the elegant pen-based interface allowed you to move seamlessly between applications, create notes on the fly, annotate them and then append them to digital 'swatches' that could be laid out like so much confetti.
Some may see it as nothing more than a notebook for artists, but I think it had real potential for anyone who needed to keep track of multiple projects and was capable of understanding their own handwriting. The interface seemed straight out of Spielberg's 2002 sci-fi film Minority Report, staying out of sight and swooping into view only when needed, where needed.
But there is, I think, a very good reason why the Courier never came to market and that swoopy interface is the key - and incidentally, also explains why recent developments like cloud computing should be taken with a metric tonne of salt.
Simply put, the Courier promised too much. That interface described an ease of use that might not be possible in this decade; similarly, cloud computing promises to shift essential services off your servers at the click of a mouse - but it's just not possible.
Last month, I attended the Cloud Computing and Virtualisation Summit in Abu Dhabi and listened to speaker after speaker talk about how simple it was to shift to the cloud, the massive cost savings that were possible and how it could dramatically simplify your infrastructure. The audience was receptive, but there were some, like Dubai Aluminium's famously outspoken vice president of IT Ahmad Al Mulla, who were less enthused.
"It's not [that] simple. I don't care who says whatever, I don't believe it," he declared.
I'm inclined to agree with him. To hear vendors talk about cloud computing is to hear tales of a Courier-style future; but the harsh reality is that every enterprise is different. For younger organisations, new builds and new datacentres are ripe for moving to the cloud, but older ones will still require massive amounts of development time before a proper strategy makes sense.
And then there are the costs. Yes, cloud computing has immense potential to reduce your infrastructure needs, but what do you do with the systems you decommission? Scrap them? Redistributing them is the obvious answer, but with IT projects being put on hold everywhere, you're very likely to end up with a load of servers sitting in storage rather than being put to work.
Redistribution of resources is another timebomb of a problem. What do you do with the employees who are no longer maintaining all those systems? The politically correct answer, of course, is to put them to work on other projects. But IT staff are not interchangeable widgets, and a server tech is generally a poor fit for application development or a customer-facing role without some serious retraining time. A number of boardrooms might simply take the most expedient option available and slash and burn the workforce, unless the CIO is tough enough to stand his ground and resist the cost pressures. Most aren't.
Perhaps I am being overly pessimistic. But as a natural cynic, I remain deeply sceptical of this brave new world being painted by vendors, which looks much like the fanciful renderings of the now-dead Courier. In their rush to market, vendors may need to take a lesson from Microsoft, which eventually killed the device because it couldn't make sense of what it had. Enterprises and vendors alike should heed that lesson.