At some point the economic unviability of private clouds will become clear. Math will win out.
Stories by Bernard Golden
Cloud computing is remaking the entire IT stack, from the foundation to the customer level. The application layer is no different.
It’s a software-driven, open source world, and we’re just living in it.
When large IT organizations embrace open source software, good things can happen. And as with any shift in methodology, there are obstacles to overcome. When it comes to adopting open source, though, enterprises may no longer have a choice.
Proponents of cloud computing often insist that one of its biggest benefits is not decreased costs, but greater agility. To them, obtaining infrastructure in minutes rather than months is a game-changer. Of course, as I noted in a column last year, viewing lower costs and increased agility as separate cloud characteristics is incorrect; instead, lower costs and greater agility are, so to speak, two sides of the same coin.
In 1865, the English economist William Stanley Jevons published The Coal Question, a book with a prosaic title that contained profound implications. Jevons set out to establish the size of England's coal reserves, a critical question for industrial and naval power. During his research, he stumbled upon a curious paradox: As coal use became more efficient due to the advent of better steam engines, coal consumption rose rather than fell.
Two weeks ago AWS announced its financial results and, for the first time, broke out AWS revenues. AWS, it said, achieved $4.6 billion in 2014, and will reach $6.2 billion in 2015, with a growth rate of 49 percent -- which is accelerating. Perhaps more surprising is that AWS is not a low-margin business -- it achieves around 17 percent operating margins, much higher than the overall Amazon business itself.
In 2014, Gartner introduced a prescriptive organization model for enterprise IT called "Bimodal IT." It posits that IT organizations of the future will have two separate flavors, if you will: Type 1 is traditional IT, focused on stability and efficiency, while Type 2 is an experimental, agile organization focused on time-to-market, rapid application evolution, and, in particular, tight alignment with business units.
My last post noted that the IT industry appears to suffer from cloud computing ennui, as the number of Google searches for the term over the past two years has dropped significantly. I also said that other evidence indicates that many IT users appear to have put cloud computing in the "done and dusted" category despite not really understanding it very well.
In my recent post on IDC's 2014 predictions about how what it calls the "third platform" will radically disrupt the IT ecosystem, I noted that the most intriguing prediction addresses how technology users will leverage the third platform to disrupt existing non-technology industries.
In the past, infrastructure deployment and application updates both slowed the development lifecycle. Now that cloud computing lets organizations provision resources in minutes, not months, it's time to alter the application lifecycle accordingly. DevOps can help -- but only if it extends beyond 'culture change' to actually achieve continuous deployment.
Most of today's applications, and all of tomorrow's, are built with the cloud in mind. That means yesterday's infrastructure -- and accompanying assumptions about resource allocation, cost and development -- simply won't do.
Cloud computing is increasingly becoming the rule and not the exception for application deployment. This will make 2014 an interesting and disruptive year for vendors, service providers and IT organizations grappling with this change.
Countless words have been written about cloud computing economics. The debate is summed up in the catchphrase "OpEx vs. CapEx" -- shorthand for rent vs. buy -- and the vociferous argument over it shows no sign of ending.
Alvin Toffler introduced the term 'information overload,' while Ray Kurzweil told us we'll be overloaded with more information each decade than in the previous century. There's a lesson for the IT departments of today (and tomorrow): Ignore emerging technology, despite its flaws, at your own risk.