In business discussions around technology, it can be easy to get lost in the weeds. As IT departments and the media try to forecast the next wave of applications, the viability of one platform over another gets called into question, as does the hype cycle around emerging technologies. This tendency to focus on micro-trends puts the industry at risk of ignoring larger, more urgent technology issues. And at present, there is no technology discussion bigger than cloud computing.
The media has covered cloud debates to the point of saturation. Is cloud hype or reality? How do you define cloud computing? What applications are best suited for the cloud? In truth, the question we should all be debating is what major challenges cloud computing can solve.
The answer is the IT energy crisis.
The word "crisis" may sound hyperbolic, but the problem is very real. The explosion of online data is well documented, with IDC estimating that the "digital universe" will reach 1.8 trillion gigabytes by the end of 2011. And, as of 2007, data centers represented almost 2% of the total electricity consumed in the U.S. Storing, managing and extracting value from all that data consumes a great deal of energy. While this exponential growth in data and energy consumption is somewhat manageable now, at its current pace it soon won't be. An innovative solution (or set of solutions) is needed to respond to this trend.
Cloud computing offers a new approach.
Most would agree that cloud computing is an evolution of previous models such as utility computing and SaaS. Key to cloud computing is a design intended to provide immediate scalability to businesses of all sizes in an extremely flexible manner. The advantages that have propelled the cloud have primarily been flexibility, simplicity and cost. But in the last few years, as cloud computing has grown in prominence, a few other forces have been at play in IT.
First, as the economic downturn shook the very foundations of the IT market, getting more out of existing investments became paramount. Second, as organizations strove to reduce costs and maximize resources, they faced increased compute demands and rising energy costs. The two forces (reducing cost while managing increased compute demand) are at odds with each other.
In response to concerns over rising IT energy consumption, many organizations, including the federal government, created initiatives to deal with the spiraling power problem. The government went on record saying it aimed to decrease data center energy costs by 10 to 15 percent. Local authorities, including NYSERDA in New York State, have also offered financial incentives of as much as $10 million to reduce data center energy consumption.
So while cloud has been gaining steam with the early adopter crowd for being robust and user-friendly, enterprises, SMBs and public sector organizations have all been starving for a solution that can cut energy costs in a meaningful way. The two were destined to meet.
If only it were that simple. In theory, porting compute workloads from local data centers to the cloud merely passes the energy costs from many locations to a few. The energy crisis hasn't been solved; it has just been consolidated. This is precisely why energy efficiency is the single largest issue associated with cloud computing, and why it is imperative that cloud providers deploy extremely energy-efficient server solutions. Doing so protects the viability of IT not only today, but in the future as well.
So, how do you do that?
First, look at the brains of the servers: the processors. At the "core of the cloud" are millions of tiny microprocessors doing the computational work on the back end. In traditional IT, before the energy crisis, we experienced the gigahertz wars. For some time it appeared that processors would just keep getting faster, cranking up clock speeds to the point where one day we would have a 10 GHz chip that would change the world forever. That's laughable now, as we all know a chip that fast would probably melt right through the server. As a result, it became necessary to rethink the approach, and now it is all about balance.
For a cloud provider building a server farm comprising thousands upon thousands of servers, it's critical to deliver the appropriate application performance to customers while keeping the energy efficiency of those servers very high. It's a precise balancing act that can feel as much like a science experiment (finding just the right mixture of price, performance and power) as an IT project.
With this in mind, a question many CIOs and IT professionals find themselves asking is: "Can't I just use an ultra-low-power processor, like those used in laptops or cell phones, for cloud computing?" You can, but it comes with sacrifices. The most important metric when building out a hyperscale data center is price/performance/watt. With a mobile processor in a cloud deployment, you give up performance for the sake of price and power. There are also inherent risks in a platform that hasn't been tested or validated for this use. These are all important issues to consider.
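To make the trade-off concrete, here is a minimal sketch of how such a comparison might be scored. All figures, part names and the minimum-performance threshold are invented for illustration; they are not vendor data or a real procurement method:

```python
# Hypothetical server options for a hyperscale build-out.
# Figures are illustrative only, not vendor data.
servers = [
    # (name, price in dollars, relative performance score, power in watts)
    ("high-clock server CPU", 4000, 100, 250),
    ("balanced server CPU",   3000,  85, 160),
    ("mobile-class CPU",      1200,  30,  45),
]

def perf_per_dollar_watt(price, perf, watts):
    """Performance delivered per dollar of capex per watt of power."""
    return perf / (price * watts)

MIN_PERF = 60  # assumed minimum acceptable per-node performance

# The mobile-class chip scores well on raw efficiency, but it is excluded
# because it cannot meet the per-node performance requirement -- exactly
# the "you give up performance" sacrifice described above.
candidates = [s for s in servers if s[2] >= MIN_PERF]
best = max(candidates, key=lambda s: perf_per_dollar_watt(s[1], s[2], s[3]))
print(best[0])  # -> balanced server CPU
```

The point of the sketch is that a single efficiency number is not enough: without the minimum-performance filter, the mobile-class chip would win on perf-per-dollar-watt alone while failing to deliver acceptable application performance.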
Is the solution to building an energy-efficient, cloud-based IT environment entirely about processor balance? No, it's only one piece of the puzzle. It will become increasingly important to make server platforms and processors intelligent about power management. The workloads that run in cloud environments are not continuous, heavy tasks like those in an HPC environment. Servers must therefore become intelligent enough to know when a workload has scaled down, so that parts of the platform can effectively "power down" to conserve energy and then power back up -- moving from resting to active state quickly and efficiently. This is an area that needs to remain a huge focus for the IT industry going forward.
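The power-down/power-up logic described above can be sketched as a simple load-following governor. This is a toy model under assumed thresholds (the 30%/70% hysteresis band and per-core granularity are invented for illustration), not how any real platform's power management works:

```python
def plan_active_cores(utilization, active, total, low=0.3, high=0.7):
    """Return the new number of powered-on cores given current utilization.

    utilization: fraction of active-core capacity in use (0.0 to 1.0)
    active:      cores currently powered on
    total:       cores physically present
    low, high:   hysteresis thresholds (assumed values)
    """
    if utilization > high and active < total:
        return active + 1   # demand rising: wake a parked core
    if utilization < low and active > 1:
        return active - 1   # demand falling: park a core to save power
    return active           # within the hysteresis band: hold steady
```

The gap between the `low` and `high` thresholds prevents the platform from oscillating between states on noisy load, while still letting it shed power quickly when a cloud workload scales down.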
Beyond the processor, overall data center design and location strategies need to improve. One of the most expensive areas of IT is provisioning and delivering power and cooling. How will the data center evolve to deal with this? In the longer term, data centers will be designed to run at much higher ambient temperatures so that less cooling is required. Free-air cooling may be a legitimate option in the next 5 to 10 years, removing the need for expensive air conditioning within the data center, which would have a drastic impact on IT spending.
In the short term, however, we can expect data centers to be built in closer proximity to power plants and natural bodies of water. The former reduces power lost in transmission, while the latter reduces the need for more advanced cooling techniques. If we are moving from a world of many small data centers located everywhere to fewer cloud data centers strategically located around the world, it is imperative to think long and hard about how to optimize those sites. In the years to come, many mega data centers will likely be clustered in specific regions of the world because of these inherent benefits. That is already beginning to occur with a variety of colocation facilities in Ashburn, Virginia.
The debate will rage on about the role cloud computing will play in the future of IT, but we can't ignore the elephant in the room. Energy consumption is a large and looming issue in IT, and needs critical consideration in designing the data centers of the future. Whether you believe these data centers will be in the cloudy skies or not, businesses must take new approaches to significantly reduce the financial and environmental strain that server energy consumption is causing. While energy efficiency in IT has been an important topic for years, we are now on the verge of incredible change when it comes to energy in this industry -- for the better. As we prepare for this change, it will require the collective effort of technology vendors, federal and local regulation, IT professionals and -- of course -- CIOs to set the agenda for businesses large and small.