Silver lining

Following the era of the desktop PC, mainstream computing's big shift to web-based software and service infrastructure is well under way.
  • Brad Howarth
  • 21 September, 2008 22:00

Scott McNealy, chairman of Sun Microsystems, may have been a visionary, or he may have just been lucky. But when he declared that "the network is the computer" in the late 1990s, he outlined a model that today is looking more and more like reality. Taking processing power out of the hands of corporate IT and distributing it around the world is discussed at length in the 2008 book, The Big Switch by Nicholas Carr (author of the controversial tome Does IT Matter?). He traces the evolution of Sun's old marketing slogan into what he calls the "World Wide Computer" - a programmable, highly scalable computing environment available to anyone.

In 2007, Google chief executive Eric Schmidt referred to it as "the computer in the cloud", using the word "cloud" as a metaphor for the widely distributed nature of the web itself. The cloud concept caught on and today encompasses a huge collection of web-based consumer computing activities. But, according to Carr, the utility of the cloud computing model means corporations will increasingly shift their requirements into the cloud as well.

If he is correct, the consequences for IT departments are dire.

"One of the key challenges for corporate IT departments, in fact, lies in making the right decisions on what to hold on to and what to let go," Carr writes. "In the long run, the IT department is unlikely to survive, at least not in its familiar form."

Whether you believe such predictions or not, there are fundamental changes occurring in the way computing tasks are managed and executed today. These changes will have ramifications for how even the largest corporations operate and manage their technologies.

According to Merrill Lynch, the market for cloud computing services could be worth $US100 billion ($A116 billion) in three years. The cloud is growing and expanding its range of capabilities in the process.

The origins of cloud computing are hazy, but can be seen in the launch of eBay in 1995 and Hotmail in 1996. These ubiquitous, scalable systems helped change the way consumers thought about the web, taking it from a series of mostly static pages to a home for functional, interactive applications.

In 1999 came Salesforce's hosted customer relationship management (CRM) system, which brought business functionality to the model. Then, in 2002, online retailer Amazon launched Amazon Web Services, offering facilities such as scalable virtual private servers, storage and a hosted message queue, along with full e-commerce and payment services. Today a business can use Amazon's processing power to run an entire retail operation "in the cloud", without having to spend a cent on its own infrastructure.

According to Gartner fellow and vice-president David Cearley, cloud computing encompasses a range of pay-as-you-go computing models, but is essentially the evolution of current off-premise services.

"The previous iteration was about outsourcing," Cearley says. "And the difference is that cloud computing brings together some of the components of outsourcing with web-based architectures and delivery models."

Cloud computing in its simplest sense covers the various computing and storage "as a service" offerings from companies such as EDS, Hewlett-Packard and Amazon, but with an accent on flexibility. The next level is represented by application infrastructure services such as Salesforce's, which lets clients put whatever they want into the cloud, using the same basic underlying services that power Salesforce's own software.

Expansive skies

Cloud computing can also encompass traditional software as a service applications, but the definition can be expanded to cover a full business process, such as within Amazon's Fulfillment Web Service, which enables traders to tap into Amazon's fulfilment services as though they were their own.

"There's a whole range of services that can be sourced from the cloud, with different levels of integrity and risk and privacy issues, depending on which level you're looking at," Cearley says.

He cites Japan Post and the Walt Disney Company as examples of organisations adopting the systems infrastructure layer of cloud computing, not just using Salesforce's CRM software but also building specific applications on its on-demand software platform.

"They can use these application service platforms to more rapidly prototype, develop and deliver those applications, so it enables them to spend a fixed amount of time and effort to create a wider range of end-user applications," Cearley says.

Data storage and archiving services hosted in the cloud are also growing in popularity, such as that offered by US-based Iron Mountain. It provides an online analogue to the archiving services for paper-based records that are popular today.

Storage technology company EMC has launched MozyEnterprise as a subscription service that can back up PCs and Windows-based servers anywhere in the world.

EMC's vice-president of technology alliances, Chuck Hollis, says many IT users are excited by the prospect of only paying for what they use.

"If you happen to hire a thousand people you turn the knob up a thousand, if you happen to have a bad day and you need to let a thousand people go you just turn the knob the other way," Hollis says. "Buying technology and putting it in your data centre doesn't give you those knobs. Buying [technology] from a cloud service gives you that flexibility."
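Hollis's "knob" is the essence of the pay-per-use economics: cloud cost tracks actual usage month by month, while owned infrastructure must be bought for the peak. A hedged sketch of the arithmetic, with per-seat prices invented purely for illustration:

```python
# Hypothetical monthly price per user -- a made-up figure to show the
# shape of the comparison, not a real vendor's rate.
PER_SEAT_MONTHLY = 20

def cloud_cost(headcount_by_month):
    """Pay-per-use: billed only for the seats actually used each month."""
    return sum(PER_SEAT_MONTHLY * n for n in headcount_by_month)

def owned_cost(headcount_by_month):
    """Owned kit: capacity is fixed at the busiest month's headcount."""
    return PER_SEAT_MONTHLY * max(headcount_by_month) * len(headcount_by_month)

# Hire a thousand people, then let a thousand go.
months = [1000, 2000, 2000, 1000]
print(cloud_cost(months))   # 120000
print(owned_cost(months))   # 160000
```

The gap widens as demand becomes spikier: the owned model pays for the peak every month, whether or not the seats are filled.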

The cloud has been put to good use by The New York Times to create and power its online collection of full-page image scans of the newspaper from 1851 to 1922, dubbed the TimesMachine. It had already created digital versions of its paper-based content as TIFF images, complete with associated metadata relating to the article content, but needed to convert these to a format that was easier for consumers to work with online.

By working with Amazon Web Services, the newspaper was able to process the 405,000 large TIFF files and associated data and convert them into 810,000 portable network graphics (PNG) files. Using Amazon's resources meant the newspaper could complete the task in just 36 hours.

Cearley says such examples are growing more common, but he points out they generally represent important but non-critical applications, rather than core transaction processing or enterprise resource planning systems.

"There are a number of banks and financial institutions and insurance companies and stock exchanges I talk to that are experimenting with using things like Amazon Web Services to store old customer data and make it available to customers at a low cost for analysis of stock transactions and things like that," he says.

"But the model of cloud computing and software as a service is going to have to prove itself to a much greater extent and provide even better quality of service capabilities to capture some of those [more critical] applications."

Foggy patches

For starters, service providers need to counter the security and privacy issues that immediately arise when they are handed their clients' sensitive data. Also, users of Amazon's S3 web data storage service were hit by recent outages, reinforcing perceptions of the loss of control inherent in the model.

Then there is the potential for jurisdictional issues related to locations at which corporate data is stored, with the European Union placing restrictions on corporate data leaving its boundaries.

According to Cearley, there is often little visibility or control over what is happening behind the service boundary. But this paves the way for a new class of management capabilities that extend from managing data inside the enterprise to managing it out in the cloud as well.

"Just like every generation of computing, security and management is the ugly stepchild that is thought of later," Cearley says. "But enterprises are going to move beyond simple experimentation to try to do more sophisticated things, and then they are going to think about security and management.

"We are only now starting to see some of the vendors starting to think about what it means to do management in the cloud rather than in the enterprise."

Many examples of scalable computing services exist today, including Australian hosting service provider Vigabyte. The company uses virtualisation software from VMware to create a hosting environment in which customers can easily increase or decrease the amount of processor power, memory and disk space they use.

The managing director of Sydney-based website technology developer bwired, Sam Saltis, says his company's rapid application development platform, CoreDNA, could not function effectively were it not hosted in a cloud environment such as that provided by Vigabyte.

"The platform has been designed to create a very rapid response to the user requirement," Saltis says. "Now I can bring on any client I want and the way the architecture has been designed, I can scale my application. And underneath I have an infrastructure that I can dial up.

"I no longer need a network specialist, and I no longer need to do capacity planning.

"It's allowed me to throw away capital expenditure and changed everything to operational expenditure.

"And I can provide enterprise infrastructure that scales as they scale."

Some industries have already moved into the cloud, such as the construction project management sector, where technology firms Aconex, Incite and Project Centre provide hosted project management systems.

Thomas Patellis, collaboration services manager at construction company Baulderstone Hornibrook, says he has been using Project Centre for the past five years as a central repository for correspondence and storage of records and documents.

He says such web-based tools are essential because of the collaborative nature of projects that can involve thousands of users across hundreds of companies.

"The clients and other parties involved in this collaboration are a lot more comfortable when their data is not stored within our own system," Patellis says. "And you don't need to access it through the Baulderstone network; you can access it from anywhere in the world."

While cloud computing today primarily refers to accessing external services, Cearley believes it will also have significant ramifications for the design and delivery of the next generation of data centres.

"The models being implemented by cloud vendors enable enterprises to start building their own internal cloud model," Cearley says. "And by doing this and exploring access to external cloud services, over time enterprises will have a much more flexible environment."

IBM has been exploring this future in a joint development with Google called Blue Cloud. IBM's vice-president of IT optimisation and system software, Rich Lechner, says his company has been intrigued by Google's ability to deploy new applications and expand capacity rapidly, and has been keen to learn.

The goal is to harness the virtualisation and grid computing capabilities inherent in cloud infrastructure and introduce these into client data centres, creating homogeneous resource pools that are highly virtualised and well managed.
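The resource-pool idea can be reduced to a simple accounting model: workloads draw capacity units from one shared pool and return them when finished, instead of each owning dedicated machines. A toy sketch of that model, with names and numbers invented for illustration:

```python
# A toy shared resource pool in the spirit of the "homogeneous resource
# pools" described above. Real virtualisation layers add scheduling,
# isolation and migration; this only tracks capacity accounting.
class ResourcePool:
    def __init__(self, units):
        self.free = units

    def allocate(self, units):
        if units > self.free:
            raise RuntimeError("pool exhausted")
        self.free -= units

    def release(self, units):
        self.free += units

pool = ResourcePool(100)   # 100 capacity units shared by all workloads
pool.allocate(60)          # a batch analytics job takes 60 units
pool.allocate(30)          # the web tier takes 30
print(pool.free)           # 10 units left for short-term spikes
pool.release(60)           # analytics finishes; capacity returns
print(pool.free)           # 70 units free again
```

The point of pooling is the last step: capacity freed by one workload is instantly available to any other, which is what dedicated per-application hardware cannot offer.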

"Our clients are asking how they can implement the same concepts inside their data centres, and by using this capability ourselves we are certainly exploring the notion of providing this kind of capacity to our outsource clients," Lechner says.

Blue Cloud is being used for IBM's own development purposes, and the company has also created a cloud technology centre in China. More will follow.

"The express purpose of that is to work with customers on applying the concepts in more traditional enterprise environments," Lechner says. He cites potential uses for cloud infrastructure such as analysing real-time data streams from financial markets, running Monte Carlo modelling algorithms for scientific or financial applications, and processing large volumes of medical images. More basic uses include rapid provisioning for internal development activities, as is happening today inside IBM with its own research cloud.

"In the future, we'll be able to deploy production capacity in the same way, and our customers are looking at the same thing," Lechner says. "In a global market where you have real-time analytics and data streaming, it is possible to have 100-times spikes in demand, and in that instance you could not have enough latent capacity lying around to be able to meet that short-term spike."

He says customers are also interested in the capacity of cloud services to meet such unexpected demand spikes.

"By treating IT as a service and using service orientation and cloud computing you are able to insulate the applications and the data from the underlying infrastructure," Lechner says. "That means not only is your infrastructure more flexible, it also means by definition your infrastructure becomes extensible."