Lessons on the Future of IT From 'Future Shock' and 'The Singularity Is Near'
- 17 September, 2013 16:26
More than 40 years ago, Alvin Toffler unleashed the book Future Shock on the United States. In it, he used the term "information overload" to refer to the disoriented reaction experienced by people when they feel overwhelmed by constant technological churn. Summed up, his thesis is that technology is developing faster and faster - and faster than people can respond to it, leaving them anxious and befuddled.
The theme of ongoing and accelerating change isn't the sole province of Toffler, of course. The influential Ray Kurzweil, now Google's Director of Engineering, argues in The Singularity Is Near that technological change is accelerating so rapidly that each decade of this century will see as much change as the entire 20th century did.
So it was with some puzzlement that I read a recent piece by Joel Mokyr, a Northwestern University professor specializing in economic history, addressing a current meme that technological progress has stopped. Mokyr's position is summed up by that famous phrase, "You ain't seen nothin' yet."
I think the reasoning underlying this "nothing is changing" idea is that people feel OK with the way things stand; nothing further is needed or possible. Perhaps they feel that it's been such a struggle to get on top of things as they stand that any further change should be renounced.
Future of IT Means Bigger, Faster, Stronger Innovation
This reminds me of what I see each day in IT: Someone up to speed on the last major technology shift but digging in when confronted with the next one, asserting plaintively that the new thing falls short in some aspect - while not realizing that, when the last major technology shift occurred, it received the exact same kind of criticism. Furthermore, he or she fails to remember that the last major technology shift overcame its initial shortcomings while solving a problem (or set of problems) that the previous technology platform was unable to address.
For example, when the Web first came onto the scene, Web-based applications were criticized, compared to the then-standard client/server architectures, for crude UIs, unimpressive performance and the complexity of immature software components. Over time, however, browser UI support and performance improved, thanks to techniques such as multiplexing multiple interactions into a single network round trip, along with bigger network pipes.
Eventually, Web apps improved enough that "production" applications could run over the Web. Meanwhile, client/server apps never could address the major benefit of Web-based applications: Easy access from anywhere in the world with a standard browser interface. This meant that, once Web apps improved enough, client/server apps rapidly fell out of favor.
Turning to today's technology environment, we've moved past PC-based browser apps and are just about ready to recognize smartphones and tablets as first-class client devices. Companies are devising BYOD policies, increasing the range of devices that they will support. It's all sorted out.
Except it's not. Smartphones and tablets aren't the final stop on the device journey. They're just today's resting point in the relentless onward push of Moore's Law. Computing power will continue to shrink and make ever-smaller and specialized devices workable.
Instead of criticizing Google Glass for its manifest shortcomings, then, recognize it for what it represents: The inevitable move of computing power into personalized devices that are more natural and aligned with our ongoing daily activities, acting as supplemental aids for specific purposes in our lives.
In five or 10 years, each of us will be surrounded by a collection of devices that serve one or more purposes. Of course, they'll all connect to cloud-based applications that contain the individual's relevant data, as well as larger processing capability to transform data into a useful set of information that can be consumed by a specific device.
New, Cool Tech Means Bigger Role for IT Departments
What does this imply for IT organizations? It means software is eating the world.
This pithy meme, coined by Marc Andreessen, sums up the reality that digital is replacing analog in the way companies engage with customers, partners, prospects, employees and other important constituents. We've reached the tipping point in how people prefer to interact with organizations; wading through a phone tree is far less palatable than using a website to research or purchase a product.
This means IT will have a much larger job, and fulfill a much more important role, than in the past. Be careful what you wish for - the new role is much more under the microscope than the old days of running transactional back-office systems. Here's why.
BYOD isn't a slightly larger approved-device list. That ship has already sailed; it's pointless to think there will be an approved list to which employees are limited. In fact, many of the people accessing applications won't be employees at all. The cycle time of devices is shrinking, and the portfolio of devices people carry is exploding.
Indeed, to even use the term "carry" is to miss much of the point; people will be surrounded by devices, many of which can be termed "carried" only as a gross kind of associational term. Glasses, watches, pens, cars, shoes - all will transmit or receive data and operate applications of more or less complexity upon the data. The critical task will be to make data available and let people consume it as they will.
APIs rule. The open data movement represents a good model for what the future holds - not necessarily in the sense that data should be available to anyone, but that mechanisms should be in place to enable access. This means embracing convenient APIs with sufficient management and performance to support unforeseen access as new applications consume data. Managing APIs that make data available will be a core IT skill.
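Managing APIs for "unforeseen access" usually means throttling as well as exposing data, so that a surge of new consumers doesn't overwhelm the service. As an illustrative sketch (not any particular product's API; the class and parameter names here are hypothetical), a token-bucket rate limiter is one common mechanism:

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter: allow up to `rate` requests
    per second, with bursts of up to `capacity` requests."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = float(rate)          # tokens added per second
        self.capacity = float(capacity)  # maximum burst size
        self.tokens = float(capacity)    # start with a full bucket
        self.clock = clock
        self.last = clock()

    def allow(self):
        """Return True if a request may proceed, False if throttled."""
        now = self.clock()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Example: allow bursts of 2 requests, refilling at 5 per second.
bucket = TokenBucket(rate=5, capacity=2)
results = [bucket.allow() for _ in range(3)]  # third call exceeds the burst
```

In a real API gateway you would keep one bucket per client key, but the refill logic is the same.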
Centralized data begets decentralized consumption. The true crown jewel of any company is its data and what it represents. Information needs to be protected - but access needs to be liberated, with a recognition that most of the consumption will occur off premises.
Networks need an upgrade. It should be obvious that connectivity, high bandwidth and low latency will be crucial in this new world. This is especially true as you begin to build true hybrid applications, with tiers in different locations and components that must reach back to corporate infrastructure for data. Whatever your projections for company bandwidth requirements, figure that they are way too low. Start thinking about what you'd do if you needed 10 times as much network performance.
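To make the "way too low" point concrete, a back-of-the-envelope capacity estimate might look like the following; every input here is a hypothetical planning number, not data from the article:

```python
# Hypothetical planning inputs: adjust for your own environment.
employees = 2_000
devices_per_person = 4        # phone, tablet, laptop, wearable
avg_mbps_per_device = 0.5     # average sustained demand per device
peak_factor = 3               # ratio of peak load to average load
headroom = 10                 # the "10x" planning margin discussed above

average_demand_mbps = employees * devices_per_person * avg_mbps_per_device
peak_demand_mbps = average_demand_mbps * peak_factor
planned_capacity_gbps = peak_demand_mbps * headroom / 1_000

print(f"Average demand: {average_demand_mbps:,.0f} Mbps")
print(f"Peak demand:    {peak_demand_mbps:,.0f} Mbps")
print(f"Plan for:       {planned_capacity_gbps:,.1f} Gbps")
```

With these (made-up) numbers, an organization averaging 4 Gbps of demand would plan for 120 Gbps of capacity - which shows how quickly device proliferation outruns comfortable projections.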
Those Who Ignore History Are Doomed to Repeat It ...
IT is a funny animal. It pays lip service to innovation, but typically attempts to clutch yesterday's platform as the true apotheosis of computing, assessing new technology offerings as inadequate and unlikely to ever attain satisfactory functionality and performance. Of course, without fail, the new improves enough that it displaces the previous incumbent, and the cycle begins again.
Those who attempt to force-fit cloud computing into yesterday's virtualization environment will, inevitably, find that, despite their protests, their favored platform will pass into little-honored obsolescence.
Bernard Golden is senior director of Cloud Computing Enterprise Solutions group at Dell. Prior to that, he was vice president of Enterprise Solutions for Enstratius Networks, a cloud management software company, which Dell acquired in May 2013. He is the author of three books on virtualization and cloud computing, including Virtualization for Dummies. Follow Bernard Golden on Twitter @bernardgolden.