Yesterday we kicked off Technology Services World here in Silicon Valley. As always, the opportunity to meet so many service leaders in one place is energizing. My conversations with folks yesterday spanned everything from “How do you align ten regions around the world on common service processes?” to “Our CEO can’t spell the word services—is it time for me to start looking for a new job?” And I relish every one of those exchanges.
One of the topics I put on the table during my opening keynote was the impact of utility computing on how customers are viewing our technology solutions. The term “utility computing” is not mine. The IT industry has been using it for years. Here is Wikipedia’s description:
Utility Computing is the packaging of computing resources, such as computation, storage and services, as a metered service similar to a traditional public utility (such as electricity, water, natural gas, or telephone network). This model has the advantage of a low or no initial cost to acquire computer resources; instead, computational resources are essentially rented – turning what was previously a need to purchase products (hardware, software and network bandwidth) into a service.
This repackaging of computing services became the foundation of the shift to “On Demand” computing, Software as a Service and Cloud Computing models that further propagated the idea of computing, application and network as a service.
There was some initial skepticism about such a significant shift. However, the new model of computing caught on and eventually became mainstream with the publication of Nick Carr’s book “The Big Switch.”
The challenge with utility computing is not the concept of customers paying for IT capabilities “on demand.” The challenge with utility computing is that it changes how customers view what they are consuming.
When Electricity was Technology
At the conference yesterday, I used the example of electricity. Most folks believe New York City was one of the first cities in the United States to have electricity. Actually, Madison, Wisconsin beat New York City by about a month. You can go online and read the articles that were written the day the power plant went live in Madison. The articles provide all kinds of gory details: the amount of copper wire used to wire homes, the gauge of the wire, the voltage sent out to the households, and so on. Because, in 1882, electricity was still considered a “technology.” Did you know that when electric doorbells were first being installed in houses, visitors were afraid to use them? Folks feared they would be electrocuted by this new technology.
When it comes to electricity today, I would argue there are only two things you care about:
- When you turn the light switch on, the lights come on.
- How much is the electric bill each month?
Why? Because electricity is a utility—you expect it to be always on, always available, and relatively inexpensive to use.
From Technology to Utility
What happened to electricity is what is happening to many of our IT solutions. What once was considered a “technology” is now being considered a “utility.” And I am not simply referring to IT offerings such as storage or CPU cycles. I am thinking of applications like email, CRM, and the like.
And when an offering migrates from being viewed as a “technology” to being viewed as a “utility,” the entire business model of the company providing that offering is impacted. And this is exactly why the halls of TSW are buzzing this week…
Tags: utility computing