The Next Big Thing
Posts about next generation technologies and their effect on business.

How big is the information explosion?

A report published by the University of California, San Diego, calculates that American households collectively consumed 3.6 zettabytes of information in 2008. The average American consumes 34 gigabytes of content each day. This doesn't mean we read that much every day; it means our brains process that much through our eyes and ears in a single 24-hour period. That information comes through various channels, including television, radio, the Web, e-books and video games.
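As a rough sanity check, the per-person and national figures are consistent, as a quick back-of-the-envelope calculation in Python shows (the ~300 million population figure is an assumption for illustration, not from the report):

    # Rough consistency check of the UCSD figures.
    # Assumptions: ~300 million Americans, 365 days, decimal units.
    GB = 10**9
    ZB = 10**21

    per_person_per_day = 34 * GB       # bytes per person per day
    population = 300e6                 # assumed U.S. population in 2008
    annual_total = per_person_per_day * population * 365

    print(f"Estimated annual consumption: {annual_total / ZB:.1f} ZB")
    # Prints roughly 3.7 ZB, in line with the report's 3.6 zettabytes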

 

Roger Bohn, professor of technology management and co-author of the study, said, "Gaming saw the biggest leap in the number of bytes we consume and the amount of time devoted to this platform."

 

"Hours of information consumption grew at 2.6 percent per year from 1980 to 2008, due to a combination of population growth and increasing hours per capita, from 7.4 to 11.8. More surprising is that information consumption in bytes increased at only 5.4 percent per year."

 

Yet the human ability to process information remains relatively constant.

 

 

Autonomous car to climb Pikes Peak at racing speed

Looks like Stanford is going to tackle climbing Pikes Peak. It will not be the first autonomous car to climb Pikes Peak (a challenging 12.4-mile ascent that includes 156 turns and ends more than 14,000 feet above sea level), but Stanford will try to do it the fastest. Earlier unmanned cars made the climb in just over 47 minutes, an average of about 16 mph. The car will be an autonomous Audi TTS.
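For reference, the quoted pace works out as a simple average-speed calculation (illustrated below in Python):

    # Average speed of the earlier unmanned Pikes Peak climbs.
    distance_miles = 12.4
    time_minutes = 47            # "just over 47 minutes"

    avg_speed_mph = distance_miles / (time_minutes / 60)
    print(f"Average speed: {avg_speed_mph:.1f} mph")   # about 15.8 mph, i.e. roughly 16 mph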


"There's no room for error. If you don't know what's going to happen, you shouldn't try it, because it's a long way down." said mechanical engineering professor Chris Gerdes.


The hope is that these kinds of techniques could eventually be used to take over if a driver falls asleep at the wheel.


Autonomous driving has come quite a way since the DARPA Grand Challenge of 2005.

Is "good enough" a fad or the future

In this month's Wired magazine there is an article titled "The Good Enough Revolution" that discusses how new markets are taken over by products that are "good enough" and undercut technically superior approaches. The examples discussed include the Flip camera undermining the camcorder market, MP3 taking on the recording industry, and even the Predator drone replacing manned aircraft.

 

These products grab market share because they shift our selection standards: what was important before may no longer be a deciding factor. Clayton Christensen discussed this phenomenon in his book The Innovator's Dilemma (Joe Hill did a blog entry on it a while back). Jerry Pournelle has also been talking about selecting "good enough" products in his Chaos Manor blog for almost as long as I can remember.

 

One reason this is important to IT organizations today is cloud computing, including SaaS. Granted, there are many scenarios where an in-house or custom-developed solution would be better, but the decision criteria may have shifted so that those "better" characteristics are irrelevant. We need to focus on what the business wants versus what it needs, and on how the characteristics of the product or service address that. For example, a cloud provider may not be able to guarantee anything close to 100% uptime end-to-end for a transaction, although it may be willing to do so within its own four walls. Is that guarantee good enough? If not, are there alternatives?
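To see why the end-to-end question matters, remember that availability compounds across every component in the transaction path. A rough Python illustration (the component figures are hypothetical, not any provider's actual SLA numbers):

    # Illustrative end-to-end availability calculation.
    # The numbers below are made up for the example, not real SLAs.
    components = {
        "cloud provider (within their four walls)": 0.999,
        "internet service provider": 0.995,
        "corporate network / last mile": 0.999,
    }

    end_to_end = 1.0
    for name, availability in components.items():
        end_to_end *= availability

    print(f"End-to-end availability: {end_to_end:.1%}")
    # About 99.3% -- noticeably short of the provider's own 99.9%, which is
    # why "good enough" has to be judged on the full transaction path.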

 

I was in an Azure meeting in Dallas the other day, and when I brought this issue up, a person in the audience said, "No one can guarantee end-to-end!" I totally disagree. We've done it for years, although not in a cloud context.

 

I'd say organizations need to get used to the "good enough" thought process and be flexible about their critical characteristics. The way they have always measured success may no longer be what matters most to the business. This concept has been around for a very long time; it is not a fad but the future.

Darwin and the Cloud

2009 marks both the 150th anniversary of the publication of Charles Darwin's foundational work, On the Origin of Species, and the 200th anniversary of his birth. What would he tell us today about the evolution of cloud computing, or is that a far-fetched question?


Well, not really. Consider the key role of adaptability in the survival of living systems ("Survival of a species is determined by its ability to adapt to a changing environment," a line often attributed to Charles Darwin). This sounds familiar when it comes to IT solutions struggling to keep pace with an ever-changing business environment. So where does this capacity for adaptation come from, in general terms?


It comes from investing in flexibility in addition to improving efficiency (i.e., cutting cost). Isn't cloud computing largely about balancing economy-of-scale efficiencies with responsiveness and flexibility, delivered through virtualization and automation? If so, Darwin would probably give it a good chance of survival.


Cost, while of paramount importance (in a biological context, think of minimizing energy consumption), is only one side of the coin. To probe further, the book by C.K. Prahalad and M.S. Krishnan (The New Age of Innovation) mentioned in my last contribution provides a good discussion of the efficiency-versus-flexibility tension.


"This device saves you half the work!" - "Fantastic, I take two of them." (Unknown source)

About the Author(s)
  • Steve Simske is an HP Fellow and Director in the Printing and Content Delivery Lab in Hewlett-Packard Labs, and is the Director and Chief Technologist for the HP Labs Security Printing and Imaging program.