The Next Big Thing
Posts about next generation technologies and their effect on business.

Displaying articles for: May 2009

Cloud Computing – Will History Teach Us Nothing – Yet Again?

We have seen the early versions of Cloud Computing models, with their promise of rapid service invocation, pay-per-use, ease of scaling, and simplified service-to-service interaction, and one cannot help but think back to the mid-'90s, when the Internet and the Internet Protocol, HTML, HTTP, the Web Browser, the Web Server, the World Wide Web, and a multitude of other disruptive technologies came together in a complete computing-industry shift away from the Client/Server model. Early adopters created 'pretty pages', then began to provide processing engines (e.g., CGI, FastCGI, App Servers). Small, enterprising developers and groups became startups; startups required more complex solutions to process electronic business and commerce; Enterprises that had been reticent then jumped to take their 'Brick and Mortar' solutions to 'Click and Mortar'; the .COM boom hit; and on and on, until the complexity of the various consumer, provider, supplier, and business partner interactions made the original premise of the Web look a bit more daunting.


Now, at the dawn of Cloud Computing, some of the same growing pains are recurring. Early adopters and disruptive innovators are building the layers of Cloud Computing by integrating Web-era innovations into new constructs such as Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). They are creating simple Cloud Services (like the 'pretty pages' of old), encapsulating the cloud in simple PaaS constructs (much as the Web matured from HTML to CGI, to App Serving, to Java and .NET, to Enterprise Application Integration, Service-Oriented Architecture, Web Services, and complex application/service design patterns), and delivering early SaaS capabilities (like the early e-commerce Web sites that matured into business-to-business, business-to-supplier, and business-ecosystem constructs). These components of the Cloud will mature as well - the question is, will they daunt the industry in the same manner as previous shifts have? I'm reminded of a Sting song I viewed on YouTube recently, "History Will Teach Us Nothing". Did any of the computing paradigm shifts make the industry more structured? Or have enterprises failed to mature as the technology advanced? What will history teach us this time, so that the next shift (and there will be one) doesn't push us to repeat the sins of the past?

The business value of projects like DeepQA

Lately, there has been quite a bit of discussion and media coverage of the project to have a computer compete on Jeopardy. Some compare this to a computer being able to play chess. To me, there is much more behind this from a business perspective.

The game of 20 Questions was cracked in 2004 using some relatively straightforward techniques. Now DeepQA aims to do for Jeopardy what was done for 20 Questions five years ago.

The reason this is important to business is that it could allow people to be removed from handling "normal" support-call questions. HP has done some work in this area as well, allowing a computer to be a destination for support interaction via Microsoft OCS. Customers can type questions to an address from their desk and receive answers to numerous kinds of support questions (e.g., How do I change my password?). Having a more flexible solution on the other end of the line limits the amount of training required for the humans involved and increases the likelihood that the response will be useful, providing greater value more quickly and in a consistent manner.

I think back to the early '90s, when I wrote a decision-tree-based approach to diagnosing AppleTalk network problems. Of course, back then it was because I didn't want people to call me whenever our Macintosh network broke down with something I'd already encountered. I wrote that diagnostic tool in HyperCard, though.
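For readers who never saw a HyperCard stack, here is a minimal sketch of the same decision-tree idea in Python; the questions and diagnoses are hypothetical help-desk stand-ins, not the original AppleTalk tool.

    # Minimal decision-tree diagnostic sketch (questions/diagnoses are illustrative).
    class Node:
        def __init__(self, question=None, yes=None, no=None, diagnosis=None):
            self.question = question      # asked at internal nodes
            self.yes, self.no = yes, no   # subtrees for "yes" / "no" answers
            self.diagnosis = diagnosis    # set only at leaf nodes

    def diagnose(node, answer):
        """Walk the tree, asking questions until a leaf (diagnosis) is reached."""
        while node.diagnosis is None:
            node = node.yes if answer(node.question) else node.no
        return node.diagnosis

    # A toy triage tree for network trouble (illustrative only).
    tree = Node(
        question="Can you see other machines on the network?",
        yes=Node(question="Is only one service (e.g., printing) failing?",
                 yes=Node(diagnosis="Check that the shared device is powered on."),
                 no=Node(diagnosis="Check the application or server configuration.")),
        no=Node(question="Is the network cable firmly connected?",
                yes=Node(diagnosis="Check the hub/router, or call the administrator."),
                no=Node(diagnosis="Reconnect the cable and try again.")),
    )

    if __name__ == "__main__":
        ask = lambda q: input(q + " [y/n] ").strip().lower().startswith("y")
        print(diagnose(tree, ask))

The appeal, then as now, is that the expert's troubleshooting knowledge is captured once in the tree rather than re-explained on every call.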

We've all experienced those situations where we sit on the end of a help desk call, feeling like we are talking to a computer following a script. Soon, we may be able to base that feeling on fact; it may actually be a computer on the other end of the line.

In the Shadow of the Cloud – Part 2

In Part 1, I discussed the motivation and expected benefits that many enterprises are seeking from the use of cloud computing services. In this entry, I'll briefly compare and contrast the non-cloud alternatives, and look at how enterprises can consider which workloads are suited to different styles of infrastructure service delivery.


The options for sourcing infrastructure services can be classified along two broad dimensions:


1. Provisioning of resources, either shared or dedicated to applications, and


2. Facilities that are either on premises or off.


The following chart depicts the selection and placement of applications or workloads among different styles of infrastructure services, including traditional datacenters, outsourcing, private utility and public cloud services.
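As a rough sketch of how these two dimensions combine into the four styles, the mapping can be written as a simple lookup; the quadrant view below is a simplification rather than a placement rule, and the example workload is hypothetical.

    # Simplified quadrant view: (resource provisioning, facility) -> sourcing style.
    STYLES = {
        ("dedicated", "on-premise"):  "traditional data center",
        ("dedicated", "off-premise"): "outsourcing",
        ("shared",    "on-premise"):  "private utility ('private cloud')",
        ("shared",    "off-premise"): "public cloud",
    }

    def candidate_style(provisioning, facility):
        """Return the sourcing style suggested by a workload's two attributes."""
        return STYLES[(provisioning, facility)]

    # Example: a market-facing web app with variable demand, kept in-house.
    print(candidate_style("shared", "on-premise"))   # private utility ('private cloud')

In practice the placement decision also weighs security, application maturity, and cost, as the following paragraphs discuss.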



The traditional private data center has been, and still is, well suited to mission-critical applications that require stringent security along with stable and predictable resource requirements. These systems and applications are typically mature legacy applications and COTS packages, and there is little to no comparable "cloud" infrastructure available for these older platforms. Keeping these systems where they currently reside will likely be more cost-efficient than a variable-payment approach, although outsourcing or a private utility may be an option.


Outsourced facilities provide managed services for the kinds of applications and systems found in the traditional data center model. Generally, outsourcing provides value to the enterprise by leveraging the economies of scale and reach offered by the third-party provider; the focus is on leveraging skills across multiple customers rather than the hardware.


The private utility model provides infrastructure services for systems and applications that have highly variable demand cycles, such as market-facing web applications. Some refer to this model as an "internal cloud" or "private cloud"; it is based on the use of technologies such as virtualization and automation within a private data center to meet the varying business needs of newer applications and systems. A key factor for the effective use of a private utility model is a high degree of standardization within the infrastructure and the application delivery platform.


The public cloud has only recently emerged as a viable option for enterprise IT services, and many still have significant concerns about issues such as security, privacy, data integrity, lock-in, and availability. The case for public cloud services varies greatly between large enterprises and small businesses or startup companies. The latter have little or no embedded suite of infrastructure and applications, so the availability of business applications and low-cost, ready-made infrastructure is a compelling proposition.


Large, multi-national enterprises have additional barriers to overcome, such as an existing suite of applications and services that have been modified and upgraded over many years to serve the evolving needs of their businesses. They usually have stringent Service Level Agreements (SLAs) and auditing requirements that are difficult for cloud providers to meet. Large enterprises need to weigh the value of public computing services against the 'integration costs' of overcoming the barriers to incorporating those services into the enterprise IT service portfolio.


Except for some very basic web applications, today's enterprise-class business systems cannot be easily moved to a public cloud model. Ultimately, business applications will need to be re-architected and re-written to achieve the expected levels of performance, scalability and economic advantage associated with cloud computing. New projects should be evaluated for their ability to take on this more flexible approach.

‘Engineering the Future’ day

May 13th is Engineering the Future Day and marks the IEEE's 125 years of engineering the future.


The IEEE, on its website, provides you with all the tools and resources you need to get involved and have fun with your celebration, including background on the IEEE and a technology trivia quiz.


Although some may laugh at the geeky suggestions in the 'tips on how to celebrate' document, I did like this one:


"Encourage teachers in your local area to spend a day helping kids learn about technology related careers. Lesson plans are available on http://www.tryengineering.org/".


Remember, celebrate responsibly.

Labels: Engineering | IEEE | Youth

The MacGuffins of IT Projects

I was reading a short article in Wired magazine (Issue 17.05, May 2009), which talked about MacGuffins. This is a term coined by Alfred Hitchcock to describe the thing in a movie that captures the viewer's attention and drives the action. In "Raiders of the Lost Ark" this is the Ark of the Covenant, in "The Adventures of Buckaroo Banzai Across the 8th Dimension" it is the Oscillation Overthruster, and so on. So, does this idea translate to the world of IT and specifically to the industry hype cycles? In our quest to give our clients a clear focus and direction for their IT spend, are we perhaps guilty of oversimplification?

The MacGuffin is especially common in thrillers and, interestingly enough, there also seems to be no shortage of MacGuffins in the IT hype cycles. In fact, they are all around us, and they are pitched not only at the IT professional but at our clients as well. In "The Maltese Falcon", the hunt is for the titular black statue of a falcon that is made of gold. When we pitch solutions to our clients, we often assume that we need an equally clear vision of the "gold" to make them want to join our quest. However, we also need to be clear that not all MacGuffins are real.

It is interesting that we can also learn from films about the way some of these MacGuffins evolve during IT projects. Generally, though not always, the MacGuffin starts off as the central focus of the film and later declines in importance as it becomes more interesting to see how the struggles and motivations of the characters play out, sometimes to the point where the MacGuffin is all but forgotten by the end of the film. I can't help seeing a parallel with some IT hype that has appeared over the years. The assertion that Enterprise Resource Planning (ERP) systems, for example, would revolutionize back-office IT and reduce cost is a classic case. For a number of companies, this became an interesting journey, but not always to where the initial vision promised.

You do need a clear vision to get people on board with a project. However, this isn't about dazzling clients, making them feel special, and never giving them the opportunity to think or to ask what the mumbo-jumbo means. Nor is it about getting them to sign and then bringing in some good people to figure the thing out as we go along. Rather, the initial vision must be backed up by clear goals and equally clear, measurable metrics so we can see whether we are realizing those goals.

More on Haptic feedback for touch screens

A number of years ago, former EDS Fellow Randy wrote a blog entry about touch feedback being a problem for virtual screens. It looks like that haptic feedback is coming a bit closer to production.

It's funny how adding another dimension to the user-interaction experience can make it so much more rewarding. Enabling a touch screen to "touch back", providing a range of textures and a new depth to a virtual-reality experience, could have some very interesting applications. The article in IEEE Spectrum describes how you could eventually use the display to "tell how a garment feels".

It looks like LG, Samsung, and many others are doing this in a small way on their phones; if all it takes is using the phone's vibration ringer to provide gross vibration feedback, it could be added to all smartphone devices before long.

I can just imagine the possibilities for feedback you can provide a user on their interaction with corporate systems if touch screen devices become more common in the workplace.

Swine flu and business continuity

I sat in on some presentations by a group of seniors and grad students a week or so ago at the University of North Texas. One of the presentations was about business continuity. Unfortunately, they had a very IT-centric view that was more about data disaster recovery than real business continuity.

During the discussion, we asked them about many different issues to address in a business continuity plan, and one thing came up over and over (usually in a humorous fashion): the implications of swine flu.

This article on the IEEE's online site discusses some simulation work being done at Virginia Tech using a model called EpiSimdemics. The model tries to project the implications of a disease outbreak based on the demographics of those infected. The areas identified as needing attention (e.g., rail) may surprise you. Some of the fellows within EDS did quite a bit of analysis on this from a business continuity and cultural perspective a number of years ago, and looked at how IT could help prepare an organization.

One of the ways IT helps address a situation like this is via simulation and modeling (a proactive approach), but there are other ways as well, such as allowing people to work at home and travel less (a reactive approach) and automating processes to reduce the amount of person-to-person interaction altogether. It is clear that, with the amount of travel today and the continued reliance on people working together to generate value for an organization, disease outbreak needs to be part of any organization's business continuity planning, just like fire or earthquake. We may get a bit more notice and time to react, but the implications can be just as profound.
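As an illustration of the simulation-based, proactive approach (far simpler than the agent-based EpiSimdemics model), here is a toy SIR-style projection in Python; every rate and population figure in it is an assumed placeholder, not data from any real outbreak.

    # Toy SIR (Susceptible-Infected-Recovered) outbreak projection.
    # All parameters are illustrative assumptions.
    def simulate_sir(population=10000, initially_infected=5,
                     contact_rate=0.3, recovery_rate=0.1, days=120):
        s = float(population - initially_infected)
        i = float(initially_infected)
        r = 0.0
        peak_infected, peak_day = i, 0
        for day in range(1, days + 1):
            new_infections = contact_rate * s * i / population
            new_recoveries = recovery_rate * i
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
            if i > peak_infected:
                peak_infected, peak_day = i, day
        return peak_day, peak_infected

    peak_day, peak_infected = simulate_sir()
    print(f"Peak around day {peak_day}: roughly {peak_infected:,.0f} people ill at once")

A continuity planner can then ask, for a given workforce, what fraction is likely to be out at the peak and whether work-at-home capacity and process automation can absorb it.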

In the Shadow of the Cloud – Part 1

Reading the IT media these days, one finds that coverage of cloud computing overshadows most other subjects. Cloud is clearly a hot topic, with many of the attributes of a fad, and there is an overwhelming amount of information, some of it conflicting, circulating around the industry. CIOs and other business leaders are trying to keep up with these trends, but sometimes it is difficult to separate aspirations from reality.

In "Enterprise Cloud Forecasting", one of the strategies I put forth was the "replacement strategy", by using the cloud to replace or re-host existing IT systems. The combination of the current economic situation along with claims of raw IT services at pennies an hour is driving much of the initial interest in using the cloud as a replacement for private IT infrastructure. What we've seen is that when enterprises inquire about "cloud computing", they are usually interested the associated benefits that cloud promises and may not realize the alternatives available to deliver similar results. The focus is on the flexibility provided by cloud, such as:

  • less investment in physical assets
  • resources that are pooled and shared across workloads
  • scalability and flexibility
  • the ability to pay for resources as they are consumed

It is important to have a holistic perspective on all of the various application and infrastructure options available to move towards these goals. Organizations need to recognize that the majority of these benefits can be achieved through in-house and hosted environments by employing other techniques, such as virtualization and automation approaches.
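As a back-of-the-envelope illustration of the "pennies an hour" trade-off mentioned above, the sketch below compares an owned server to pay-per-use at different utilization levels; every figure in it is a hypothetical placeholder, not a quoted price.

    # Hypothetical break-even: owned server vs. pay-per-use instance.
    HOURS_PER_MONTH = 730

    def monthly_cost_owned(amortized_capex=80.0, power_and_ops=40.0):
        """Fixed monthly cost of a dedicated server (assumed figures)."""
        return amortized_capex + power_and_ops

    def monthly_cost_cloud(rate_per_hour=0.20, utilization=0.25):
        """Pay-per-use cost: only the hours actually consumed are billed."""
        return rate_per_hour * HOURS_PER_MONTH * utilization

    for utilization in (0.10, 0.25, 0.50, 1.00):
        cloud = monthly_cost_cloud(utilization=utilization)
        owned = monthly_cost_owned()
        cheaper = "cloud" if cloud < owned else "owned"
        print("utilization %4.0f%%: cloud $%7.2f vs owned $%7.2f -> %s"
              % (utilization * 100, cloud, owned, cheaper))

The arithmetic makes the point concrete: spiky, lightly utilized workloads favor pay-per-use, while steady, heavily utilized systems are often cheaper to keep on dedicated infrastructure.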

In evaluating cloud versus traditional IT service delivery, the following points should be considered:

1. Enterprises will use the cloud and cloud computing technologies to add value or reduce the cost of business operations - the cloud is a means to an end, not an end in itself.

2. Cloud will not replace private datacenters for traditional computing any time soon - cloud is a selective and additive capability for an Enterprise IT portfolio.

3. Enterprise IT will be a hybrid environment for the foreseeable future, comprising both dedicated and cloud-based resources and services.

4. The cloud is the next stage in the evolution of the Internet, moving beyond transporting information as a service to acting upon it as well.

5. Enterprise adoption of cloud computing technologies and services is limited by perceived and real barriers stemming from requirements for security, privacy, trust and quality.

There are other well-known alternatives for delivering IT services, even in the "shadow of the cloud". Part 2 of this series will compare and contrast the 'non-cloud' alternatives, and how enterprises can consider which workloads are suitable for different models of infrastructure services.

Labels: IT Services

'Robodoc' makes his rounds to drive innovation in healthcare

I was reading the morning paper today and came across an interesting article about a new technology, affectionately named "Robodoc", that made its debut in Folsom, CA.

"Robodoc" shouldn't be confused with the software product "ROBOdoc" that helps software development organizations automate software documentation in source code. It does however have closer ties to "RoboCop", a police officer of the future featured in a 1987 thriller. "Robodoc" is a robotic medical device that automates the delivery of potentially life-saving care to stroke patients. Treating stroke patients is time-sensitive, as studies have shown that some patients have a better chance of making a reasonable recovery if a clot-busting drug is administered within a three-hour window.

"Robodoc" allows a doctor located miles, or even continents away to examine a patient brought into an emergency room through a laptop equipped with a joystick. A video screen positioned on top of the robot's body displays an image of the doctor. Cameras mounted above the screen are the robot's eyes, giving the doctor a view of the patient.

This remote-presence tool in the telemedicine arsenal gives hospitals access to scarce specialists, wherever they are in the world, who can evaluate a patient in a matter of minutes and deliver care that could be life-saving. The implementation of this technology in Folsom was made possible only by a philanthropic donation from a local family.

As with all new technologies in healthcare, incentives encourage adoption. The potential to improve patient care and mortality rates is enormous, and what's good for the patient is also good for the doctor, who could be anywhere in the world - including sipping wine at the Tignanello vineyards in Tuscany, Italy. I can see some new federal regulations governing the use of "Robodoc" in the near future! Seriously, why does it take the philanthropic generosity of private citizens to drive innovation in healthcare? Shouldn't every hospital have one?
