Yesterday, I met with a number of technologists and educators from North Texas (Interlink) to discuss the changes that educators need to prepare for in their high school and college curricula. It was a lively discussion and reminded me of the issues IT organizations have in determining where to encourage their people to develop themselves and prepare the organization for the future...
This sort of training tool would encourage people to get out, interact with other employees, and learn about the corporate culture and resources – going far deeper than most employee orientation programs do today.
One part of Autonomy that illustrates the shift in thinking that can take place in a world of abundant computing is Aurasma. This technology may change the way we look at real-world objects.
We have seen augmented reality demonstrations before, but this is not an approach aimed at a special case of what’s around us. It is about enabling almost everything around us. Aurasma uses a scaled-down version of Autonomy’s IDOL pattern recognizer to identify images stored in a vast database, and then converts those images into related video, games...
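The recognize-then-overlay idea can be sketched in a few lines. To be clear, this is only an illustration of the general pattern-matching approach, not Autonomy's actual IDOL algorithm; the average-hash scheme, `TriggerDatabase` class, and content strings below are all assumptions for the sake of the example.

```python
# Minimal sketch of Aurasma-style recognition: hash a camera frame and
# look it up in a database of trigger images mapped to overlay content.
# The hashing scheme and database design are illustrative assumptions.

def average_hash(pixels):
    """Collapse a grayscale image (2D list of 0-255 ints) into a bit tuple:
    each bit records whether a pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count the bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

class TriggerDatabase:
    """Maps stored image hashes to overlay content (e.g. a video to play)."""
    def __init__(self):
        self.entries = []  # list of (hash, content) pairs

    def add(self, pixels, content):
        self.entries.append((average_hash(pixels), content))

    def recognize(self, pixels, max_distance=3):
        """Return the content whose stored hash is nearest the query image,
        or None if nothing is close enough to count as a match."""
        h = average_hash(pixels)
        best = min(self.entries, key=lambda e: hamming(e[0], h), default=None)
        if best is not None and hamming(best[0], h) <= max_distance:
            return best[1]
        return None

db = TriggerDatabase()
poster = [[200, 200, 10], [10, 200, 10], [200, 10, 200]]
db.add(poster, "play: product-demo.mp4")

# A slightly noisy camera view of the same poster still matches,
# because the distance threshold tolerates small pixel differences.
seen = [[190, 205, 20], [15, 195, 5], [210, 20, 195]]
print(db.recognize(seen))  # → play: product-demo.mp4
```

A real system would use robust visual features rather than a toy hash, but the shape of the pipeline is the same: match what the camera sees against a large trigger database, then launch the associated content.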
The demo at the top of this post shows how it could be used to help train employees and customers, and to support a wide variety of other applications.
We’re all familiar with using 3D techniques and image replacement to enable a virtual reality experience.
Recently there has been a focus on haptics, or touch, to enhance virtual-reality (VR) experiences. But the sense of smell is rarely a factor. I came across this story of a group of researchers at the University of Tokyo who are working to integrate the sense of smell to change an individual’s perception of taste.
They were able to trick people who were eating a plain cookie into thinking the cookie was whatever flavor they selected. The group is making use of the fact that taste is affected by what we see, hear, and smell, as well as the texture of the food, among other things.
"We are using the influences of these sensory modalities to create a pseudo-gustatory display," says Takuji Narumi, an assistant professor at the University of Tokyo. "The aim is to have subjects experience different tastes through augmented reality by only changing the visual and olfactory stimuli they receive."
I can’t think of any business applications right now, but it does make me wonder about other uses.
This Technology Review blog entry describes an "enhanced vision system" from General Motors that can highlight landmarks, obstacles, and road edges on the windshield in real time. The video in the entry also talks about integrating with GPS systems to clearly mark desired locations. By using a variety of sensors, various hazards and points of interest can also be shown.
The merger between reality and computer displays is becoming more prevalent and transparent. There were a few good entries on this topic in 2009, and I expect it to increase radically in the mobile computing space. This is one significant way of overcoming information overload issues with the massive amount of data being collected.
"To turn the entire windshield into a transparent display, GM uses a special type of glass coated with red-emitting and blue-emitting phosphors--a clear synthetic material that glows when it is excited by ultraviolet light."
Looks like this particular technology is a way off, since the article says it will not be part of a production car until 2018 at the earliest.
One of the other items described in the video was the use of eye-tracking to achieve an effective virtual interface. The system gets more information than just how to align the graphics: eye-tracking adds a new dimension to the interaction with the computer, allowing it to fade into the environment when done correctly. Eye-tracking can also show how effective the virtual display is at attracting attention and can aid with attention management for the driver.
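The attention-management idea above can be sketched simply: did the driver's gaze reach the highlighted hazard in time, and if not, should the system escalate? The gaze-sample format, region shape, deadline, and response strings below are illustrative assumptions, not GM's actual design.

```python
# Minimal sketch of eye-tracking-based attention management for a
# windshield display. All data formats and thresholds are assumptions
# made for illustration.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # gaze position on the windshield, normalized 0-1
    y: float
    t: float  # seconds since the hazard was highlighted

def gaze_reached(samples, region, deadline=1.5):
    """Return True if any gaze sample lands inside the hazard region
    (x0, y0, x1, y1) before the deadline elapses."""
    x0, y0, x1, y1 = region
    return any(x0 <= s.x <= x1 and y0 <= s.y <= y1 and s.t <= deadline
               for s in samples)

def manage_attention(samples, region):
    """Fade the highlight once it has done its job; escalate if the
    visual cue failed to attract the driver's eye."""
    if gaze_reached(samples, region):
        return "fade highlight"
    return "escalate: audible warning"

hazard = (0.6, 0.4, 0.8, 0.6)  # e.g. a pedestrian highlighted on the right
gaze = [GazeSample(0.2, 0.5, 0.3), GazeSample(0.7, 0.5, 0.9)]
print(manage_attention(gaze, hazard))  # → fade highlight
```

The interesting design point is the feedback loop: the same sensor that aligns the graphics also measures whether each highlight worked, so the display can learn to stay out of the way.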