The Next Big Thing
Posts about next generation technologies and their effect on business.

New vision for computing

IEEE Spectrum had an article on moving display technology closer to the eye. Whether it is virtual reality goggles or contact-lens-enabled displays, a great deal of effort is being applied to moving displays closer than ever. The demonstration of a combined contact/glasses-based display approach shows the level of innovation underway – not that I think that particular approach will be viable in the marketplace.

 

If you combine that with speech or gesture recognition, it leads to a technological approach that could be safer and more ubiquitous than what’s been done before. Naturally, there are some people who think that these displays are risky in certain circumstances.

 

Even as access to networking and computing permeates more of our business and personal lives, the display has been one dimension holding back applications in many domains. I can easily see a mechanic, or others whose hands are typically busy doing work, using techniques like this to reference manuals… and facilitate decisions. If these techniques can be applied in a transparent and effective way, they could lead to the one display that is used by all the devices around us.

 

It makes me ask how applications would change if this were available. What new business solutions would become possible?

Full body Virtual Reality - in your future?

I recently saw a story about the Omni, a new virtual reality gaming device that launched a funding campaign on Kickstarter.

It is a platform with a low-friction, grooved base that allows users to walk or run in place. That movement is translated directly into input for any keyboard-compatible game, allowing for an even more natural interface. It can be used with head-mounted displays like the Oculus Rift and motion sensors like the Microsoft Kinect to provide a very high level of realism in virtual reality.
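
To make the idea of translating footsteps into game input concrete, here is a minimal, hypothetical Python sketch. It assumes a stream of normalized foot-pressure samples from the platform (simulated below) and turns them into "walk forward" key events; the sensor values, thresholds, and event names are invented for illustration, and the Omni's actual drivers almost certainly work differently.

# Hypothetical sketch: turning in-place steps into "walk forward" key events.
# A real driver would read hardware input and inject OS-level keyboard events
# instead of printing them.

STEP_THRESHOLD = 0.6   # normalized foot pressure that counts as a step
RELEASE_TIMEOUT = 3    # quiet samples before we "release" the key

def pressure_samples():
    """Simulated foot-pressure readings (0.0 = foot lifted, 1.0 = planted)."""
    return [0.1, 0.7, 0.2, 0.8, 0.1, 0.9, 0.1, 0.1, 0.1, 0.1]

def translate_to_key_events(samples):
    walking = False
    quiet = 0
    for p in samples:
        if p >= STEP_THRESHOLD:
            quiet = 0
            if not walking:
                walking = True
                yield "KEY_DOWN w"   # start walking forward in the game
        else:
            quiet += 1
            if walking and quiet >= RELEASE_TIMEOUT:
                walking = False
                yield "KEY_UP w"     # player stopped stepping

if __name__ == "__main__":
    for event in translate_to_key_events(pressure_samples()):
        print(event)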

 

There are also health benefits, since you’re not just sitting playing – you actually need to move large muscles to play the game.

 

As I looked at the capabilities, I couldn’t help but wonder about its application in the business environment. Maybe not for the knowledge worker (although thinking about that may be innovative), but for training and orientation. Let’s say you are a telecom worker who goes into the field and makes adjustments at a communications center – it may help to know what it will look like when you get there.

 

Of course, why not put a robot in the corner of the building that can make the adjustments… then you could interact with the environment without actually having to travel there. Moving bits, not atoms.

 

A view from within a game:

 
Adding more senses to the virtual reality experience

We’re all familiar with using 3D techniques and image replacement to enable a virtual reality experience.

 

Recently there has been a focus on haptics, or touch, to enhance virtual-reality (VR) experiences, but the sense of smell is rarely a factor. I came across a story about a group of researchers at the University of Tokyo who are working to integrate the sense of smell to change an individual’s perception of taste.

 

They were able to trick people who were eating a plain cookie into thinking the cookie was whatever flavor they selected. The group is making use of the fact that taste is affected by what we see, hear, and smell, as well as the texture of the food, among other things.

 

 

"We are using the influences of these sensory modalities to create a pseudo-gustatory display," says Takuji Narumi, an assistant professor at the University of Tokyo. "The aim is to have subjects experience different tastes through augmented reality by only changing the visual and olfactory stimuli they receive."

 

I can’t think of any business applications right now, but it does make me wonder about other uses.

 

Technology Trends in Retail

This past week I was catching up on my reading and found two articles about how computing will change the face of retail: the first in IEEE Computer, and the second in Bloomberg BusinessWeek, titled "Virtual Shopping in 3D."

 

Both articles show how consumerization is entering the retail space in a big way, with the Microsoft Kinect spurring the imagination far outside the gaming community. There is a great example from FaceCake, which virtualizes the “dressing room” experience in a way that speeds up the shopping process. What I found interesting is that most of the technology shown in the video is actually more likely to be found at home than at a retailer – possibly changing the definition of what shopping means.
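
To give a sense of how a Kinect-style virtual dressing room can work, here is a deliberately simplified, hypothetical Python sketch: given the shopper's tracked shoulder joints, it scales and positions a 2D garment image so it appears worn. The joint coordinates and garment dimensions are made up, and FaceCake's actual product is certainly far more sophisticated (3D cloth, depth data, body segmentation, and so on).

# Simplified sketch of a virtual "dressing room" overlay: anchor a 2D garment
# sprite to tracked shoulder joints. The coordinates are hard-coded stand-ins
# for what a depth sensor such as the Kinect would report each frame.

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def garment_placement(left_shoulder, right_shoulder, garment_w_px, garment_h_px):
    """Return (top_left, scale) for drawing the garment image over the body."""
    shoulder_span = abs(right_shoulder.x - left_shoulder.x)
    # Scale the sprite so its shoulder line matches the tracked span, padded a
    # little so the garment overlaps the arms.
    scale = (shoulder_span * 1.2) / garment_w_px
    center_x = (left_shoulder.x + right_shoulder.x) / 2
    top_y = min(left_shoulder.y, right_shoulder.y) - 0.1 * garment_h_px * scale
    top_left = Point(center_x - (garment_w_px * scale) / 2, top_y)
    return top_left, scale

if __name__ == "__main__":
    left, right = Point(220, 180), Point(420, 185)   # example joints, in pixels
    top_left, scale = garment_placement(left, right, garment_w_px=300, garment_h_px=400)
    print(f"draw garment at ({top_left.x:.0f}, {top_left.y:.0f}) with scale {scale:.2f}")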

 

In a similar vein, there was a demonstration by Tissot at Harrods of using virtual reality techniques to supplement even the window-shopping experience.

 

One of the technologies I was able to experience firsthand last week when I was at HP Labs was a wall-size display – the one I saw was probably 15 feet long, but there are installations much larger. It was running at multiples of high-definition resolution. This comes from the team researching the mobile and immersive experience of the future.

 

This technology was applied at CES earlier this year to create a full-size 3D display of Earth, Wind & Fire. I had a chance to see that video in the lab, and it was strange to walk right up and almost step into a life-size 3D display.

 

It is clear that 3D sensing and display technologies can change retail going forward.

Reach out and touch, well, nothing…

One of the problems with virtual reality is that it is so, well, virtual...but maybe not for long.  Researchers at the Computer Vision Lab at ETH Zurich have developed a method to produce virtual copies of real objects that can be touched and sent via the Internet. This article talks about the efforts to create virtual reality you can touch.

 

To accomplish this, they use a 3D scanner to record the image and dimensions of the object. Next, a probe with force, acceleration, and slip sensors collects information about the object’s shape and solidity, and a model is created on the computer. The model can then be displayed remotely, allowing a user to feel the object with a haptic (touch) device while viewing it with 3D glasses.
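
As a rough sketch of what such a "touchable" object model might contain, and how a haptic device could render it, here is a hypothetical Python example. It stores stiffness and friction alongside the scanned geometry, serializes the model to JSON so it can be sent over the network, and computes feedback force with a simple spring (penalty) model, a common textbook approach to haptic rendering and not necessarily what the ETH Zurich group actually uses.

# Hypothetical "touchable object" model: scanned geometry plus material
# properties, serializable for sending over the Internet, rendered on a haptic
# device with a simple spring (penalty) force model. Field names are illustrative.

import json
from dataclasses import dataclass, asdict

@dataclass
class TouchableObject:
    name: str
    vertices: list       # scanned surface points, in metres
    stiffness: float     # N/m, how "hard" the surface feels
    friction: float      # coefficient for lateral drag

    def to_json(self) -> str:
        """Serialize so the object can be shared over the network."""
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, payload: str) -> "TouchableObject":
        return cls(**json.loads(payload))

def feedback_force(obj: TouchableObject, penetration_depth_m: float) -> float:
    """Penalty-based haptic rendering: push back in proportion to how far the
    haptic cursor has pressed into the virtual surface (F = k * d)."""
    if penetration_depth_m <= 0:
        return 0.0       # not in contact: no force
    return obj.stiffness * penetration_depth_m

if __name__ == "__main__":
    mug = TouchableObject("scanned_mug", vertices=[[0, 0, 0], [0.05, 0, 0]],
                          stiffness=800.0, friction=0.4)
    received = TouchableObject.from_json(mug.to_json())   # "sent" and "received"
    print(f"force at 2 mm penetration: {feedback_force(received, 0.002):.2f} N")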

 

Not sure what the business implications will be, but it does make for some interesting remote collaboration possibilities. Taking these approaches from a haptic pen to gloves or other more intuitive devices would definitely make the experience more user-friendly.

 

Here is also an article from the BBC on using virtual reality to tackle tough questions.
