We’re all familiar with using 3D techniques and image replacement to enable a virtual reality experience.
Recently there has been a focus on haptics, or touch, to enhance virtual-reality (VR) experiences. But the sense of smell is rarely a factor. I came across this story of a group of researchers at the University of Tokyo working to integrate the sense of smell to change an individual’s perception of taste.
They were able to trick people who were eating a plain cookie into thinking the cookie was whatever flavor they selected. The group is making use of the fact that taste is affected by what we see, hear, and smell, as well as the texture of the food, among other things.
"We are using the influences of these sensory modalities to create a pseudo-gustatory display," says Takuji Narumi, an assistant professor at the University of Tokyo. "The aim is to have subjects experience different tastes through augmented reality by only changing the visual and olfactory stimuli they receive."
I can’t think of any business applications right now, but it does make me wonder about other uses.
This past week I was catching up on my reading and found two articles about how computing will change the face of retail: the first in IEEE Computer, and the second in Bloomberg BusinessWeek, titled “Virtual Shopping in 3D.”
Both articles show how consumerization is entering the retail space in a big way, with the Microsoft Kinect spurring the imagination far outside the gaming community. There is a great example from FaceCake, which virtualizes the “dressing room” experience in a way that speeds up the shopping process. What I found interesting is that most of the technology shown in the video is actually more likely to be found at home than at a retailer, possibly changing the definition of what shopping means.
In a similar vein, there was a demonstration by Tissot at Harrods that used virtual reality techniques to enhance even the window-shopping experience.
One of the technologies I was able to experience last week at HP Labs was a wall-size display. The one I saw was probably 15 feet long, but there are installations much larger, and it was running at multiples of high-definition resolution. This comes from the team researching the mobile and immersive experience of the future.
This technology was applied at CES earlier this year to present a full-size 3D display of Earth, Wind & Fire. I had a chance to see that video in the lab, and it was strange to walk right up and almost step into a life-size 3D display.
It is clear that 3D sensing and display technologies can change retail going forward.
One of the problems with virtual reality is that it is so, well, virtual...but maybe not for long. Researchers at the Computer Vision Lab at ETH Zurich have developed a method to produce virtual copies of real objects that can be touched and sent via the Internet. This article talks about the efforts to create virtual reality you can touch.
To accomplish this, they’ve used a 3D scanner to record the image and dimensions of the object. Next, a probe with force, acceleration, and slip sensors collects information about the object’s shape and solidity, and a model is created on the computer. The model can then be displayed remotely, allowing a user to feel the object with a haptic (touch) device while viewing it through 3D glasses.
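To give a feel for how the haptic side of such a pipeline might work, here is a minimal sketch of one common baseline technique: when the user’s virtual probe penetrates the scanned surface, the device pushes back with a force proportional to the penetration depth and the measured stiffness (a Hooke’s-law spring model). The data fields and function names here are hypothetical illustrations, not the researchers’ actual code.

```python
from dataclasses import dataclass

@dataclass
class SurfacePoint:
    # One sample from the scan/probe pass (all names hypothetical)
    depth: float      # surface height at this sample, in mm
    stiffness: float  # N/mm, estimated from the force sensor
    friction: float   # unitless slip coefficient from the slip sensor

def feedback_force(point: SurfacePoint, probe_depth: float) -> float:
    """Force (in newtons) pushed back at the haptic device.

    Simple Hooke's-law spring model: force is proportional to how
    far the virtual probe has penetrated the scanned surface.
    """
    penetration = probe_depth - point.depth
    if penetration <= 0:
        return 0.0  # probe is in free space: no resistance
    return point.stiffness * penetration

# Example: surface at 10 mm with stiffness 2.5 N/mm, probe at 12 mm
p = SurfacePoint(depth=10.0, stiffness=2.5, friction=0.3)
print(feedback_force(p, 12.0))  # 2.5 N/mm * 2 mm = 5.0 N
```

A real haptic renderer would run a loop like this at around 1 kHz and add friction from the slip data, but the core idea of turning measured solidity into resistive force is the same.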
Not sure what the business implications will be, but it does make for some interesting remote collaboration possibilities. Moving from a haptic pen to gloves or other more intuitive interfaces would definitely make the experience more user friendly.
There is also an article from the BBC on using virtual reality to tackle tough questions.