IEEE had an article, "D-Wave's Quantum Computing Claim Gets Boost in Testing," that looked into D-Wave's claims of having a quantum computer that companies can buy. Organizations like Google are buying, and NASA has committed to some testing on the Google system, so there is clearly some momentum behind what D-Wave sells. The tests show that a different kind of computing is involved, one that is useful for certain kinds of optimization and security-related problems.
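For context, D-Wave's machines are quantum annealers: they search for low-energy assignments of binary variables in a quadratic objective (a "QUBO" problem). The toy Python sketch below brute-forces that same kind of objective classically, just to show the shape of the problems involved; the matrix values are invented for illustration, and nothing here reflects D-Wave's actual interface.

```python
# A minimal classical sketch (not D-Wave's API) of the problem class a
# quantum annealer targets: minimize x^T Q x over binary variables x.
from itertools import product

def solve_qubo_brute_force(Q):
    """Exhaustively minimize sum_{i,j} Q[i][j]*x[i]*x[j] for x in {0,1}^n.

    Brute force is only practical for tiny n, which is exactly why
    annealing hardware is interesting for larger problems.
    """
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Illustrative 3-variable problem: the diagonal rewards setting a bit,
# the off-diagonal terms penalize setting certain pairs together.
Q = [[-1, 2, 0],
     [0, -1, 2],
     [0, 0, -1]]
print(solve_qubo_brute_force(Q))  # -> ((1, 0, 1), -2)
```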
Although quantum computers are unlikely to hit the mainstream business computing market in the foreseeable future, there are some industries (like logistics and energy management) where they could prove useful sooner than others. This technology is something organizations should be aware of, even if it will not be useful in their area for years, since the approach is so radically different.
Rice University professor Moshe Vardi is quoted as saying: “I do not expect this to happen in the very near future, but I do believe that by 2045, machines will be able to do if not any work that humans can do, then a very significant fraction of the work that humans can do.”
This podcast is worth listening to, if only to think about the role of IT in this sort of work-function shift. Today, many IT organizations are focused on "back office" IT. In this new world, the separation between the back office and the customer interface will be much more permeable. When the IT organization is involved in strategic planning, it can take the lessons learned from automation in cloud computing and push the rest of the organization to embrace the effects on business consistency and quality, as well as on costs, consumption, and the workforce.
The market is "blindly developing the technologies," and IT leadership should have a vision for their organizations about the implications.
I sat in on an IEEE Computing meeting in Dallas last week titled "What's Next for VoIP?" As I sat through the session, I began to wonder about its limited scope and its relevance going forward.
Voice is nice. I've mentioned before that Unified Communications is important, but with all the effort focused on video (e.g., conferencing with Skype and efforts like Google Glass), as well as other techniques that provide a wider range of interaction (e.g., Making a Huggable Internet), where will our bandwidth consumption be in a few years? Haptics and other dimensions of sensory interaction, like digital taste and scent, are advancing quickly. I wonder if these techniques will ever significantly affect business computing.
Since the more senses we use to make decisions, the better and faster those decisions may be, it does make me wonder if we'll soon put a hand on a device (or even our computer) to feel the vibration of a machine in the field. That is essentially what a microphone and a speaker accomplish; they are just aimed at transferring the vibration to our ears. Similar techniques could apply to using our nose to smell the way a chemical process is cooking. Our bodies are made to do that kind of thing, so why limit ourselves to bar charts and other gauges for our eyes to consume? Of course, we'll need the bandwidth to carry all that information.
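To put a rough number on that last point, here is a back-of-the-envelope sketch of what streaming "touch" might cost in bandwidth. Every sample rate and bit depth below is an illustrative assumption, not a measured requirement.

```python
# Back-of-the-envelope sketch: rough bit rates for streaming vibration
# the way we stream sound. All figures below are assumptions.

def stream_kbps(sample_rate_hz, bits_per_sample, channels=1):
    """Raw (uncompressed) bit rate of a sampled sensor stream in kbit/s."""
    return sample_rate_hz * bits_per_sample * channels / 1000

# Audible vibration, CD-style: 44.1 kHz, 16-bit, mono.
print(stream_kbps(44_100, 16))              # 705.6 kbit/s
# Tactile vibration: skin sensitivity falls off around 1 kHz, so 2 kHz
# sampling at 12 bits might suffice for a single contact point.
print(stream_kbps(2_000, 12))               # 24.0 kbit/s
# Even a glove with 20 contact points is modest next to video.
print(stream_kbps(2_000, 12, channels=20))  # 480.0 kbit/s
```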
Not that long ago, we thought the entire Internet's bandwidth would be taken up by spam. In November 2012, one third of North American Internet bandwidth consumption was taken up by just one application: Netflix. It is strange how quickly our concerns can change.
A while back, I posted a likely capabilities comparison of a static memory solution based on memristor technology to the current solutions based on flash. Flash has always had an inherent reliability problem: you can only write to a cell a relatively small number of times before it stops working properly. A great deal of work in the flash space has gone into hiding that problem from users of the technology.
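The usual way controllers hide that limit is wear leveling: spreading writes across physical blocks so no single block exhausts its erase budget early. Here is a toy sketch of the idea; the block counts and bookkeeping are simplified for illustration, and real controllers layer garbage collection, bad-block management, and ECC on top.

```python
# Toy wear-leveling sketch: map logical blocks onto whichever physical
# block has seen the fewest erase cycles, so wear stays roughly uniform.

class WearLeveler:
    def __init__(self, physical_blocks):
        self.erase_counts = [0] * physical_blocks  # erases per physical block
        self.mapping = {}  # logical block -> physical block

    def write(self, logical_block, data):
        # Retire the old copy, then place the new data on the least-worn
        # physical block that is not holding some other logical block.
        in_use = set(self.mapping.values()) - {self.mapping.get(logical_block)}
        candidates = [p for p in range(len(self.erase_counts)) if p not in in_use]
        target = min(candidates, key=lambda p: self.erase_counts[p])
        self.erase_counts[target] += 1  # each rewrite costs an erase cycle
        self.mapping[logical_block] = target
        # (actually storing `data` is omitted in this sketch)

wl = WearLeveler(physical_blocks=8)
for _ in range(100):
    wl.write(0, b"hot data")  # one "hot" logical block gets all the writes...
print(wl.erase_counts)        # ...but the wear is spread across all 8 blocks
```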
It looks like a Taiwan-based company, Macronix, may have found a workaround that reduces the flash memory fade-out problem. They put a layer of tiny "heaters" in the chip, move the data out of the way every once in a while, and cook that portion of the chip back to its native state. They claim this allows for 100 million write cycles.
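In controller terms, the scheme as described might look something like the sketch below. The threshold, the Region layout, and the heater.pulse() call are all hypothetical placeholders for whatever Macronix actually does on-chip.

```python
from dataclasses import dataclass

@dataclass
class Region:
    index: int
    data: bytes = b""
    write_count: int = 0
    is_free: bool = True

class Heater:
    def pulse(self, index):
        # Hypothetical stand-in for firing the on-chip heating element
        # under region `index` to anneal the stress-induced damage.
        print(f"annealing region {index}")

ANNEAL_THRESHOLD = 10_000  # assumed write budget before a healing pass

def maybe_heal(region, regions, heater):
    """Migrate data off a worn region, anneal it, and reset its wear count."""
    if region.write_count < ANNEAL_THRESHOLD:
        return
    spare = next(r for r in regions if r is not region and r.is_free)
    spare.data, spare.is_free = region.data, False  # move data out of the way
    region.data, region.is_free = b"", True
    heater.pulse(region.index)  # cook the region back toward its native state
    region.write_count = 0      # endurance budget restored

regions = [Region(i) for i in range(4)]
regions[0].data, regions[0].is_free = b"payload", False
regions[0].write_count = 10_000
maybe_heal(regions[0], regions, Heater())  # prints "annealing region 0"
```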
The team will present their findings next week at the IEEE International Electron Devices Meeting (IEDM) in San Francisco. The technique doesn't get around flash's relatively slow response and large size compared to other, more advanced static memory approaches that are on the way, but it should allow the technology to remain competitive longer (depending on how long it takes to move into production).
Their presentation is titled "Radically Extending the Cycling Endurance of Flash Memory (to > 100M Cycles) by Using Built-in Thermal Annealing to Self-heal the Stress-Induced Damage." The authors are H.-T. Lue, P.-Y. Du, C.-P. Chen, W.-C. Chen, C.-C. Hsieh, Y.-H. Hsiao, Y.-H. Shih, and C.-Y. Lu.
Using a 3D simulator to visualize a computing environment is a long way from traditional IT security, but the drama of video gaming actually enables analysts to watch over their networks more effectively. A New Scientist article titled "The real Tron: IT security as a shoot 'em up" describes research under development at Lincoln Laboratory, part of the Massachusetts Institute of Technology. In the simulation, network administrators can patrol their environments as if they were playing a first-person shooter, much like in the cult film Tron.
This was presented at the IEEE High Performance Extreme Computing Conference in September 2012. Both the paper and the presentation are available for download.
Although not really gamification in the purest sense of the term, it is definitely an example of serious gaming moving into the IT field.