In the past when I’ve talked about the abundance of capabilities in IT, I’ve been talking about traditional IT techniques. There are many disruptive influences out there that may cause greater shifts than we can imagine. Recently Greg Snider of HP Labs published an article on Cog ex Machina, an approach to computing – an architecture built on the assumption of abundant computing resources. It seems to me that this could have wide implications for enterprise computing.
The example in the article uses graphics processing units (GPUs) rather than CPUs to perform the computation. GPUs are ubiquitous and cheap, providing more than 1,000 cores on a chip at very low cost. Put them together into a processing array and it becomes feasible to apply millions of cores to business problems like big data.
“Cog is aimed especially at “cognitive applications,” applications which must autonomously and adaptively interact with a changing and uncertain world. The programming paradigm contains only two abstractions: dynamic fields, which represent state information as multi-dimensional arrays; and operators, which combine field states to produce new states for dynamic fields. The hardware platform is abstracted away so that programmers do not see—nor need to worry about—cores, threads, locks, communication or synchronization.”
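To make the two abstractions in that quote concrete, here is a minimal sketch of what a field/operator style of programming might look like. This is not Cog’s actual API – the names `DynamicField` and `operator` are hypothetical – but it illustrates the idea: state lives in multi-dimensional arrays, and operators combine whole field states into new ones, with no threads, locks, or cores visible to the programmer.

```python
import numpy as np

# Hypothetical sketch of the field/operator paradigm (not Cog's real API).
# A "dynamic field" holds state as a multi-dimensional array.
class DynamicField:
    def __init__(self, state):
        self.state = np.asarray(state, dtype=float)

# An "operator" combines field states to produce a new field state.
# A runtime could map this whole-array operation onto thousands of GPU
# cores; the programmer only sees fields and operators.
def operator(f, g, combine):
    return DynamicField(combine(f.state, g.state))

# Example: combine two 2-D fields with a simple averaging operator.
a = DynamicField(np.ones((4, 4)))          # field of 1.0s
b = DynamicField(np.full((4, 4), 2.0))     # field of 2.0s
c = operator(a, b, lambda x, y: 0.5 * (x + y))
print(c.state[0, 0])  # 1.5
```

The key point of the paradigm is that the unit of work is the whole field, not an individual value or thread, which is what lets the hardware mapping be abstracted away.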
It is an architectural approach that moves the basic unit of computation from the bit to the field. This should be an interesting area to watch going forward. As we think about the cognitive computing requirements of the enterprise of the future, disruptive technologies like this will be required.