Lately, I’ve been having a number of conversations with people about the strategic use of technology. I mentioned the criteria I use to evaluate trends and technologies, and we typically end up discussing the difference in impact between the technologies that are much discussed today, and how their tactical use differs from their strategic use.
- Analytics – Although you may need to gather more data and keep it longer, there is not enough human attention available to sustain the effort unless you simplify, automate, and focus that attention only on what truly needs human involvement. Time to action and decisions has to be the measure of impact.
- Cloud – Although it may reduce costs in certain circumstances, the strategic impact of cloud techniques (whether applied to infrastructure, processes, or people) is to increase flexibility. If your use of cloud techniques ends up reducing flexibility, it cannot be sustained.
- Mobility – The mobility strategy for a business has to focus on improving access to corporate information and reducing the latency of the decision-making process. If the focus remains on the devices themselves, it will also fail.
These current technology directions (and others) have a strategic side and a tactical manifestation – make sure you know which is important to your business over the long haul when creating your plan of attack. If you want to reach the top, you still go up one step at a time, but it is easy to lose sight of the goal along the way. Identify the metrics to measure progress, then measure the impact as you go and make adjustments.
When I was writing this post, I felt it was a bit risky, since these technologies are viewed as so important today. The real point of the post is to view them strategically, and not just as buzzwords or fads. A purely tactical approach may be the reason that, for some organizations, innovation is not working out.
Flash memory was once viewed as a special tool to improve performance or allow for easy transportation of information (e.g., the thumb drive – I can’t recall the last time I gave someone a CD, let alone a floppy disk). Now flash memory devices are a standard component of any storage performance strategy.
When the Solid State Drive (SSD) came on the scene, it was used as a plug-compatible replacement for spinning-media hard drives, providing better performance, but the characteristics of an SSD are actually quite different. The storage industry has only now started to design storage systems that take advantage of the differences in flash memory.
The Flash Translation Layer (FTL) translates typical hard drive block-device commands and structures into comparable operations in flash memory. The FTL is really a compromise for compatibility, since flash has no need for the block-and-sector structure. Additionally, SSD controllers must perform a number of extra functions – garbage collection, managing write amplification, wear leveling, and error correction – since the writable life span of each flash storage cell is limited (although there is discussion of a cure for this long-standing flash illness). We’re going to see more applications that skip the FTL and take direct advantage of flash’s memory-like access capabilities.
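To make the FTL’s job concrete, here is a toy sketch in Python of the core bookkeeping: out-of-place writes, a logical-to-physical map, garbage collection, and erase-count-based wear leveling. Every name and size here is illustrative (a teaching model, not a real controller or driver).

```python
class ToyFTL:
    """Toy Flash Translation Layer: maps logical blocks to physical pages.

    Flash pages cannot be overwritten in place, so every write goes to a
    fresh page and the previous page is marked stale, to be reclaimed
    later by garbage collection. Erase counts drive wear leveling.
    """

    def __init__(self, num_pages):
        self.mapping = {}                    # logical block -> physical page
        self.pages = {}                      # physical page -> stored data
        self.free = list(range(num_pages))   # erased, writable pages
        self.stale = set()                   # pages holding superseded data
        self.erase_count = [0] * num_pages   # wear tracking per page

    def write(self, logical_block, data):
        if not self.free:
            self.garbage_collect()
        # Wear leveling: pick the least-erased free page.
        page = min(self.free, key=lambda p: self.erase_count[p])
        self.free.remove(page)
        old = self.mapping.get(logical_block)
        if old is not None:
            self.stale.add(old)              # out-of-place update
        self.mapping[logical_block] = page
        self.pages[page] = data

    def garbage_collect(self):
        # Reclaim stale pages; each reclaim costs one erase cycle.
        for page in self.stale:
            self.erase_count[page] += 1
            self.free.append(page)
        self.stale.clear()


ftl = ToyFTL(num_pages=4)
ftl.write(0, b"a")
ftl.write(0, b"b")   # rewrite: lands on a new physical page, old one is stale
```

Note that rewriting the same logical block consumed a second physical page – this indirection is exactly the overhead that applications with direct flash access can avoid.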
High-performance software such as databases currently circumvents the operating system’s file system to attain optimal performance. Modern file systems such as the Write Anywhere File Layout (WAFL), ZFS (which once stood for the Zettabyte File System), and the B-tree file system (Btrfs) are designed to take advantage of the capabilities of the various storage media. The resulting systems are more efficient and easier to manage.
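A common thread in WAFL, ZFS, and Btrfs is copy-on-write: live blocks are never overwritten in place, which happens to match flash’s out-of-place write behavior and makes snapshots nearly free. The following is a minimal sketch of that idea, assuming a toy in-memory store (the names and structure are mine, not any of those file systems’ actual on-disk formats):

```python
class CowStore:
    """Toy copy-on-write store in the spirit of WAFL/ZFS/Btrfs."""

    def __init__(self):
        self.blocks = []   # append-only data area: blocks are never rewritten
        self.root = {}     # current "tree": file name -> block index

    def write(self, name, data):
        self.blocks.append(data)         # write to a new block, never in place
        new_root = dict(self.root)       # shadow the metadata
        new_root[name] = len(self.blocks) - 1
        self.root = new_root             # single atomic pointer swap

    def snapshot(self):
        # A snapshot is just a cheap copy of the root table; the data
        # blocks it points to are immutable, so nothing else is copied.
        return dict(self.root)

    def read(self, name, root=None):
        root = self.root if root is None else root
        return self.blocks[root[name]]


store = CowStore()
store.write("f", b"v1")
snap = store.snapshot()
store.write("f", b"v2")
# The snapshot still sees the old data; the live root sees the new.
```

The design choice to show here: because old blocks are immutable, consistency comes from the single root swap rather than from overwriting data carefully, which is why these file systems can take snapshots and survive crashes without a separate journal for data.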
Storage system performance was already a concern when operations were measured in milliseconds. It matters even more on flash devices, whose operations are measured in microseconds. Future technologies like the memristor, which will be faster still, demand an optimized approach to the long-term storage and access of information. Compromises for convenience will still exist, but the penalties in performance will be high, impacting the application portfolio of organizations.
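To make those orders of magnitude concrete, here is a back-of-the-envelope comparison. The latency figures are rough illustrative assumptions, not measurements of any particular product:

```python
# Rough, assumed access latencies (orders of magnitude only).
DISK_SEEK_S    = 5e-3    # spinning-disk seek: milliseconds
FLASH_READ_S   = 100e-6  # flash page read: microseconds
MEMORY_CLASS_S = 100e-9  # memristor-class storage: nanoseconds

# How many operations of the faster tier fit inside one of the slower tier.
flash_ops_per_seek = DISK_SEEK_S / FLASH_READ_S
memory_ops_per_flash_read = FLASH_READ_S / MEMORY_CLASS_S

print(f"~{flash_ops_per_seek:.0f} flash reads fit in one disk seek")
print(f"~{memory_ops_per_flash_read:.0f} memory-class reads fit in one flash read")
```

Each step down this ladder makes per-operation software overhead – an FTL, a deep I/O stack, a general-purpose file system – a proportionally larger share of the total cost, which is why the convenience compromises that were invisible on disk become expensive on flash and prohibitive beyond it.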
Are the measures we understand and use today up to the needs of tomorrow? IT within the organization has access to the tools and the connections to do more...
Rice University professor Moshe Vardi is quoted as saying: “I do not expect this to happen in the very near future, but I do believe that by 2045, machines will be able to do if not any work that humans can do, then a very significant fraction of the work that humans can do.”
This podcast is worth listening to, even if only to get you thinking about the role of IT in this sort of work-function shift. Today, many IT organizations are focused on “back office” IT. In this new world, the boundary between the back office and the customer interface will be much more permeable. When the IT organization is involved in strategic planning, it can take the lessons learned from automation in cloud computing… and push the rest of the organization to embrace the effects on business consistency and quality, as well as on costs, consumption, and the workforce.
The market is “blindly developing the technologies,” and IT leadership should have a vision for their organizations about the implications.