Lately, I’ve been in a number of conversations with people about the strategic use of technology. I mentioned the criteria I use to evaluate trends and technologies, and we typically end up discussing the difference in impact between some of the much-discussed technologies of today and how their tactical use differs from their strategic use.
- Analytics – Although you may need to gather more data and keep it longer, there is not enough attention to sustain the effort unless you simplify, automate, and focus human involvement only where it is actually needed. Time to action/decisions has to be the measure of impact.
- Cloud – Although it may reduce costs in certain circumstances, the strategic impact of cloud techniques (whether applied to infrastructure, processes or people) is to increase flexibility. If your use of cloud techniques does not end up increasing flexibility, the effort cannot be sustained.
- Mobility – The mobility strategy for a business has to focus on improving access to corporate information and reducing the latency in the decision-making process. If the focus remains on the devices, it will also fail.
These current technology directions (and others) have a strategic side and a tactical manifestation – make sure you know what is important to your business over the long haul when creating your plan of attack. If you want to reach the top, you still go up one step at a time, but it is easy to lose sight of the goal along the way. Identify the metrics that measure progress, then measure the impact along the way and make adjustments.
When I was writing this post, I felt it was a bit risky, since these technologies are viewed as so important today. The real point of the post is to view them strategically and not just as buzzwords or fads. A purely tactical approach may be the reason that, for some organizations, innovation is not working out.
After I got my Slate7 last week (which, by the way, I have been very happy with), I now see a whole new set of tablet-based platforms being discussed in the press: the Split x2 (for Windows 8) and the SlateBook x2, a serious tablet/laptop for Android.
It is clear there is a great deal of innovation and anticipation taking place in this space. When I think about how you use a tablet (e.g., less than an arm’s length away, but at a relatively fixed distance), it seems to be crying out for glasses-free 3D – if you could only spare the power.
Today, HP launched the industry’s most complete software-defined network fabric for cloud. This network fabric is built on the HP FlexNetwork architecture, enabling business agility for clients by delivering twice the scalability and 75 percent less complexity compared with current network fabrics, while reducing network provisioning time from months to minutes.
This is possible by:
- Improving IT productivity by unifying the virtual and physical fabric with the new HP FlexFabric Virtual Switch 5900v software, which, in conjunction with the HP FlexFabric 5900 physical switch, delivers advanced networking functionality such as policies and quality of service to a VMware environment. Integrated Virtual Ethernet Port Aggregator (VEPA) technology provides clear separation between server and network administration to deliver operational simplicity.
- Reducing the data center footprint with the HP Virtualized Services Router (VSR), which allows services to be delivered on a virtual machine (VM), eliminating unnecessary hardware by leveraging the industry's first carrier-class, software-based Network Function Virtualization (NFV).
As organizations move to software-defined networks, some fundamental changes in approach will be required, and these products are a start down that path. Here is a video with a bit more high-level discussion and some details:
This week I had the opportunity to attend one of Leon Kappelman’s classes at the University of North Texas to participate in interactions with students about their senior projects and presentations. The teams of students covered a number of topics like BYOD, cloud adoption, and biometric-based security – all topics where I felt fairly comfortable.
One presentation was focused on Data Management in the Age of Big Data, and the team had a firm grasp of one concept that many analysts miss:
The opportunity for better decision-making.
The team focused on five key issues – the lack of:
- Data Governance
- Data Quality standards and management
- Data Architecture and Security
- Operations support
- Business buy-in
We had quite a discussion about the business buy-in issue, since we wanted them to explain how an effort would get this far without buy-in. They explained that the issue orbited around business culture and the implications advanced analytic techniques would have on that culture.
I was happy to see that these students had internalized these concepts, and I hope the organizations they move into after graduation are ready for their perspective.
Flash memory was once viewed as a special tool to improve performance or to allow for easy transportation of information (e.g., the thumb drive – I can’t recall the last time I gave someone a CD, let alone a floppy disk). Now flash memory devices are a standard component of any storage performance strategy.
When the Solid State Drive (SSD) came on the scene, it was used as a drop-in replacement for spinning-media hard drives, providing better performance, but the characteristics of an SSD are actually quite different. The storage industry has only now started to design storage systems that take advantage of the differences in flash memory.
The Flash Translation Layer (FTL) translates the typical hard drive block-device commands and structure into comparable operations in flash memory. The FTL is really a compromise for compatibility, since there is no need for the block and sector structure in flash. Additionally, SSD controllers must perform a number of additional functions such as garbage collection, wear leveling, write-amplification management, and error correction, since the writable life span of each flash storage cell is limited (although there is discussion of a cure to this long-time flash illness). We’re going to see more applications that skip the FTL and take direct advantage of flash’s direct memory access capabilities.
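To make the compromise concrete, here is a toy sketch of the core bookkeeping an FTL performs. All names here are illustrative (this is not any real device's interface): flash pages cannot be overwritten in place, so each logical-block write is remapped to a fresh page, the old page becomes garbage, and erases are counted per page for wear leveling.

```python
class ToyFTL:
    """Illustrative logical-block-to-flash-page mapping; not a real FTL."""

    def __init__(self, num_pages):
        self.free_pages = list(range(num_pages))  # pages ready to be programmed
        self.mapping = {}                         # logical block -> physical page
        self.stale = set()                        # pages awaiting garbage collection
        self.erase_counts = [0] * num_pages       # per-page wear tracking

    def write(self, lba, data):
        """Write a logical block: remap to a fresh page, never overwrite in place."""
        page = self.free_pages.pop(0)
        old = self.mapping.get(lba)
        if old is not None:
            self.stale.add(old)       # the previous copy becomes garbage
        self.mapping[lba] = page
        # (a real FTL would program `data` into the physical page here)

    def garbage_collect(self):
        """Erase stale pages so they can be reused, counting wear on each."""
        for page in sorted(self.stale):
            self.erase_counts[page] += 1
            self.free_pages.append(page)
        self.stale.clear()

ftl = ToyFTL(num_pages=4)
ftl.write(0, b"v1")
ftl.write(0, b"v2")           # same logical block, new physical page
assert ftl.mapping[0] == 1    # remapped, not overwritten
assert 0 in ftl.stale         # the old page is now garbage
ftl.garbage_collect()
assert ftl.erase_counts[0] == 1
```

A spinning disk needs none of this machinery – a block simply gets rewritten in place – which is why carrying block-device semantics onto flash costs controller effort that direct flash access could avoid.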
High-performance software such as databases already circumvents the operating system file system to attain optimal performance. Modern file systems such as the Write Anywhere File Layout (WAFL), ZFS (which used to stand for the Zettabyte File System), and the B-tree file system (Btrfs) are designed to take advantage of the capabilities of the various storage media. The resulting systems are more efficient and easier to manage.
Storage system performance was a concern when operations were measured in milliseconds. It matters even more on flash devices, whose operations are measured in microseconds. Future technologies like the memristor, which will be faster still, demand an optimized approach to the long-term storage and access of information. Compromises made for convenience will persist, but the performance penalties will be high, impacting the application portfolios of organizations.