The Next Big Thing
Posts about next generation technologies and their effect on business.

Grading my predictions for 2013

At the end of every year that I’ve been making annual predictions, I grade the predictions I made the previous December

(2006, 2007, 2008, 2009, 2010, 2011, 2012). It's time to look at 2013, which has been the start of a turnaround for HP. We’re not out of it all yet, but we’re definitely making progress. In a way, the same could be said about the economy and the industry as a whole.

 

I said that 2013 would be a year of expectation -- changing the very foundation of how IT is judged. HP’s efforts around the new style of IT attest to that, and many of the trends I talked about in 2012 (and earlier) began to generate business value.

 

I’ll grade myself with the following scale again this year:

A: Big changes during the year that are having wide effect.

B: Notable progress through the year and isolated areas of significant impact.

C: Progress with some impact.

D: Little progress or impact, but work still taking place.

F: No progress, or the concept abandoned in any commercial sense.

 

Grade: A
Prediction: Organizations will have higher expectations of security based on what everyone has experienced and learned. The battle over Internet censorship and control will reach new heights in 2013.
Rationale: Thanks to the Snowden revelations, this one definitely came out big, although in a way none of us may have expected.

Grade: C
Prediction: Software-defined networks will make communications as virtualized and flexible as the computing infrastructure. This versatility will become an expectation.
Rationale: I facilitated a discussion on SDN back in September, and throughout our talk it was clear that progress has been made, but we’re still only scratching the surface.

Grade: A
Prediction: IT organizations will expand their definition of “customer” and their analytics to include suppliers, partners, consumers and anyone or anything that can make a difference.
Rationale: Although Big Data was not new in 2013, it definitely started to penetrate the thinking of even the slowest-adopting organizations. Progress is definitely being made, although I still wonder about the bias issue.

Grade: B
Prediction: We can expect to see bigger data and even bigger storage, with copious amounts of information coming from more sensors in more places. Organizations will no longer be satisfied with using only 3-5% of the data available. Beyond there being more data, the information collected will be of a wider variety (including video, sound…), so transforming the information from one format to another and back will be increasingly important.
Rationale: This is a case of definite progress, but I am not sure organizations are yet using double-digit percentages of the information available to them.

Grade: B
Prediction: The whole concept of in-memory computing will be up for a shift in expectations for where and how it is used.
Rationale: SAP HANA (probably the most notable of the large commercial applications in this space) is now being looked at seriously for a wide range of database applications. It is not yet widespread, but HP and SAP are definitely making inroads.

Grade: D
Prediction: Widespread acceptance of new and improved NFC capabilities for payment and identity. The Internet of Things (IoT) will become just the Internet. Individuals will be able to add IoT capabilities independent of the original manufacturer, if desired. Although enterprises may still be crawling their way to the IoT, consumers will embrace it in 2013.
Rationale: Although the Internet of Things is real, it has not made the progress I expected in 2013. The consumer space has not really moved much more quickly than the enterprise space. Sure, there are devices and applications, but are they really having the impact they should?

Grade: D
Prediction: The availability of different disruptive display technologies in 2013 will shift our thinking about where and when a display is needed (or even possible).
Rationale: Although there are some new interface approaches and techniques, displays have not really shifted significantly in 2013.

Grade: D
Prediction: One of the other core shifts in expectation will be around simplicity and the use of automation to focus attention and automate more business processes. The concept of human augmentation of automation will be significantly less foreign at the end of 2013 than it is today.
Rationale: This is another case where there has been some progress, but not nearly as much as I’d hoped. Human-augmented automation is about as foreign to strategic planning now as it was in 2012.

Grade: C
Prediction: Enterprises will begin to address the fact that most of the apps in production can’t really unleash the power of the cloud. 2013 should see new tools and techniques to address this potential.
Rationale: Application portfolio management is definitely part of a move to greater value in IT, but I’d say adoption is only slightly ahead of 2012.

Grade: C
Prediction: IT will begin to see ways to virtualize the mobile experience in new, secure and innovative ways.
Rationale: Once again there has been progress, but it has been primarily incremental. No radically new devices or approaches have come on the scene. Although HP has services that understand virtualization in the mobile space, they are just not yet in demand.

Grade: B
Prediction: The skills within the organization will be a constraint on value generation. Gamification, as an example, is a skill that will be recognized and move hand-in-hand with strategic change.
Rationale: I do believe that gamification, and organizations’ understanding of it, shifted significantly in 2013, but that might just be because I kept talking with people about it.

Grade: C
Prediction: Using the contextual information available from big data and the need for attention engineering, individuals and corporations will have greater expectations of how information is delivered to them.
Rationale: Although for most businesses the expectations on information delivery are changing, I don’t think there has been significant change from the approach used in 2012.

Grade: B
Prediction: There will also be a shift in how products are personalized as 3D printing moves out of limited use and becomes significantly more mainstream, with some parts of the world having 3D printing capabilities as a local service.
Rationale: 2013 was a good year for 3D printing. Most people have heard about it, even if they have not held something that has been through a 3D printing process. Commercial entities have begun to embrace the possibilities.

Grade: D
Prediction: Implementation of IPv6 is going to be a focus in 2013.
Rationale: Now there are those who are pushing back, saying they may never need to go to IPv6; the workarounds are good enough.

Grade: D
Prediction: Realization that automation is the new offshore, specifically in development.
Rationale: I don’t believe this moved much in 2013. Very few organizations use significant automation techniques in the development space.

 

Based on these scores, my predictions for 2013 were not too conservative. My personal goal is to get close to a C+. If I get too high a grade, I am not trying to stretch my thinking (or yours for that matter) enough.

 

My view is the same as when I finished up my post in 2011:

 

“Having said all that, it is a great time to be in IT. Most of our concerns are currently driven by an overabundance of capabilities that most organizations have not tapped into effectively. Those who can have the vision will be in for quite a ride this year as they look to do more with more.”

 

I should have my predictions for 2014 out by the middle of December.

Data, the lifeblood of the enterprise

Even though object-oriented techniques and analytics have been around since the last century, today they are being applied and thought about in whole new ways. Technologies are enabling objects to interact with monitoring, analytics, and control systems over a diverse range of networks and on a plethora of devices. Computers are embedded in devices and, by most people, rarely thought of as computers themselves.

 

This more connected and action-oriented approach will expand the reach and impact of information technology systems, affecting business value generation, application expectations, and use cases where IT hasn’t really been focused effectively before.

 

One of the exciting aspects of this intelligent edge approach to the business use of IT is that the software will enable greater control of the physical world, not just the digital one. This means less latency and more efficient use of resources (including human attention). For many, this started in the consumer space, and is only now being embraced within business.

 

The importance of this information and its integration into the business means that the focus on security will need to increase, protecting the data as well as the control data streams. This flow will become like the blood flow of the human body: if it is interrupted or somehow contaminated, bad things happen.

 

With gamification techniques, this information flow can be used to adjust human behaviors as well as machines. How organizations think about and deal with data is already changing.

 

Everyone needs to get comfortable with:

  1. The data sets we’re working with today will look trivial within the relatively near future. Storage technology will continue to get larger and cheaper.
  2. We’ll keep the data longer and continue to generate new value from the data in use today. Data is a corporate asset and we need to treat it as such.
  3. Data scientists will be in high-demand and business schools will branch into this area in a big way, if they haven’t already.
  4. The conflict between real-time access to information and the security implications will continue to be a concern.
  5. The use of cloud techniques will mean that organizations will need to start feeling comfortable with moving the computing to the data more often than the data to the computing. The pipes are big, but not that big.
  6. The diversity of devices used to access the information and the locations they are accessed from will continue to increase. BYOD is not about devices.
  7. Master data and metadata management are critical skills to get the most out of big data efforts. Even if they can’t be synchronized, they need to be understood.
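Point 5 above, moving the computing to the data, can be sketched in a few lines. This is a hypothetical illustration (the table, columns, and values are invented, with SQLite standing in for a remote data store): pushing the filter and aggregation into the query means only the answer crosses the pipe, rather than every raw row.

```python
import sqlite3

# Invented example: a large "events" table living where the data is stored.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, value REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("us", 1.0), ("eu", 2.0), ("us", 3.0)])

# Moving the data to the computing: pull every row across the "pipe",
# then filter and aggregate locally.
rows = conn.execute("SELECT region, value FROM events").fetchall()
local_total = sum(v for r, v in rows if r == "us")

# Moving the computing to the data: push the filter and aggregation
# into the query so only a single number crosses the pipe.
(remote_total,) = conn.execute(
    "SELECT SUM(value) FROM events WHERE region = 'us'").fetchone()

assert local_total == remote_total  # same answer, far less data moved
```

Both approaches agree on the answer; the second just ships one number over the network instead of the whole table, which matters when the pipes are big, but not that big.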

 

We have the computing and bandwidth capabilities; it is only our imagination about how to use them that limits us.

Keeping its fans first is important to NASCAR

As we move into the fall here in Dallas, one of the events that comes to my mind is NASCAR at the Texas Motor Speedway. Recently some of us at HP had the chance to engage in a bit of Q&A with Steve Phelps, Senior Vice President and Chief Marketing Officer of NASCAR, to find out how the new style of IT is making a difference.

 

Steve oversees all NASCAR efforts in corporate marketing, brand/consumer marketing, Integrated Marketing Communications (IMC), licensing, automotive group, business development, digital and social media, broadcast, entertainment, NASCAR Productions, information technology, corporate events and human resources. With over 75 million fans, that is a big job, and since NASCAR is so innovative about how it interacts with this audience, the conversation is worth sharing.

 

Q. With the influx of Big Data throughout many industries, analysts are predicting that CMOs will become the new CIOs of the future. What role does Information Management and Analytics play in your daily activity?

A. The amount of data generated across both traditional and social media surrounding our sport is staggering. Fans connect with our sport digitally and share their experience with us more than ever before. Due to a shifting media landscape, news coverage of our sport is constant, coming from hundreds of print outlets, television broadcasts, and online publications. Candidly, before partnering with HP to develop the Fan and Media Engagement Center (FMEC), there was no easy way for us to make sense of all the noise. Thanks to HP’s cutting-edge technology, the engagement center ingests huge amounts of data related to our sport and allows us to focus a lens on almost any topic that we want. We can now make informed decisions on just about every aspect of our business. We are just beginning to tap into its capabilities, but the value the FMEC has already provided through measurement and analytics can be felt on a daily basis.

 

 


 

Q. We’ve heard a lot about Big Data jamming the systems of many corporations and enterprise groups. NASCAR must have had massive data stores in place. How did process automation and HP Enterprise Services consulting view this challenge?

A. HP provided a true end-to-end solution for us. HP’s Enterprise Services team has helped us build this solution from Day One of our collaboration and has been a partner in the evolution and development of this process since launch.

 

In terms of hardware and software, our Fan and Media Engagement Center is HP-powered from front to back: from the back-end HP blade servers and HP 3PAR storage that help us store and manage all this “Big Data,” to the middle software analytics layer powered by HP Autonomy, to our front-end display matrix with the latest in digital signage, it is all HP.

 

Q. Much of Information Management and Analytics has to do with gleaning the right information from the data to make it actionable. What were your goals when you started the project? Now that the engagement center has been implemented, how have these goals changed?

A. The idea of the Fan and Media Engagement Center came from our Chairman, Brian France. He wanted to create a resource that would benefit not only NASCAR, but the entire NASCAR industry by providing business-impacting insights tailored to specific audiences within the NASCAR ecosystem, including race teams, tracks, and partners.

 

In its first year of existence, the FMEC has already delivered value to each audience, yet we have only scratched the surface of the system’s capabilities. The FMEC is a “Version 1.0” platform, and we continue to learn, tweak, and refine the system. Our immediate goals have not changed; however, I envision that our goals will evolve as the system does.

 

Q. At HP, we talk quite a bit about information being the most valuable asset in the enterprise. How has the data you’ve been able to analyze proved beneficial to sponsors and partners? 

A. NASCAR is now able to provide insights to the many partners in our sport’s ecosystem. We can analyze fan levels of engagement around sponsor at-track activations, measure how a partner’s brand is perceived by our fan base, and learn more about what our fans like and dislike.

 

Additionally, we can home in on specific topics (sentiment around broadcast partners or feedback on a sponsor contest, for example) and produce in-depth insights into fan behavior, so we can serve them the best content and provide the best experience.

 

Earlier this year, Chevy unveiled a new production model at a press event during the weekend of the Daytona 500. Within an hour after the event was over, our President Mike Helton was able to hand deliver a dashboard to our partner showing how fans and potential customers felt about the new car.

 

That kind of value can’t be measured.

 

Q. Have you seen an uptick in sales, or fan base growth? Are sponsors and partners more willing to make an investment when they are able to see data and know their return with more certainty? How has it enabled them to get ROI?

A. The FMEC wasn’t developed to be a direct revenue generator for NASCAR. However, when speaking about the FMEC, I like to characterize ROI as a Return on Information. The FMEC is providing our entire ecosystem with business-impacting information. In certain circumstances the impact can be felt in real time; however, a number of partners will use the information to help formulate the way they activate in our sport for years to come. That is truly when partners will be able to maximize the value the FMEC provides. That said, the demand for FMEC information has been high this entire season and continues to grow.

  

Q. What is one of the largest differences you’ve seen in the way your marketing organization works now that you have the Fan & Media Engagement Center?

A. One of the biggest benefits that the FMEC has provided us is the ability to market in real time. For example, at this year’s Talladega race, bad weather forced some pretty significant rain delays. Talladega is one of our largest tracks, at over two miles. Rain delays can have a significant impact on fan interest, our broadcast partners, and our corporate partners.

 

During the rain delay, we were able to keep a real-time handle on the level of conversation about the race and the delay, and take action to keep fans engaged via our social media channels: asking and answering questions, providing updates, sharing photos.

 

We were also able to zero in on sentiment about the track drying system, how many people were talking about it, noting how impressive it was in improving track drying time. We were able to analyze the public sentiment and provide a snapshot to track partners who are considering it for their own tracks.

 

This was a situation that could have been a negative (a significant weather delay) that we were able to turn into a positive by keeping fans engaged and showing the value of a new technology product to partners.

 

To learn more about how HP and NASCAR are working together, check out these videos:

 

 From one race fan to another, I hope to see you at the track soon!

 

What's the future of data center efficiency?

 

I was in an interesting discussion today with a couple of people writing a paper for a college class on data center efficiency trends. Although this is not necessarily my core area of expertise, I always have an opinion.

 

I think there are a few major trends in tackling the data center efficiency issue:

1) The data center:

  • For existing data centers, ensure that the hot/cold aisle approach is used wherever possible to maximize the efficient flow of air to cool the equipment.
  • For new data centers, look to place them in areas that take advantage of the natural environment to make them more efficient. This was done with the Wynyard data center at HP. It is also why organizations look to move data centers to Iceland (why else would you place a data center on top of a volcano?).
  • There is also the perspective of “Why have a traditional data center at all?” Why not go with a containerized approach, like the HP EcoPod?

2) For the hardware within the data center, there are also a number of approaches (here or on the horizon):

  • Going DC-only inside the walls of the data center. Every step-down of voltage is an opportunity for efficiency loss. Minimize where and how it takes place.
  • Using the appropriate processor architecture for the job. This is what the Moonshot effort is trying to address. Don’t waste cycles through underutilization.
  • Why waste all the power spinning disk drives that are rarely accessed? One of the valuable applications of memristor technology is to provide high-performing yet very power-efficient data storage and access. It is not out yet, but soon.
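One common way to put a number on data center efficiency is Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. Siting, free-air cooling, and fewer voltage step-downs all push PUE toward 1.0. A minimal sketch, with invented figures rather than measurements from any HP facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the theoretical ideal; traditional data centers often run near 2.0."""
    return total_facility_kw / it_equipment_kw

# Hypothetical numbers for a 1 MW IT load (illustration only):
legacy = pue(total_facility_kw=2000, it_equipment_kw=1000)       # 2.0
free_cooled = pue(total_facility_kw=1200, it_equipment_kw=1000)  # 1.2

# The difference is cooling/conversion overhead avoided at the same IT load:
overhead_saved_kw = (legacy - free_cooled) * 1000  # roughly 800 kW
```

Every tenth of a point of PUE shaved off is pure overhead (cooling, power conversion) that no longer has to be paid for per IT watt delivered.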

I am sure there are a number of other areas I didn’t talk with them about. What are they?

 

One thought I had while writing this, a bit different from the goal of the paper but important to business, is the role of application portfolio management. Why waste energy on applications that are not actually generating value?

 

HP Storage announcements at Discover

I don’t do posts about HP product announcements, but since this is the week of HP Discover, why not? There definitely are some changes in the wind for storage.

 

One of the first things out of the chute today at HP Discover was a whole new set of capabilities in the area of software-defined storage. With all the options in storage today (SSD, low end, backup…), the complexity of managing the environment is reducing the flexibility of organizations. Storage solutions are becoming complex, inefficient and rigid.

 


The family of converged storage solutions announced today should help address many of these issues through a single architecture: a flexible and extensible approach that embraces block, object and file storage on devices ranging from HDDs to SSDs/flash, providing:

  • Performance acceleration – eliminating system bottlenecks
  • Efficiency optimization – extending the life and utilization of existing media
  • System resiliency – providing constant information/application access
  • Data mobility – federating across systems and sites

The HP 3PAR StoreServ family ranges from the low-end 7200 to the high-performance 7450 and the high-scaling 10800, all running a single architecture and interface.

 

The 7450 announced today is:

  • Accelerated: over 500,000 IOPS and less than 0.6 ms latency, thanks to a massively parallelized architecture, flash-optimized cache algorithms, and QoS
  • Efficient: reduces capacity requirements by up to 50% and extends flash lifespan through multi-layered, fine-grained virtualization with hardware-accelerated data compaction
  • Bulletproof: eliminates downtime for performance-critical applications with a quad-controller design featuring persistent cache and ports, peer failover, and multi-site replication
  • Futureproof: allows organizations to move data seamlessly to balance performance and cost, enabling a simple solution across Tier 1, midrange, and flash with federated data mobility
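As a back-of-envelope check on throughput and latency figures like these, Little's Law (L = λ × W) relates the request rate and per-request latency to the number of requests that must be in flight at once. A small sketch using the quoted 7450 numbers; the calculation is my illustration, not anything from the announcement:

```python
def concurrency(iops: int, latency_us: int) -> float:
    """Little's Law (L = lambda * W): average number of IOs in flight
    needed to sustain a given IOPS rate at a given per-IO latency."""
    return iops * latency_us / 1_000_000  # microseconds -> seconds

# Figures quoted for the StoreServ 7450: 500,000 IOPS at 0.6 ms (600 us).
in_flight = concurrency(500_000, 600)
print(in_flight)  # 300.0 -- about 300 IOs outstanding on average
```

In other words, hitting those numbers means the array is servicing on the order of 300 concurrent IOs, which is exactly where a massively parallelized controller architecture earns its keep.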

Also announced was HP StoreOnce VSA, a software-defined approach to information protection that enables more agile and efficient remote data protection.

 

The breadth of these announcements should enable greater flexibility for organizations that maintain their own infrastructure.

About the Author(s)
  • Steve Simske is an HP Fellow and Director in the Printing and Content Delivery Lab in Hewlett-Packard Labs, and is the Director and Chief Technologist for the HP Labs Security Printing and Imaging program.