The Next Big Thing
Posts about next generation technologies and their effect on business.

Rethinking future services and the application portfolio

Areas changing within business and IT include the movement away from dedicated hardware for applications, as well as away from the concept of dedicated applications themselves. For these changes to be truly successful, a number of factors must be addressed.

Today there is a wealth of software providers supplying intellectual property to address business problems (e.g., ERP solutions). Although some support more flexible access methods (e.g., SaaS), they are still rigid in what they make available to the business itself. The problems are viewed as IT problems, not as what the business needs. For these service providers to address the specific needs of an organization, greater service integration flexibility is required. This allows for real integration of business processes, meeting the business's unique needs. The IT that supports those business processes may come from many different sources.

This flexibility will require greater data transport capabilities and analytics, turning generic processing into business differentiation. Moving data outside the control of a service provider is the bane of most as-a-service solutions, yet when you think about it – whose data is it, anyway?

To meet the needs of system users, greater platform-independent support is required. This will allow the integration of generic business processes into context-specific solutions that the various business roles can use to make better business decisions. Since the mobile interface is the enterprise interface going forward, it is critical to place information in the context of the user, on the device the user is actually using. And where the response is well understood, to facilitate the systems of action needed to predict and respond to business events.

This also means that custom application configuration capabilities will be critical. Rather than having third-generation-language programmers handcraft new behaviors into the system, standards and tools for customization will be required. Application configuration capabilities will improve time to market and reduce maintenance costs, relying on business-oriented graphical modeling to aggregate functionality from across the portfolio of capabilities. Social capabilities and gamification support will be built into these customization capabilities. This mass-customized, contextual portfolio approach is the antithesis of what leveraged service providers enable today.

One of the biggest detriments (at least from my perspective) of the dot-com era was the view that everyone can code; that these coders can do it in a third-generation language like Java (or JavaScript, for that matter); and that coders actually understand user interface design, business process automation design, and security. I don't think we can afford to put up with these views any longer. The changes in how computing works and is delivered, as well as the complex possibilities enabled by the abundance of IT capabilities, don't allow it. There has been work over the years to leverage experts and hide complexity, yet most organizations take advantage of very little of it. It's time that we move on.
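
To make the configuration-over-coding idea concrete, here is a minimal sketch in Go of a generic engine that executes behavior declared as data rather than handcrafted as code. The `Step` structure, action names, and registry are illustrative assumptions, not any vendor's customization API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// Step is one unit of a configured process. Instead of writing a new code
// path for each customer, behavior is declared as data and executed by a
// generic engine.
type Step struct {
	Action string `json:"action"` // name of a prebuilt capability
	Field  string `json:"field"`  // which field of the record it applies to
}

// registry maps action names to prebuilt, reusable capabilities drawn
// from the portfolio.
var registry = map[string]func(string) string{
	"uppercase": strings.ToUpper,
	"trim":      strings.TrimSpace,
}

// runProcess applies a JSON-configured sequence of steps to a record.
func runProcess(config []byte, record map[string]string) error {
	var steps []Step
	if err := json.Unmarshal(config, &steps); err != nil {
		return err
	}
	for _, s := range steps {
		fn, ok := registry[s.Action]
		if !ok {
			return fmt.Errorf("unknown action %q", s.Action)
		}
		record[s.Field] = fn(record[s.Field])
	}
	return nil
}

func main() {
	// Changing this configuration changes system behavior with no
	// recompilation and no 3GL programmer involved.
	config := []byte(`[{"action":"trim","field":"name"},{"action":"uppercase","field":"name"}]`)
	record := map[string]string{"name": "  acme corp "}
	if err := runProcess(config, record); err != nil {
		panic(err)
	}
	fmt.Println(record["name"]) // prints "ACME CORP"
}
```

A graphical modeling tool would sit on top of exactly this kind of engine, emitting the configuration instead of asking a business analyst to write it by hand.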

Start thinking about HTTP 2.0 early

One of the changes on the horizon that I've not paid much attention to, but that will impact the services space, is HTTP 2.0. Most organizations today use HTTP 1.1, and since that standard dates back to 1999, it is getting rather long in the tooth.

Among the areas this update tries to address are performance and security. Efforts to improve the existing HTTP are underway today (like SPDY), and these are only recently being supported by some of the mainstream browsers. The foundation they defined is being used, though, to move the standard forward.

If the standards effort progresses as defined, HTTP 2.0 will be faster, safer, and more efficient than HTTP 1.1. Most of these changes will take place behind the scenes: users will upgrade their browsers, gain HTTP 2.0 capabilities, and then wait for servers to provide the improved functionality.

For companies, though, capabilities like server push, which enables HTTP servers to send multiple responses (in parallel) for a single client request, make significant improvements possible.

“An average page requires dozens of additional assets, such as JavaScript, CSS, and images, and references to all of these assets are embedded in the very HTML that the server is producing!”

So for HTTP interfaces, instead of waiting for the client to discover references to needed resources, the server could send all of them as soon as it knows they'll be needed. Server push can eliminate entire round trips of unnecessary network latency. With user interface responsiveness being an important satisfaction criterion for users, this could be a differentiator for a service, especially if it is tuned to the available network bandwidth.

For businesses, there is a bit of work to do, since porting environments between HTTP servers requires a great deal of testing, even before you have architected a solution around the newer functionality. Microsoft and others have released new server software so organizations can get their feet wet now, while the standards are still solidifying.

HP addressing the need for constant security vigilance

After talking with a number of people recently, it became clear that HP is constantly investing in the security space, much more than I knew. These investments have been going on for a very long time:

  • 2011 – Autonomy (compliance, behavior & classification)
  • 2010 – Fortify (application security)
  • 2010 – ArcSight (compliance & risk management)
  • 2009 – TippingPoint (via 3Com) (IPS)
  • 2008 – EDS / HP consulting and managed services, including Vistorm security consulting and managed security services
  • 2007 – SPI Dynamics / HP Application Security Center

There is now a highly experienced team of more than 5,000 professionals holding security certifications including CHECK, CLAS, CISSP, CISM, CISA, CLEF, IISP, ISO 27001 Lead Auditor, PCI QSA, and others, and it doesn't stop there.

There is a constant stream of announcements from HP in the security space (including one early this month), with new ones today. The reason for this vigilance is that every 7-10 years, technology development and delivery undergo a shift that opens up new business and access models. These shifts fundamentally change the way technology is consumed and the value it can bring; they change what is possible and create new opportunities for innovation. They also open up new security threat concerns, and all organizations are affected by security breaches.

Cyber security moved from 12th to 3rd place among the risk factors faced by businesses in Lloyd's 2013 Risk Index. The potential for financial, reputational, and physical damage has elevated the issue of cyber security to the board level. Today's enterprise struggles to balance protecting itself from organized cyber criminals and maintaining legal, regulatory, and compliance standards while enabling the adoption of new IT solutions, like mobility, cloud, and analytics, that generate business value.

Additionally, cybercriminals have created a cybercrime marketplace, sharing and selling information on tools, tactics, and targets to find vulnerabilities in organizations' infrastructure and steal critical customer data and intellectual property. Individuals and groups are starting to specialize and become service providers for others who want to use their capabilities; the size of this underground security market may actually be larger than the protection-oriented security market itself.

While some threats are external and malicious, others are internal, like a disgruntled employee who might steal proprietary information. There are also unintentional mistakes, such as an employee losing an unlocked or unencrypted device, or being tricked into sending unencrypted documents and classified company information to illegitimate sources. Alarmingly, as the enterprise landscape becomes more open and the need to share information grows, we see 44% of data breaches happening at the hands of a trusted supplier. The threat environment is dynamic and complex.

Regulators respond to this complex security ecosystem by implementing numerous regulations and mandates in the hope of preventing further issues. Unfortunately, using compliance to define your security strategy sets a low bar, since the requirements are reactive in nature. Organizations are forced to address the regulations because they cannot afford to appear non-compliant.

HP is focused on helping organizations address their information security by spending less time on reactive threat management and more on disrupting the attackers' ecosystem: understanding and protecting the business's critical information assets in a way that aligns with its information risk tolerance (this is definitely not a one-size-fits-all approach). There is a shortage of trained security personnel, and HP is trying to provide the tools and the services to address the gaps for organizations.

Some of the new services announced today include:

Big Data - the opportunity for better decision making.

This week I had the opportunity to attend one of Leon Kappelman's classes at the University of North Texas to participate in discussions with students about their senior projects and presentations. The teams of students covered a number of topics, like BYOD, cloud adoption, and biometric-based security – all topics where I felt fairly comfortable.

One presentation focused on Data Management in the Age of Big Data, and the team clearly understood one concept that many analysts miss.

The opportunity for better decision making.

Too often, IT folks focus on the data rather than the context the data describes and the actions that need to be taken.

The team focused on five key issues, namely the lack of:

  1. Data Governance
  2. Data Quality standards and management
  3. Data Architecture and Security
  4. Operations support
  5. Business buy-in

We had quite a discussion about the business buy-in issue, since we wanted them to explain how a project could get this far without buy-in. They explained that the issue orbited around business culture and the implications that advanced analytic techniques would have on it.

I was happy to see that these students had internalized these concepts, and I hope the organizations they move into after graduation are ready for their perspective.

Are standard processes constraining our agile needs?

Lately I've been in a number of discussions about processes and automation. When you look at traditional ERP/CRM systems, they have already automated the processes, and it is up to you to figure out how to run your business within them.

As we develop more sophisticated systems that can begin to recognize patterns of behavior, new software solutions that adapt to changing needs become possible. One area of this effort is adaptive case management.

“Adaptive Case Management (ACM) is information technology that exposes structured and unstructured business information (business data and content) and allows structured (business) and unstructured (social) organizations to execute work (routine and emergent processes) in a secure but transparent manner.”

I usually talk about standards as allowing us to focus our innovation. In the case of processes, though, I have to ask: is the future less about standard ways of doing things and more about adaptive approaches that adjust dynamically to the needs of the day? I think it is.
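
The core idea of an adaptive case, structured business data combined with a task list that knowledge workers can extend mid-flight, can be sketched in a few lines of Go. The types and method names here are illustrative assumptions, not any vendor's ACM API.

```go
package main

import "fmt"

// Task is a unit of work. Routine tasks come from a template; emergent
// tasks are added by knowledge workers as the case unfolds.
type Task struct {
	Name     string
	Emergent bool
	Done     bool
}

// Case combines structured business data with an open-ended task list,
// rather than a fixed, predefined process flow.
type Case struct {
	ID    string
	Data  map[string]string
	Tasks []Task
}

// AddTask lets workers extend the case at runtime.
func (c *Case) AddTask(name string, emergent bool) {
	c.Tasks = append(c.Tasks, Task{Name: name, Emergent: emergent})
}

// Complete marks a task as finished.
func (c *Case) Complete(name string) {
	for i := range c.Tasks {
		if c.Tasks[i].Name == name {
			c.Tasks[i].Done = true
		}
	}
}

// Open returns the names of tasks still outstanding.
func (c *Case) Open() (open []string) {
	for _, t := range c.Tasks {
		if !t.Done {
			open = append(open, t.Name)
		}
	}
	return
}

func main() {
	claim := &Case{ID: "claim-42", Data: map[string]string{"customer": "Acme"}}
	// Routine steps from a template.
	claim.AddTask("verify coverage", false)
	claim.AddTask("assess damage", false)
	// An emergent step added mid-flight, something a rigid ERP-style
	// process definition could not absorb without a code change.
	claim.AddTask("consult fraud specialist", true)
	claim.Complete("verify coverage")
	fmt.Println(claim.Open())
}
```

The contrast with a traditional ERP/CRM workflow is the `AddTask` call: the process adjusts to the case at hand instead of forcing the case into a predefined flow.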

About the Author(s)
  • Steve Simske is an HP Fellow and Director in the Printing and Content Delivery Lab in Hewlett-Packard Labs, and is the Director and Chief Technologist for the HP Labs Security Printing and Imaging program.