We need the education system to provide platforms for teachers and students to enable Education 2.0. And it’s not just about the technology: Education 2.0 must support the knowledge economy, and that means a focus on creativity and innovation.
I was asked by a co-worker about the top 10 technologies of the decade. I thought about it a bit and came up with nine:
- GPS – Global positioning devices make it possible to track everything from your car to your dog. GPS has been incorporated into numerous devices, including almost every mobile phone sold, and is now being used in business as well as personal life.
- Home broadband – Although dial-up connections to the Internet had been popular since the 1990s, the ability to have a high-speed connection at home was the springboard for social networking. It’s changed the way people find information about everything from current events to how they’ll entertain themselves tonight.
- Genomics – Although the human genome project was completed early in the last decade, the ripple effects will be felt for decades to come as the triggering mechanisms for genes are understood and adjusted, changing some areas of healthcare at a fundamental level. As we live longer, we have a greater chance of needing the outcome of this research as chronic illness becomes the norm.
- Memristor – In this area of research, HP created a new fundamental electronic device that had previously been only theoretically possible. Its impact may not be apparent yet, but memristor-based technology has broad applications, from static memory to logic circuits, and will change computing and electronics for decades to come.
- On demand multimedia – The capabilities of broadband in the home have allowed video and audio to move from CDs and DVDs to streaming. The ability to access almost any content at any time has changed entertainment and how it is provided by both professionals and amateurs, and has almost eliminated structuring our lives around when content is available.
- Wireless mobile networking – Although there was some wireless networking available before 2001, the advent of wireless access to the Internet through 2G, 3G and now 4G capabilities has had at least as much impact as broadband networking. It has changed the way people spend their leisure time and settled many arguments before they could start.
- Social Networking – Broadband and wireless networking have changed how people interact at a fundamental level. It is now possible to know where someone is as well as what they’re doing on a moment by moment basis – changing the very definition of privacy.
- Materials – The use of carbon fiber and other new materials has changed the way products are created, making them stronger, lighter and more resilient. These technologies have changed the way products are built and perform, in everything from airplanes to swimsuits.
- Nanotech – This innovation is the ability to manipulate matter on the atomic scale. Carbon nanotubes are one example of nanotech, and they are working their way into objects ranging from electronics to healthcare and materials. Since materials behave differently at the nano-scale, research in this area allows for modification of fundamental characteristics like hardness, color, conductivity and chemical behavior. This area of research will change nearly every industry in the coming decades. HP is doing research into using nanotech to create new chemical-sensing technologies.
Which one would you add?
As we move into the age of data abundance, where we keep more info about more things for a longer period of time, privacy practices are going to be increasingly important.
By now you’ve probably heard that the FTC is taking a microscope to Google’s privacy practices. It’s doing this because the Google Buzz social networking service disregarded Google’s own privacy policies when it launched back in February 2010.
From an FTC press release:
“The proposed settlement bars the company from future privacy misrepresentations, requires it to implement a comprehensive privacy program, and calls for regular, independent privacy audits for the next 20 years. This is the first time an FTC settlement order has required a company to implement a comprehensive privacy program to protect the privacy of consumers' information. In addition, this is the first time the FTC has alleged violations of the substantive privacy requirements of the U.S.-EU Safe Harbor Framework, which provides a method for U.S. companies to transfer personal data lawfully from the European Union to the United States.”
"When companies make privacy pledges, they need to honor them," said Jon Leibowitz, Chairman of the FTC. "This is a tough settlement that ensures that Google will honor its commitments to consumers and build strong privacy protections into all of its operations."
From a Google blog post written by Alma Whitten, director of privacy, product & engineering:
“The launch of Google Buzz fell short of our usual standards for transparency and user control, letting our users and Google down. While we worked quickly to make improvements, regulators, including the U.S. Federal Trade Commission, unsurprisingly wanted more detail about what went wrong and how we could prevent it from happening again. Today, we've reached an agreement with the FTC to address their concerns. We'll receive an independent review of our privacy procedures once every two years, and we'll ask users to give us affirmative consent before we change how we share their personal information.
We'd like to apologize again for the mistakes we made with Buzz. While today's announcement thankfully put this incident behind us, we are 100 percent focused on ensuring that our new privacy procedures effectively protect the interests of all our users going forward.”
The old saying that “any publicity is good publicity” may not apply in this case, since there are already concerns about the information technology industry’s ability to pull together information from various sources and infer behaviors. These techniques can be used for good or evil.
The Federal Trade Commission has a program called On Guard Online to help consumers understand the privacy issues. Businesses need to identify someone to be on the lookout as well. The role can be called the Chief Security Officer or the Chief Privacy Officer, but it has to be someone who is responsible for these issues, with enough support and influence to make something happen when there is a problem. It is not likely to take care of itself. No matter how big or small your organization is, it’s an issue we’ll all need to live with.
When I talked with Dan Moor, a security forensics and incident response specialist I work with, he said:
"I saw a little bit of it and just now read the statement. The part that is sticking in my head is “and calls for regular, independent privacy audits for the next 20 years.” I like the general idea of extending a privacy baseline for companies to follow but the 20 years of audits just strikes me as a little foreboding. I guess thinking about it a little more seems to lead me to think that information protection strategies are soon going to take on a new regulatory facet that extends the basic access concerns to how the data is used by applications and the next social app du jour. In my perspective that just made the information protection audit scenario of the past (ISO 27001, et. al) now just the first step. Once you show them the infrastructure is sound, now you will have to prove that your apps aren’t giving away the farm with a clever friend selection. "
I got an interesting note from LinkedIn today announcing that they just hit their 100 millionth member. They also told me that I was member 199861 (whew – I got in before 200K). Those are some pretty large numbers by almost anyone’s scale. Going by Wikipedia, that puts LinkedIn’s membership on par with the populations of the 11th (Mexico) and 12th (Philippines) largest countries in the world.
Yet social networking techniques are still in their infancy in most corporations. There is a great deal of growth potential here as organizations move into unified communications, deploy collaboration tools, become more globalized, and extend their enterprises through coopetition. The techniques to pull networks together are changing rapidly as well, as we begin to move from defined connections (like LinkedIn) to derived networks based on personnel behavior.
This last area is one that I've been talking with some of the HP Labs teams about. Since HP has over 300K personnel, finding the right person just a few minutes quicker could have quite an impact.
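To make the idea of a derived network concrete, here is a minimal sketch of one common approach: inferring connections from co-participation in message threads rather than from explicitly declared links. The thread data, the names, and the threshold of two shared threads are all hypothetical stand-ins; a real system would mine email or collaboration logs at far larger scale.

```python
from collections import Counter
from itertools import combinations

# Hypothetical message log: each entry is the set of participants on one thread.
threads = [
    {"alice", "bob", "carol"},
    {"alice", "bob"},
    {"bob", "carol"},
    {"alice", "bob", "dave"},
]

# Count how often each pair of people appears on the same thread.
pair_counts = Counter()
for participants in threads:
    for pair in combinations(sorted(participants), 2):
        pair_counts[pair] += 1

# Derive an edge for any pair that co-occurs at least twice (assumed threshold).
derived_edges = {pair for pair, n in pair_counts.items() if n >= 2}
```

The threshold is the interesting design choice: set it too low and the derived network drowns in incidental contacts; too high and it misses the weak ties that are often the most valuable for finding expertise.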
I saw an interesting story on Tracking the National Mood Through Twitter. It made me think that one thing business leaders are always wondering about is how their policies and decisions are being received in the field. Could the technique be used within larger business organizations?
Within HP we have our own internal twitterish tool that is part of a larger collaboration set called Watercooler, where this approach could be applied directly. Many organizations don’t have this kind of social networking infrastructure, though. I’d think you could extend the concept into email and other tools that are more common, if you could overcome any security concerns.
Since the group that did this mood analysis has a fairly formal approach and a track record of data gathering, business leaders could apply the technique to identify which characteristics align with positive reception, and use that to maximize the workforce’s acceptance of corporate decisions.
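As a rough sketch of what internal mood scoring might look like, the example below scores hypothetical posts against a tiny sentiment lexicon and averages by day. Both the word lists and the posts are illustrative assumptions; published mood-tracking work relies on validated lexicons with thousands of rated words, not a handful of terms.

```python
from collections import defaultdict

# Toy sentiment lexicon -- an illustrative assumption, not a validated word list.
POSITIVE = {"great", "good", "happy", "excited", "love"}
NEGATIVE = {"bad", "angry", "worried", "hate", "frustrated"}

def score(text):
    """Net sentiment: positive word count minus negative word count."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos - neg

# Hypothetical internal posts tagged by day.
posts = [
    ("2011-04-01", "excited about the new collaboration tools"),
    ("2011-04-01", "worried the rollout will be bad"),
    ("2011-04-02", "great progress and happy with the team"),
]

# Aggregate per-post scores into a daily mood average.
daily = defaultdict(list)
for day, text in posts:
    daily[day].append(score(text))

mood = {day: sum(scores) / len(scores) for day, scores in daily.items()}
```

A leader watching `mood` over time could correlate dips and spikes against announcement dates, which is essentially what the Twitter study did at national scale.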
I remember back in the late 90s, I was called in to help with a healthcare benefits enrollment helpdesk that seemingly at random would be overcome by calls. 90% of the time it worked fine, but every once in a while even having 50% more people wouldn’t be enough. I did some statistical analysis of the historical records and found out the critical variables. It was something like -- if 3 days after the salaried workers were paid lined up with the day after the hourly workers were paid and it was after the end of fiscal quarter… (or something like that) the call center would be overwhelmed. It was a pattern that sort of made common sense, but behaviorally was a perfect storm kind of event.
This kind of analysis isn’t really hard, it just takes the vision and the stamina to make it happen.
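To illustrate, the sketch below encodes calendar features like the ones I found and scans a year for days where they align. The pay schedules, the anchor date, and the fiscal-quarter rule are all hypothetical stand-ins for the actual variables from that engagement, which I no longer have.

```python
from datetime import date, timedelta

def is_salaried_payday(d):
    """Assume salaried staff are paid on the 15th and the last day of the month."""
    last_day = (d.replace(day=28) + timedelta(days=4)).replace(day=1) - timedelta(days=1)
    return d.day == 15 or d == last_day

def is_hourly_payday(d):
    """Assume hourly staff are paid every other Friday from an arbitrary anchor."""
    anchor = date(2011, 1, 7)  # assumed first hourly payday of the year
    return d.weekday() == 4 and (d - anchor).days % 14 == 0

def is_after_quarter_end(d):
    """Assume the first week after a calendar-quarter close."""
    return d.month in (1, 4, 7, 10) and d.day <= 7

def perfect_storm(d):
    """Flag days where the hypothesized features align: 3 days after a salaried
    payday, the day after an hourly payday, just after a quarter closes."""
    return (is_salaried_payday(d - timedelta(days=3))
            and is_hourly_payday(d - timedelta(days=1))
            and is_after_quarter_end(d))

# Scan a year of dates for risky days that warrant extra call-center staffing.
risky = [date(2011, 1, 1) + timedelta(days=i) for i in range(365)
         if perfect_storm(date(2011, 1, 1) + timedelta(days=i))]
```

In the real engagement the features came out of statistical analysis of historical call volumes rather than being written down first, but once identified, a scan like this is all it takes to turn them into a staffing forecast.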
I’ve blogged in the past about modeling and the shifts possible with the alignment of the abundance of computing and data. Identifying these patterns, and then determining ways to take advantage of them, will separate the leaders from the herd among the businesses of the future. Now that I am looking at the HP Labs projects, I am seeing more ways these techniques can be applied to a wider range of situations and industries.