Yesterday, the New York Times ran an article titled "Pills Tracked From Doctor to Patient to Aid Drug Marketing." The article discussed how new analytics capabilities are allowing drug marketers to locate influential doctors through their social behavior as well as patient behavior. It was a good example of how additional insight can be used to drive action.
I normally view much of the "Big Data" trend as focused too much on insight and not enough on action, but this article did work through some interesting issues, at least as they relate to the healthcare provider market.
Related to this is a post about one of the Government Big Data Solutions Award winners. The award was established to recognize public sector enterprises that are setting the gold standard for their approach to Big Data.
In The Smart Data Collective blog, Bob Gourley highlighted the winner of the 2012 award, the National Cancer Institute, whose:
“Frederick National Laboratory has been using Big Data solutions in pioneering ways to support researchers working on complex challenges around the relationship between genes and cancers. In a recent example, they have built infrastructure capable of cross-referencing the relationships between 17000 genes and five major cancer subtypes across 20 million biomedical publication abstracts.”
This is an example of the kind of analytics now possible to deepen our understanding: harnessing the power of the information around us and using new tools like Vertica to develop insight and improve time-to-action.
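As a rough illustration of the kind of cross-referencing the Frederick National Laboratory team describes, here is a toy sketch. This is not their actual pipeline, and the gene names, subtype names, and abstracts below are made up; it simply counts how often a gene is mentioned alongside a cancer subtype across a corpus of abstracts:

```python
# Toy gene/cancer-subtype co-occurrence count over abstracts.
# All names and texts here are illustrative placeholders.
from collections import Counter

genes = {"TP53", "BRCA1", "KRAS"}               # ~17,000 genes in the real study
subtypes = {"leukemia", "melanoma", "sarcoma"}  # 5 major subtypes in the real study

abstracts = [
    "TP53 mutations are common in sarcoma and melanoma tumors",
    "BRCA1 loss is linked to elevated risk in carriers",
    "KRAS signalling in leukemia models",
]

cooccur = Counter()
for text in abstracts:
    words = set(text.replace(",", " ").replace(".", " ").split())
    # Count each gene/subtype pair that appears in the same abstract.
    for g in genes & words:
        for s in subtypes & {w.lower() for w in words}:
            cooccur[(g, s)] += 1

print(cooccur.most_common())
```

At the scale of 17,000 genes and 20 million abstracts, a join like this would of course run inside an analytic database such as Vertica rather than in in-memory Python, but the shape of the computation is the same.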
A great deal of ongoing research and development is also being performed on more types of clinical procedures, especially those that are intricate and invasive. Some examples are:
- CorPath 200 – The first robotic-assisted technology for coronary intervention techniques. This novel medical technology allows doctors to perform accurate robotic placement of balloon/stent catheters and coronary guidewires from a seated cockpit.
- TraumaPod – A telerobotic, semi-autonomous surgical system with a goal of stabilizing injured soldiers within minutes of battlefield trauma and administering life-saving medical and surgical care prior to evacuation.
- RAVEN and MiroSurge – These two surgical robots are used for performing endoscopic surgery. RAVEN II is a remotely operated laparoscopic system. MiroSurge is a versatile robotic system with different control modes and arm-mounting locations that can be applied to several surgical domains.
- HeartLander – A miniature mobile robot designed for epicardial procedures like heart cell transplantation and intrapericardial drug delivery by adhering to the surface of the heart, autonomously navigating to a specified location, and then administering therapy.
- NeuroArm – Surgeons sometimes need to operate while a patient is inside a magnetic resonance imaging (MRI) machine. This robot is specifically designed for that scenario and is as dexterous as the human hand but even more precise and tremor-free.
- MrBot – A robot designed for accessing the prostate gland using MRI data.
- Amadeus – A laparoscopic surgical robot system from Canada with four arms, designed to assist in tele-operation for long-distance surgeries, among other situations.
It is clear that robotics may be a core capability of future surgeons.
Yesterday, I wrote a post on the 1000 Genomes project and mentioned that medicine will be undergoing a significant change as these techniques become more common.
We can move beyond medicine for the masses, with side effects for some, to a much more tailored approach to treatment designed for the individual, based on a genetic understanding of how a body will react. This shift is coming sooner than most of us think, and it is fueled by data and analytics.
“By characterizing the geographic and functional spectrum of human genetic variation, the 1000 Genomes Project aims to build a resource to help to understand the genetic contribution to disease. Here we describe the genomes of 1,092 individuals from 14 populations, constructed using a combination of low-coverage whole-genome and exome sequencing.”
It takes about 2 terabytes to store one person's genomic information in full detail (the raw sequencing data), and about 1.5 gigabytes to store just the DNA letters themselves. If you have a good reference genome and store only the differences, an individual genome fits in about 20 megabytes. Today, when you can get a 3 TB external drive for about $100, all of this information fits in a fraction of the space on a typical drive.
1,092 × 20 MB + 1.5 GB ≈ 23 GB compressed
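The arithmetic can be checked in a few lines, using the figures above (1,092 genomes at roughly 20 MB of differences each, plus one ~1.5 GB reference):

```python
# Back-of-envelope check of the 1000 Genomes storage figure.
MB = 10**6
GB = 10**9

n_genomes = 1092            # individuals in the study
diff_per_genome = 20 * MB   # differences against the reference genome
reference = 1.5 * GB        # one full set of DNA letters

total = n_genomes * diff_per_genome + reference
print(f"{total / GB:.1f} GB")  # about 23 GB for the whole cohort
```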
Although storing the information may be cheap, collecting it will be costly (though that cost is coming down rapidly too).
If you think out just five years, given the exponential growth trend in magnetic storage, that same $100 should buy about 100 TB of storage ($1/TB). By then the population of the earth will be over 7 billion people. So it would cost only:
7 × 10⁹ people × 20 MB ≈ 140 petabytes = 140,000 TB ≈ $140,000 at $1/TB
to store the genomic information of everyone on earth. Not an outlandish sum, and it would definitely change how we think about medicine. And that is assuming there are no disruptive breakthroughs in storage.
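The projection can be written out explicitly; the inputs (7 billion people, 20 MB per diff-compressed genome, a projected $1 per terabyte) are the assumptions stated above:

```python
# Projected cost of storing a diff-compressed genome for everyone on earth.
MB = 10**6
TB = 10**12

population = 7 * 10**9    # people
per_genome = 20 * MB      # diff-compressed genome
price_per_tb = 1.0        # projected dollars per terabyte in ~5 years

total_bytes = population * per_genome      # 1.4e17 bytes, i.e. 140 petabytes
cost = (total_bytes / TB) * price_per_tb
print(f"{total_bytes / 10**15:.0f} PB -> ${cost:,.0f}")  # 140 PB -> $140,000
```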
What other aspects of business use of IT are constrained by conventional thinking?