March 11, 2010
The current political debate on US health care reform is depressing on so many levels. The fact that the wealthiest country in the world can't seem to figure out a way to provide basic medical care for its citizens is discouraging enough. Worse, the current plan on the table will basically just bring more people into the 20th century-style health care model. In general, that model is reactive: wait for a disease or medical emergency to strike, then treat the symptoms with drugs or surgery. The good news is that this style of medicine is going out of fashion.
In all likelihood, the new model is going to be something similar to what's referred to as "P4" medicine (predictive, preventive, personalized, and participatory). That is a term coined by biotech luminary Leroy Hood, president and cofounder of the Institute for Systems Biology in Seattle. The idea is to bring a P4 approach into the practice of health care, incorporating the rapidly advancing technologies of molecular immunology, biotechnology, genomics, and computer science. Hood's non-profit institute is designed to bring together researchers from these fields and act as an incubator for P4-style biotech spinoffs.
Hood has been actively spreading the word about how this new approach will transform medicine. In fact, he was one of the plenary speakers at SC09 in Portland last November, where he talked about the way HPC fits into the systems biology paradigm. In a recent interview published in Technology Review, Hood spells out the basic outline of P4 medicine:
Individual genomes will become a standard of medical records in 10 years or so, and we will have the power to make inferences [about an individual's health] when combined with phenotypic information. Then we can begin to plan strategies for individual health care in ways we have never done before.
The idea is to use knowledge of a person's genome to deliver targeted treatments that optimize that individual's health, ideally before disease strikes. The paradigm encompasses all the new biotech buzzwords: nanotechnology, genomics, proteomics, and metabolomics. Layered on top of all this is the computational know-how that will be used to turn the "omics" data into useful health care. Says Hood:
Medicine is going to become an information science. The whole health-care system requires a level of IT that goes beyond mere digitization of medical records, which is what most people are talking about now. In 10 years or so, we may have billions of data points on each individual, and the real challenge will be to develop information technology that can reduce that to real hypotheses about that individual.
Hood worries that we'll be hard-pressed to come up with enough computational horsepower and storage capacity to deal with the genomic data for billions of people. I'm less concerned on that front. High performance computing and storage seem to be advancing at least as quickly as genome-capturing technologies like DNA sequencing.
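Still, the raw numbers are worth sketching. Here's a rough back-of-envelope calculation in Python; the per-genome and per-sequencing-run figures are my own illustrative assumptions, not numbers from Hood:

```python
# Back-of-envelope estimate of storage for population-scale genomics.
# Assumed figures (illustrative only): 3 billion base pairs per genome,
# 2 bits per base for an assembled sequence, ~100 GB of raw reads per
# deep whole-genome sequencing run.

BASE_PAIRS_PER_GENOME = 3_000_000_000
BITS_PER_BASE = 2                      # A, C, G, T -> 2 bits each
RAW_READS_BYTES = 100 * 10**9          # raw reads per sequencing run (assumed)
PEOPLE = 7_000_000_000                 # order of the world population

assembled_bytes = BASE_PAIRS_PER_GENOME * BITS_PER_BASE / 8

def fmt(nbytes):
    """Format a byte count in the largest convenient SI unit."""
    for unit in ("B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB"):
        if nbytes < 1000:
            return f"{nbytes:.1f} {unit}"
        nbytes /= 1000
    return f"{nbytes:.1f} YB"

print("Assembled genome, one person:", fmt(assembled_bytes))
print("Assembled genomes, everyone: ", fmt(assembled_bytes * PEOPLE))
print("Raw reads, everyone:         ", fmt(RAW_READS_BYTES * PEOPLE))
```

Under those assumptions, assembled genomes for everyone on the planet land in the low exabytes, and raw sequencer output in the hundreds of exabytes -- enormous, but on a trajectory the storage industry can plausibly follow.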
Connecting biotech with IT is the key, since this can move health care onto a Moore's Law-like curve where the value per dollar increases exponentially over time. The current trajectory of 20th century-style medicine is unsustainable. According to the US Congressional Budget Office, health care costs are on track to reach 50 percent of GDP by the middle of the century and 100 percent by 2082. Obviously that can't happen (see Stein's Law).
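The unsustainability is just compound growth at work. Here's a minimal sketch of that arithmetic, using purely illustrative assumptions (health spending around 16 percent of GDP in 2010, growing roughly 2.5 percentage points faster than GDP each year) rather than the CBO's actual model:

```python
# Compound-growth sketch: when does health spending hit half of GDP if it
# keeps growing faster than the economy? All figures are illustrative.

health_share = 0.16      # assumed share of GDP spent on health care in 2010
excess_growth = 0.025    # assumed annual growth of health costs above GDP growth

year = 2010
while health_share < 0.50:
    health_share *= 1 + excess_growth
    year += 1

print(f"Health spending crosses 50% of GDP around {year}")
```

Nudge the growth differential up or down and the crossover year moves, but the direction doesn't: anything growing faster than GDP eventually swallows it.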
Tweaking people's genes to make them healthier and longer-lived, via pharmacogenomics and related technologies, is a much more economical approach. We already employ a low-tech version of this today when we make healthy lifestyle choices: exercising, eating well, getting regular sleep, and so on. All these activities can profoundly change our gene expression for the better. Being able to tune up our DNA and other cellular components in a more precise way would be the ultimate in preventive care. In fact, it would eliminate most of the costs associated with treating degenerative diseases -- everything from cancer, heart disease and diabetes to Alzheimer's. Preventing just these four diseases would wipe out a huge chunk of the health care bill.
In the meantime, we can watch the current health care debate in the US and hope we can at least achieve broader access to a 20th century medical system. But whatever happens, it's probably not worth getting too stressed about. I'm told stress isn't good for your health.
Posted by Michael Feldman - March 11, 2010 @ 4:31 PM, Pacific Standard Time
Michael Feldman is the editor of HPCwire.