September 15, 2011
The deep-thinking supercomputer that vanquished two of the most accomplished Jeopardy champs in the game's history is about to begin its first commercial gig. IBM Watson is being enlisted by WellPoint, one of the largest health insurers in the US, to be its first cyber-healthcare analyst. Specifically, the IBM Watson technology will be used for applications that can suggest patient diagnosis and treatment options to doctors in WellPoint's Blue Cross and Blue Shield provider networks.
According to the press release, IBM and WellPoint will develop the healthcare applications, adapting Watson's Question Answering technology, aka DeepQA, for the medical domain. But the actual hardware will be somewhat different. An article in Computerworld reports that the machine hosting the WellPoint apps will be smaller than the ten-rack, 90-node Power 750 cluster used in the Jeopardy series in February, but thanks to optimizations since then, it will have the same computational muscle. Fundamentally though, the system will incorporate the same "big data" analytics smarts as its game-show precursor.
Here is IBM's pitch from the announcement:
In recent years, few areas have advanced as rapidly as health care. For physicians, incorporating hundreds of thousands of articles into practice and applying them to patient care is a significant challenge. Watson can sift through an equivalent of about 1 million books or roughly 200 million pages of data, and analyze this information and provide precise responses in less than three seconds. Using this extraordinary capability WellPoint is expected to enable Watson to allow physicians to easily coordinate medical data programmed into Watson with specified patient factors, to help identify the most likely diagnosis and treatment options in complex cases. Watson is expected to serve as a powerful tool in the physician's decision making process.
It's a well-known problem in medicine. According to WellPoint, the amount of medical information is doubling every five years. And while there are reams of research studies on just about every imaginable disease and medical condition, it's all but impossible for front-line physicians to digest the information in real time and incorporate it into their practice. To Watson though, all those research papers, along with the related medical texts, case studies and patient health records are just data points to be correlated and weighed.
What WellPoint and IBM envision is a system that can mine all that information and spit out patient diagnosis and treatment options to the doctor. The stated goal is to reduce ineffective medical treatments, and thus costs. The first area where it will be applied is cancer care, one of the most information-dense and challenging medical domains for doctors. WellPoint says it expects to start deploying the Watson technology in early 2012, beginning with clinical pilots using selected doctors.
WellPoint execs, who have characterized the technology as a "game changer," are exuberant about the project. "The implications for healthcare are extraordinary," gushed Lori Beer, WellPoint's executive vice president of Enterprise Business Services, in the Computerworld report. "We believe new solutions built on the IBM Watson technology will be valuable for our provider partners, and more importantly, give us new tools to help ensure our members are receiving the best possible care."
The effort is not without its critics though. In another story from the Indianapolis Star, some physicians expressed doubt about such a system, noting that well-known diseases and conditions are relatively easy to diagnose, while rarer ones don't have a lot of data to draw from.
A more general concern is that the technology will be used to maximize profits rather than health. Unfortunately, the private health care system is motivated by money first and patient outcomes second. Unleashing a technology into that framework certainly has the potential for some unfortunate consequences. Wendell Potter, who once headed corporate communications for health insurer CIGNA and is now a vocal critic of the for-profit health care industry, reflects that concern. "My fear is that tools like this will be used to ultimately figure out ways to deny treatments that doctors recommend for their patients," says Potter in the Indianapolis Star article.
The software could apply this insidiously. Imagine if drug X was 50 percent effective for a condition and cost $100 a month, while drug Y was 80 percent effective but cost $1,000 a month. If the cost/benefit of the treatment is applied only quantitatively, drug X is going to be presented as the top choice to the doctor. Of course, such decisions are made every day by real live health care providers, but behind the veneer of a smart machine, another layer of obfuscation would be added.
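To make that arithmetic concrete, here is a minimal sketch of how a purely quantitative ranking would surface drug X first. The drug names, numbers, and the cost-per-unit-of-effectiveness metric are invented for illustration; nothing here reflects WellPoint's or IBM's actual software.

```python
# Hypothetical illustration only: rank treatments by cost per unit
# of effectiveness, ignoring everything a clinician would weigh.

from dataclasses import dataclass


@dataclass
class Treatment:
    name: str
    effectiveness: float  # fraction of patients helped (0.0-1.0)
    monthly_cost: float   # USD per month


def rank_by_cost_effectiveness(treatments):
    # Lower cost per unit of effectiveness ranks first.
    return sorted(treatments, key=lambda t: t.monthly_cost / t.effectiveness)


drugs = [
    Treatment("Drug X", effectiveness=0.50, monthly_cost=100.0),
    Treatment("Drug Y", effectiveness=0.80, monthly_cost=1000.0),
]

for t in rank_by_cost_effectiveness(drugs):
    print(f"{t.name}: ${t.monthly_cost / t.effectiveness:.0f} per unit of effectiveness")
# Drug X works out to $200 per unit, drug Y to $1,250, so the
# cheaper, less effective drug X tops the list.
```

A ranking like this is defensible as one input among many, but if it were the only signal presented to a physician, the 30-point gap in effectiveness would be invisible behind the machine's top recommendation.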
WellPoint is well aware of such criticism and maintains Watson's results will always be reviewed by a doctor, who will have the final say-so on the diagnosis and treatment. But it's not too hard to imagine some future version of this application that does away with the doctor and interfaces directly with patients. At that point, maybe Watson will have to be deployed with some malpractice insurance.
Posted by Michael Feldman - September 15, 2011 @ 4:23 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.