September 15, 2011
The deep-thinking supercomputer that vanquished two of the most accomplished Jeopardy champions in the game's history is about to begin its first commercial gig. IBM Watson is being enlisted by WellPoint, one of the largest health insurers in the US, to be its first cyber-healthcare analyst. Specifically, the IBM Watson technology will be used for applications that can suggest patient diagnoses and treatment options to doctors in WellPoint's Blue Cross and Blue Shield provider network.
According to the press release, IBM and WellPoint will develop the healthcare applications, adapting Watson's Question Answering technology, aka DeepQA, for the medical domain. But the actual hardware will be somewhat different. An article in Computerworld reports that the machine used to host the WellPoint apps will be smaller than the ten-rack, 90-node Power 750 cluster used in the Jeopardy series in February, but thanks to optimizations since then, will have the same computational muscle. Fundamentally though, the system will incorporate all the "big data" analytics smarts of its game-show precursor.
Here is IBM's pitch from the announcement:
In recent years, few areas have advanced as rapidly as health care. For physicians, incorporating hundreds of thousands of articles into practice and applying them to patient care is a significant challenge. Watson can sift through an equivalent of about 1 million books or roughly 200 million pages of data, and analyze this information and provide precise responses in less than three seconds. Using this extraordinary capability WellPoint is expected to enable Watson to allow physicians to easily coordinate medical data programmed into Watson with specified patient factors, to help identify the most likely diagnosis and treatment options in complex cases. Watson is expected to serve as a powerful tool in the physician's decision making process.
It's a well-known problem in medicine. According to WellPoint, the amount of medical information is doubling every five years. And while there are reams of research studies on just about every imaginable disease and medical condition, it's all but impossible for front-line physicians to digest the information in real time and incorporate it into their practice. To Watson though, all those research papers, along with the related medical texts, case studies and patient health records are just data points to be correlated and weighed.
What WellPoint and IBM envision is a system that can mine all that information and spit out patient diagnosis and treatment options to the doctor. The stated goal is to reduce ineffective medical treatments, and thus costs. The first area where it will be applied is cancer care, one of the most information-dense and challenging medical domains for doctors. WellPoint says it expects to start deploying the Watson technology in early 2012, beginning with clinical pilots using selected doctors.
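The "correlate and weigh" approach described above can be illustrated with a toy sketch. This is not IBM's actual DeepQA implementation — the source types, weights, and scoring function here are all hypothetical — but it shows the general shape of ranking candidate diagnoses by aggregating weighted evidence from heterogeneous sources:

```python
# Illustrative sketch only (not IBM's DeepQA): score each candidate
# diagnosis by summing weighted evidence from different source types,
# then rank the candidates for a physician to review.

# Hypothetical weights reflecting how much each evidence source counts.
SOURCE_WEIGHTS = {"research_paper": 1.0, "patient_record": 0.8, "case_study": 0.6}

def rank_candidates(candidates):
    """Rank candidate diagnoses by total weighted evidence score.

    `candidates` maps a diagnosis name to a list of (source_type, relevance)
    pairs, where relevance is a 0..1 match score from text analysis.
    Returns (diagnosis, score) pairs, highest score first.
    """
    scored = []
    for diagnosis, evidence in candidates.items():
        score = sum(SOURCE_WEIGHTS[src] * rel for src, rel in evidence)
        scored.append((diagnosis, score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

candidates = {
    "condition_a": [("research_paper", 0.9), ("case_study", 0.7)],
    "condition_b": [("patient_record", 0.95), ("research_paper", 0.4)],
}
ranking = rank_candidates(candidates)
print(ranking[0][0])  # prints "condition_a" (score 1.32 vs 1.16)
```

The hard part, of course, is not the ranking arithmetic but producing the relevance scores in the first place — that is where Watson's natural-language analysis of millions of pages comes in.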
WellPoint execs, who have characterized the technology as a "game changer," are exuberant about the project. "The implications for healthcare are extraordinary," gushed Lori Beer, WellPoint's executive vice president of Enterprise Business Services, in the Computerworld report. "We believe new solutions built on the IBM Watson technology will be valuable for our provider partners, and more importantly, give us new tools to help ensure our members are receiving the best possible care."
The effort is not without its critics though. In another story from the Indianapolis Star, some physicians expressed doubt about such a system, noting that well-known diseases and conditions are relatively easy to diagnose, while rarer ones don't have a lot of data to draw from.
A more general concern is that the technology will be used to maximize profits rather than health. Unfortunately, the private health care system is motivated by money first and patient outcomes second. Unleashing a technology into that framework certainly has the potential for some unfortunate consequences. Wendell Potter, who formerly headed corporate communications for health insurer CIGNA and is now a vocal critic of the for-profit health care industry, reflects that concern. "My fear is that tools like this will be used to ultimately figure out ways to deny treatments that doctors recommend for their patients," says Potter in the Indianapolis Star article.
The software could apply this insidiously. Imagine if drug X were 50 percent effective for a condition and cost $100 a month, while drug Y were 80 percent effective but cost $1,000 a month. If the cost/benefit of the treatment is applied only quantitatively, drug X is going to be presented as the top choice to the doctor. Of course, such decisions are made every day by real live health care providers, but behind the veneer of a smart machine, another layer of obfuscation would be added.
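The arithmetic behind that scenario is worth making explicit. A minimal sketch, using the hypothetical drugs X and Y from the paragraph above: a purely quantitative effectiveness-per-dollar metric picks the cheaper, less effective drug every time.

```python
# Hypothetical cost/benefit comparison from the example above: drug X
# (50% effective, $100/month) versus drug Y (80% effective, $1,000/month).

drugs = {
    "X": {"effectiveness": 0.50, "monthly_cost": 100},
    "Y": {"effectiveness": 0.80, "monthly_cost": 1000},
}

def cost_benefit(drug):
    # Effectiveness gained per dollar spent each month.
    return drug["effectiveness"] / drug["monthly_cost"]

best = max(drugs, key=lambda name: cost_benefit(drugs[name]))
print(best)  # prints "X": 0.005 effectiveness/dollar beats Y's 0.0008
```

A metric like this never sees the 30-point effectiveness gap from the patient's point of view — which is exactly the obfuscation the critics worry about.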
WellPoint is well aware of such criticism and maintains Watson's results will always be reviewed by a doctor, who will have the final say-so on the diagnosis and treatment. But it's not too hard to imagine some future version of this application that does away with the doctor and interfaces directly with patients. At that point, maybe Watson will have to be deployed with some malpractice insurance.
Posted by Michael Feldman - September 15, 2011 @ 4:23 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.