IBM Announces Advances in Watson’s Cognitive Computing Capabilities

August 28, 2014

NEW YORK, N.Y., Aug. 28 — IBM today announced significant advances in Watson’s cognitive computing capabilities that are enabling researchers to accelerate the pace of scientific breakthroughs by discovering previously unknown connections in Big Data.

Available now as a cloud service, IBM’s Watson Discovery Advisor is designed to scale and accelerate discoveries by research teams. It reduces the time needed to test hypotheses and formulate conclusions that can advance their work — from months to days and days to just hours — bringing new levels of speed and precision to research and development.

Building on Watson’s ability to understand nuances in natural language, Watson Discovery Advisor can understand the language of science, such as how chemical compounds interact, making it a uniquely powerful tool for researchers in life sciences and other industries.  

Researchers and scientists from leading academic, pharmaceutical and other commercial research centers have begun deploying IBM’s new Watson Discovery Advisor to rapidly analyze and test hypotheses using data in millions of scientific papers available in public databases. A new scientific research paper is published nearly every 30 seconds, or more than a million annually (Source: CiteSeerx). According to the National Institutes of Health, a typical researcher reads about 23 scientific papers per month, or nearly 300 per year, making it humanly impossible to keep up with the ever-growing body of scientific material available.
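The cited figures are easy to sanity-check with a quick back-of-the-envelope calculation; the short sketch below works through the arithmetic (one paper roughly every 30 seconds versus 23 papers read per month), with the rates taken directly from the sources quoted above.

```python
# Back-of-the-envelope check of the publication figures cited above:
# one new paper roughly every 30 seconds vs. a researcher reading
# about 23 papers per month. Figures are illustrative approximations.

SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

papers_published_per_year = SECONDS_PER_YEAR // 30  # one paper per ~30 s
papers_read_per_year = 23 * 12                      # 23 papers/month

print(papers_published_per_year)  # 1051200 -> "more than a million annually"
print(papers_read_per_year)       # 276 -> "nearly 300 per year"
```

The gap between the two numbers, roughly four orders of magnitude, is the core motivation for machine-assisted literature analysis.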

In 2013, the top 1,000 research and development companies spent more than $600 billion on research alone (Source: Strategy&). Progress can be slow, taking an average of 10 to 15 years for a promising pharmaceutical treatment to progress from the initial research stage into practice (Source: Pharmaceutical Research and Manufacturers of America). Using Watson Discovery Advisor, researchers can uncover new relationships and recognize unexpected patterns in data, with the potential to significantly improve and accelerate the discovery process in research and science.

“We’re entering an extraordinary age of data-driven discovery,” said Mike Rhodin, senior vice president, IBM Watson Group. “Today’s announcement is a natural extension of Watson’s cognitive computing capability. We’re empowering researchers with a powerful tool that will help increase the impact of the investments organizations make in R&D, leading to significant breakthroughs.”

Leading life sciences organizations, including Baylor College of Medicine, Johnson & Johnson and The New York Genome Center, are deploying Watson Discovery Advisor to advance discoveries in ongoing research projects.

  • In a retrospective, peer-reviewed study released this week by Baylor College of Medicine and IBM, scientists demonstrated a possible new path for generating scientific questions that may be helpful in the long-term development of new, effective treatments for disease. In a matter of weeks, biologists and data scientists using the Baylor Knowledge Integration Toolkit (KnIT), based on Watson technology, accurately identified proteins that modify p53, an important protein related to many cancers, a finding that could eventually improve the efficacy of drugs and other treatments. Watson analyzed 70,000 scientific articles on p53 to predict proteins that turn its activity on or off, a feat that would have taken researchers years to accomplish without Watson’s cognitive capabilities. This automated analysis led the Baylor cancer researchers to identify six potential proteins to target for new research. These results are notable considering that, over the last 30 years, scientists averaged one similar target-protein discovery per year.

“On average, a scientist might read between one and five research papers on a good day,” said Dr. Olivier Lichtarge, the principal investigator and professor of molecular and human genetics, biochemistry and molecular biology at Baylor College of Medicine. “To put this in perspective with p53, there are over 70,000 papers published on this protein. Even if I’m reading five papers a day, it could take me nearly 38 years to completely understand all of the research already available today on this protein. Watson has demonstrated the potential to accelerate the rate and the quality of breakthrough discoveries.”

  • Johnson & Johnson is collaborating with the IBM Watson Discovery Advisor team to teach Watson to read and understand scientific papers that detail clinical trial outcomes used to develop and evaluate medications and other treatments. This collaboration aims to accelerate comparative effectiveness studies of drugs, which help doctors match a drug with the right set of patients to maximize effectiveness and minimize side effects. Typically, comparative effectiveness studies are done manually, requiring three people to spend an average of 10 months (2.5 person-years) just to collect the data and prepare them for use before any analysis, hypothesis generation or validation can begin. In this research study, the team hopes to teach Watson to quickly synthesize the information directly from the medical literature, allowing researchers to immediately start asking questions about the data to determine the effectiveness of a treatment compared with other medications, as well as its side effects.
  • Sanofi is exploring how working with Watson can speed up the discovery of alternate indications for existing drugs (drug repurposing). Watson can understand and extract key information by reading millions of pages of scientific literature, then visualize relationships between drugs and other potential diseases they could target, providing supporting evidence each step of the way. Drug safety and toxicity are major drivers of the high failure rate in clinical development and trials. Sanofi is also exploring how Watson’s ability to understand, extract and organize toxicological information can enable researchers to make better-informed decisions with respect to candidate progression.
  • IBM Watson will support the analysis in the New York Genome Center’s clinical study to advance genomic medicine. The clinical study will initially focus on the clinical application of genomics to help oncologists deliver DNA-based treatment for glioblastoma, an aggressive form of brain cancer that kills more than 13,000 Americans each year. Despite tremendous discoveries about the genetic drivers of diseases like cancer over the past decade, the sheer volume of DNA data makes it difficult to translate that data into life-saving treatments. Based on results from the clinical study, IBM Watson could soon help scale up the availability of personalized treatment options.
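The literature-mining idea behind the Baylor study, scanning a large corpus of papers to surface proteins mentioned in connection with p53, can be caricatured as a simple co-occurrence count over abstracts. The sketch below is a hypothetical toy for illustration only, not the KnIT or Watson algorithm; the abstracts and candidate protein names are invented.

```python
# Toy literature-based discovery: rank candidate proteins by how often
# they co-occur with a target protein ("p53") in a corpus of abstracts.
# Illustrative sketch only, NOT the KnIT/Watson method; the abstracts
# and candidate set below are invented for demonstration.
from collections import Counter

TARGET = "p53"
CANDIDATES = {"MDM2", "ATM", "CHK2", "NQO1"}  # hypothetical candidates

abstracts = [
    "MDM2 binds p53 and promotes its degradation",
    "ATM phosphorylates p53 in response to DNA damage",
    "NQO1 stabilizes p53 under oxidative stress",
    "MDM2 inhibition restores p53 activity in tumor cells",
    "CHK2 signaling acts independently of apoptosis",
]

counts = Counter()
for text in abstracts:
    words = set(text.split())
    if TARGET in words:               # only count abstracts mentioning p53
        counts.update(words & CANDIDATES)

# Candidates mentioned most often alongside the target rank highest;
# here MDM2 appears with p53 in two abstracts, so it tops the list.
for protein, n in counts.most_common():
    print(protein, n)
```

Real systems replace raw co-occurrence with entity recognition, relationship extraction and evidence scoring, but the ranking intuition is the same: candidates that the literature repeatedly links to the target float to the top for human review.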

Industry Implications 

Discovery applies to many domains, such as medicine, law and finance, that all require deep insight into a large body of information and protocols. Cognitive computing allows human experts to interact with large bodies of data and research, as well as with the knowledge and insight of many other experts in their field. For example, Watson could be used to:

  • Accelerate a medical researcher’s ability to develop life-saving treatments for diseases by synthesizing evidence and removing reliance on serendipity
  • Enhance a financial analyst’s ability to provide proactive advice to clients
  • Improve a lawyer’s merger and acquisition strategy with faster, more comprehensive due diligence and document analysis
  • Accelerate a government analyst’s insight into security, intelligence, border protection and law enforcement
  • Create new food recipes. Chefs can use Watson to augment their creativity and expertise and discover new recipes; Watson learns the language of cooking and food by reading recipes, statistical, molecular and food-pairing theories, hedonic chemistry, and regional and cultural knowledge

Source: IBM
