October 27, 2010
LAS VEGAS, Oct. 26 -- IBM (NYSE: IBM) today announced a new project in which researchers at Columbia University Medical Center will use IBM's streaming analytics technology to potentially detect severe complications in brain-injured patients up to 48 hours earlier than traditional methods.
For patients who have suffered a bleeding stroke from a ruptured brain aneurysm, recovery can involve serious complications. One of the most severe and frequent is delayed ischemia, a life-threatening condition in which the brain does not get enough blood to function properly. Currently, detectable symptoms appear only once blood flow has been significantly reduced, forcing medical professionals to be reactive rather than preventative in their treatment. In 20 percent of patients with this complication there are no observable symptoms at all, and doctors realize the patient needed treatment only after it is too late.
Developed at IBM Research Labs, IBM's streaming analytics technology analyzes large volumes of data in motion. Using this technology, medical researchers believe they may be able to uncover patterns in symptom progression that are not visible to the naked eye and possibly spot the onset of the condition up to 48 hours earlier than current methods. Detecting these early warning signs would give doctors the ability to plan and begin treatment sooner, or potentially stop the progression of the condition altogether.
"The ability to analyze massive volumes of medical data to uncover early warning signs for this life-threatening complication could lead to significant improvements in how this condition is treated," said Dr. Michael Schmidt, director of neuromonitoring and informatics, Neurological Intensive Care Unit (NICU), Columbia University Medical Center. "We need the ability to not only uncover the hidden data patterns in the lab but then take what we learn and use it in real time at the bedside for the benefit of the patient."
The first phase of the project involves uncovering patterns within the volumes of data that are related to the patient's complications. Using analytics software, the researchers are processing physiological data streams such as EEG feeds, blood pressure, blood oxygen levels, and temperature readings in conjunction with persistent data, such as lab test results, patient information, and symptoms reported by medical professionals and patients. The analysis of this information may be able to uncover hidden patterns in test results that are difficult to correlate without the help of analytics.
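This first, offline phase amounts to searching retrospective data for features that correlate with later complications. As a loose illustration only (the patient data, window size, and feature below are invented for this sketch, and IBM's actual pipeline is not described in the release), one could summarize each patient's physiological stream into a windowed feature and correlate it with the recorded outcome:

```python
# Hypothetical sketch of offline pattern discovery: correlate a
# windowed vital-sign feature with a later-observed complication flag.
# All names and numbers here are illustrative, not real patient data.
from statistics import mean

def window_feature(samples, size):
    """Mean of the last `size` readings -- a toy summary of a stream."""
    return mean(samples[-size:])

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Toy cohort: per-patient blood-pressure streams and whether delayed
# ischemia was later diagnosed (1) or not (0).
patients = [
    ([120, 118, 115, 110, 105], 1),
    ([118, 117, 116, 117, 118], 0),
    ([125, 120, 112, 108, 102], 1),
    ([119, 120, 121, 119, 120], 0),
]
features = [window_feature(bp, 3) for bp, _ in patients]
outcomes = [flag for _, flag in patients]
r = pearson(features, outcomes)
print(f"correlation between late-window BP and ischemia: {r:.2f}")
```

In practice a candidate feature like this would be tested against many variables at once and validated clinically before it could serve as a warning sign.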
Once the key hallmarks of this life-threatening condition are identified and validated, the second phase of the project will move to the neurological intensive care unit. There, researchers can gather data from patients in real time, testing the previously identified warning signs and providing medical professionals with quick insight into the condition of their patients.
Medical professionals working in a neurological intensive care unit measure and assess more than 200 variables when evaluating a patient. From heart rate and temperature to blood pressure and readings of brain and heart activity, they face a constant and complex stream of data. IBM's breakthrough streaming analytics software, IBM InfoSphere Streams, combines data from medical tests and equipment in the NICU with data from other sources, such as databases and images, and analyzes it in real time, giving medical professionals instant updates on a patient's condition and spotting trends and symptoms as they emerge.
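The bedside monitoring described here boils down to watching each variable's recent history and flagging readings that break the pattern. A minimal sketch of that idea, assuming a simple per-signal sliding window and a deviation threshold (this is an invented design for illustration, not InfoSphere Streams' actual API):

```python
# Minimal sliding-window anomaly sketch: keep recent readings per
# vital sign and flag a reading that drifts far from the window mean.
from collections import deque
from statistics import mean, stdev

class VitalMonitor:
    def __init__(self, window=10, threshold=3.0):
        self.window = window        # readings kept per signal
        self.threshold = threshold  # alert if |x - mean| > threshold * stdev
        self.history = {}           # signal name -> recent readings

    def observe(self, signal, value):
        """Feed one reading; return True if it looks anomalous."""
        buf = self.history.setdefault(signal, deque(maxlen=self.window))
        alert = False
        if len(buf) >= 3:  # need a few points before judging
            mu, sigma = mean(buf), stdev(buf)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                alert = True
        buf.append(value)
        return alert

monitor = VitalMonitor(window=5, threshold=2.5)
# Toy blood-oxygen feed: steady readings, then a sharp drop.
readings = [98, 97, 98, 99, 98, 97, 72]
alerts = [monitor.observe("SpO2", r) for r in readings]
print(alerts)  # only the final drop trips the alert
```

A production system would of course combine many such signals and clinically validated rules rather than a single threshold, but the window-and-flag structure is the core of real-time monitoring.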
"We are only at the beginning of what's possible with streaming analytics in healthcare such as for early intervention during heart attacks to minimize cardiac muscle damage or this innovative work to detect complications in stroke patients," said Nagui Halim, director and research scientist, IBM streaming analytics. "From healthcare organizations to telecommunications companies in Asia and government agencies in Europe, IBM streaming analytics is helping people use information in ways they never thought possible."
IBM InfoSphere Streams enables continuous, real-time filtering, correlation and analysis of massive volumes of information in motion to help improve business insights and decision making. The analytics software handles structured and unstructured streaming data sources such as voice, video, databases, market feeds, medical equipment feeds, satellite images and application data in real time. The software is a first-of-its-kind platform that combines more than 20 years of IBM information management expertise, eight years of development by IBM Research and more than 200 patents to create a powerful high-performance computing system.
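The filter-and-correlate pattern described above can be pictured as a pipeline of small operators over a live feed. A hedged sketch in plain Python generators (the patient store, feed contents, and operator names are all hypothetical; InfoSphere Streams expresses this with its own language and runtime):

```python
# Illustrative stream pipeline: enrich live device readings with
# static (persistent) patient records, then filter the combined
# tuples. Everything here is invented for illustration.
patient_records = {  # persistent data: a hypothetical patient store
    "p1": {"age": 54, "condition": "aneurysm"},
    "p2": {"age": 61, "condition": "observation"},
}

def readings():
    """Stand-in for a live medical-equipment feed."""
    yield {"patient": "p1", "signal": "BP", "value": 88}
    yield {"patient": "p2", "signal": "BP", "value": 121}
    yield {"patient": "p1", "signal": "SpO2", "value": 90}

def enrich(stream, records):
    """Correlate each streamed tuple with its stored patient record."""
    for r in stream:
        yield {**r, **records[r["patient"]]}

def low_bp(stream, floor=90):
    """Filter: keep only blood-pressure readings below `floor`."""
    return (r for r in stream if r["signal"] == "BP" and r["value"] < floor)

for alert in low_bp(enrich(readings(), patient_records)):
    print(alert["patient"], alert["value"], alert["condition"])
```

Because each stage is a generator, tuples flow through one at a time rather than being collected into batches, which is the essence of analyzing "data in motion."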
Streaming analytics software is part of IBM's more than $14 billion investment in business analytics, which includes 24 acquisitions as well as organic innovation. In addition, IBM has assembled 7,000 analytics consultants with industry expertise and opened a network of analytics centers of excellence around the world.
IBM is creating a smarter, more connected healthcare system that delivers better care with fewer mistakes, predicts and prevents diseases, and empowers people to make better choices. This includes integrating data so doctors, patients and insurers can share information seamlessly and efficiently. IBM also helps clients apply advanced analytics to improve medical research, diagnosis and treatment in order to improve patient care and help reduce healthcare costs.
For more information on and news from the IOD Conference and Business Analytics Forum: www.ibm.com/press/baiod2010.
To hear how IBM clients are using analytics to transform their business visit: http://www.youtube.com/user/ibmbusinessanalytics.
For more information about IBM, visit: http://www.ibm.com/smarterhealthcare.