May 13, 2009
New software delivers real-time business analytics platform to enable smarter business decisions
NEW YORK, May 13 -- At its annual investor briefing today, IBM announced the availability of its unique "stream computing" software that enables massive amounts of data to be analyzed in real time, delivering extremely fast, accurate insights to enable smarter business decision-making. The new software is called IBM System S.
IBM also announced today the opening of the IBM European Stream Computing Center, headquartered in Dublin, Ireland, which will serve as a hub of research, customer support and advanced testing for what is expected to be a growing base of European clients who wish to apply stream computing to their most challenging business problems.
Additionally, IBM is making System S trial code available at no cost to help clients better understand the software's capabilities and how they can take advantage of it for their business. This trial code includes developer tools, adapters and software to test applications.
System S is built for perpetual analytics -- utilizing a new streaming architecture and breakthrough mathematical algorithms to create a forward-looking analysis of data from any source -- narrowing down precisely what people are looking for and continuously refining the answer as additional data becomes available.
For example, System S can analyze hundreds or thousands of simultaneous data streams -- stock prices, retail sales, weather reports, etc. -- and deliver nearly instantaneous analysis to business leaders who need to make split-second decisions. The software can help all organizations that need to react to changing conditions in real time, such as government and law enforcement agencies, financial institutions, retailers, transportation companies, healthcare organizations, and more.
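To make the streaming model concrete: unlike a query over stored tables, a stream application keeps only a bounded window of recent data in memory and updates its answer with every arriving item. The Python sketch below is purely illustrative -- System S applications are written with IBM's own stream-programming tooling, and the feed, function names and thresholds here are invented -- showing a moving-average monitor that flags unusual ticks the instant they arrive:

```python
# Illustrative sketch of windowed stream analytics (not System S code):
# maintain a sliding window over an unbounded tick stream and flag a
# price the moment it deviates sharply from its recent average.
from collections import deque
from itertools import islice
import random

def price_feed():
    """Simulated unbounded stream of stock ticks."""
    price = 100.0
    while True:
        price += random.gauss(0, 0.5)
        yield price

def monitor(stream, window_size=50, threshold=3.0):
    window = deque(maxlen=window_size)  # only the window is kept in memory
    for tick in stream:
        if len(window) == window.maxlen:
            mean = sum(window) / len(window)
            std = (sum((x - mean) ** 2 for x in window) / len(window)) ** 0.5
            if std > 0 and abs(tick - mean) > threshold * std:
                print(f"alert: {tick:.2f} deviates from recent mean {mean:.2f}")
        window.append(tick)

monitor(islice(price_feed(), 10_000))  # analyze 10,000 ticks, store none
```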
"System S software is another example of IBM helping clients through our long-term investments in business analytics and advanced mathematics," said Dr. John E. Kelly III, IBM senior vice president and director of IBM Research. "The ability to manage and analyze incoming data in real time, and use it to make smarter decisions, can help businesses and other enterprises differentiate themselves."
The enormous potential of this technology represents a significant advancement in information technology: using computers to rapidly analyze multiple streams of diverse, unstructured and incompatible data sources in real time, enabling very fast, accurate and insightful decisions. As the world becomes increasingly interconnected and instrumented, the amount of data is skyrocketing -- not just structured information found in databases, but unstructured, incompatible data captured from electronic sensors, web pages, email, blogs and video. By 2010, the amount of digital information is expected to reach 988 exabytes, roughly the equivalent of a stack of books from the Sun to Pluto and back.
Traditional computing models retrospectively analyze stored data and cannot continuously process massive amounts of incoming data streams that affect critical decision-making. System S is designed to help clients become more "real-world aware," seeing and responding to changes across complex systems.
This first-of-a-kind software platform features a combination of more than 20 years of IBM information management expertise, five years of development by IBM Research, and more than 200 patents to create a powerful high-performance computing system that is adaptable to run on a variety of hardware. This software, already in use by select clients worldwide, illustrates how IBM Research can help expand the company's opportunity for growth and provide powerful new solutions for clients.
Uppsala University and the Swedish Institute of Space Physics are using System S to better understand "space weather," which can influence energy transmission over power lines, communications via radio and TV signals, airline and space travel, and satellites. By using the LOIS Space Center radio facility in Sweden to analyze radio emissions from space in three dimensions, scientists use this technology to compile vast amounts of data and extract predictions on activities in space. Since researchers need to measure signals from space over large time spans, the raw data generated by even one antenna quickly becomes too large to handle or store. System S analyzes the data immediately as it streams from the sensors. Over the next year or so, the project is expected to perform analytics on at least 6 gigabytes per second, or 21,600 gigabytes per hour -- the equivalent of all the Web pages on the Internet.
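The arithmetic makes clear why the raw stream cannot simply be archived: at 6 gigabytes per second, a single day of data already comes to roughly 518 terabytes (6 GB/s x 86,400 s). A minimal reduce-on-ingest sketch -- with an invented antenna_chunks feed and summary statistics standing in for the real LOIS pipeline -- looks like this:

```python
# Reduce-on-ingest sketch (hypothetical stand-in for the LOIS pipeline):
# raw antenna samples are summarized as they arrive, so only compact
# statistics are retained instead of the unstorable raw stream.
import numpy as np

def antenna_chunks(n_chunks=100, samples_per_chunk=2**16):
    """Simulated raw radio samples, delivered chunk by chunk."""
    rng = np.random.default_rng(0)
    for _ in range(n_chunks):
        yield rng.standard_normal(samples_per_chunk)

summaries = []
for chunk in antenna_chunks():
    spectrum = np.abs(np.fft.rfft(chunk))
    summaries.append({
        "peak_bin": int(spectrum.argmax()),       # dominant frequency bin
        "total_power": float((chunk ** 2).sum())  # energy in this chunk
    })
# raw chunks are discarded; only `summaries` (a few bytes each) persist
```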
The Marine Institute of Ireland plans to use System S to better understand fragile marine ecosystems. As a core component of this collaboration, a real-time distributed stream-analytics fabric for environmental monitoring and management is under development. By processing large volumes of underwater acoustic data in real time, the Institute extracts useful information such as species identification, population counts and the locations of marine life. Future extensions to the analytics platform, using acoustic data sampled at alternate frequencies, might allow correlation and modeling in areas such as weather and marine traffic, extending the value of the recently announced SmartBay project.
TD Securities and IBM collaborated to develop a revolutionary prototype of the world's fastest automated options trading system using System S, achieving a 21-fold improvement in the volume of data the trading system can consume.
IBM and the University of Ontario Institute of Technology (UOIT) are testing System S to help doctors detect subtle changes in the condition of critically ill premature babies. The software ingests a constant stream of biomedical data, such as heart rate and respiration, along with clinical information about the babies. Monitoring "preemies" as a patient group is especially important because certain life-threatening conditions, such as infection, may be detected up to 24 hours in advance by observing changes in physiological data streams. The type of information System S produces is not available today: physicians monitoring preemies currently rely on a paper-based process that involves manually reviewing the readings from various monitors and getting feedback from the nurses providing care.
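Conceptually, such early warning amounts to watching for sustained drift in a vital-sign stream relative to its longer-term baseline. The sketch below is an invented illustration of that idea -- a simple two-window drift check over simulated heart-rate readings -- and not the actual IBM/UOIT clinical analytics:

```python
# Illustrative baseline-drift detector over a vital-sign stream
# (invented example; not the IBM/UOIT clinical algorithm).
from collections import deque

def drift_alerts(heart_rates, baseline_len=120, recent_len=15, delta=10.0):
    """Yield a warning when the recent average heart rate drifts
    more than `delta` bpm away from the longer-term baseline."""
    baseline = deque(maxlen=baseline_len)
    recent = deque(maxlen=recent_len)
    for hr in heart_rates:
        baseline.append(hr)
        recent.append(hr)
        if len(baseline) == baseline.maxlen:
            drift = sum(recent) / len(recent) - sum(baseline) / len(baseline)
            if abs(drift) > delta:
                yield f"possible deterioration: drift of {drift:+.1f} bpm"

# usage: stream once-per-second readings through the detector
readings = [150] * 200 + [165] * 30   # simulated heart rates (bpm)
for alert in drift_alerts(readings):
    print(alert)
```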
For more information about IBM's System S software, visit http://www-01.ibm.com/software/data/infosphere/streams/. System S is available as part of the InfoSphere product line.
For more information about IBM (NYSE: IBM), visit http://www.ibm.com/think.