AI prodigy and former Jeopardy champion Watson is ready for a career in medicine.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/Convey_boards.jpg" alt="" width="98" height="85" />Last week at SC12 in Salt Lake City, Convey pulled the lid off its MX big data-driven architecture, designed for graph analytics problems, which were at the heart of the show's unmistakable data-intensive computing thrust this year. The new MX line is built to exploit massive degrees of parallelism while efficiently handling hard-to-partition big data applications.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/bigdatagraphic_132x.jpg" alt="" width="75" height="105" />Big data is all the rage these days. It is the subject of a recent Presidential Initiative, has its own news portal, and, in the guise of Watson, is a game show celebrity. Big data has also caused concern in some circles that it might sap interest and funding from the exascale computing initiative. So, is big data distinct from HPC – or is it just a new aspect of our evolving world of high-performance computing?
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/Panasas_on_background.bmp" alt="" width="120" height="79" />Panasas has launched ActiveStor 14, the company's fifth-generation storage appliance aimed at high performance computing. The new offering adds solid state drives (SSDs) to what has been an almost exclusively hard disk drive (HDD) based NAS storage line-up. The inclusion of SSDs in the company's flagship offering is further proof that flash memory has become a mainstream storage technology for accelerating HPC workloads.
While much attention has been focused on reducing application latencies, notably through the use of flash memory, a suitably high-performance and scalable storage solution can significantly accelerate research and post-trade analytics on the very same servers, and can improve the quality of results by allowing more data to be incorporated into modeling and analytics calculations.
According to a new report from Saugatuck Technology, cloud-based business analytics is poised for huge growth over the next two years.
The sequel to SGI’s UV supercomputer has arrived. Dubbed UV 2, the new platform doubles the number of cores and quadruples the memory that can be supported under a single system. The product, which will be officially announced next week at the International Supercomputing Conference in Hamburg, represents the first major revision of SGI’s original UV, which the company debuted in 2009.
IBM and Citigroup introduce the Jeopardy-winning supercomputer to the financial industry.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/uRiKA_top.bmp" alt="" width="94" height="93" />For the first time in its history, Cray has built something other than a supercomputer. On Wednesday, the company's newly hatched YarcData division launched "uRiKA," a hardware-software solution aimed at real-time knowledge discovery with terascale-sized data sets. The system is designed to serve businesses and government agencies that need to do high-end analytics in areas as diverse as social networking, financial management, healthcare, supply chain management, and national security.
A recent Forrester report predicts that by 2020 the vast cloud landscape will be under the control of a small number of companies.