Tag: DataDirect Networks
“Today you buy Tylenol (500 mg for almost everyone), but it doesn’t positively impact everybody, because genetically we are all different,” says Rajiv Nema, Director of Marketing at SAP (HANA, Mobile Innovations). “$70B is spent on cancer medicine in the United States alone, and 40% – almost $30B – of that is wasted because the medicine does not positively impact the patient.”
For the past few decades, the norm among large government labs, academic research facilities, and top commercial sites has been to deploy one large system per site at a time. More recently, however, the growing diversity of applications and end-user requirements, combined with non-overlapping budgets and expanding technology lifecycles, has been driving a move toward multi-cluster environments.
<img src="http://media2.hpcwire.com/hpcwire/dansbury.jpg" alt="" width="95" height="93" />With $45 million in government funding, the research center will develop software to make supercomputers more efficient and to help process data from the SKA, the world’s largest radio telescope. The technology is being developed with industry partners and will be made available to scientific and industrial organizations in the UK.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/AlexBouzari_small.jpg" alt="" width="97" height="96" />DataDirect Networks (DDN) is one of the few specialist storage vendors that has weathered the storm of industry consolidation. The company has managed to remain independent and focused on its HPC business, while others like BlueArc, Engenio, XtremIO, and Whamcloud got scooped up by larger companies with more diverse business goals. We recently spoke with DDN CEO Alex Bouzari to get his take on the industry churn and other trends that are reshaping the storage market.
Successful oil and gas exploration today requires ever-faster upstream processing. To shorten the compute time needed to get actionable information, organizations need to reduce survey processing run times from months to weeks and be capable of scaling to handle the explosive data growth.
Life sciences can mean different things to different people. In genomic research, it refers to the art of sequencing; in biopharma, it covers molecular dynamics and protein docking; and in the clinic, electronic records. All three markets, however, have one thing in common: the sequencing of the human genome and the control, analysis, and distribution of that data. Today, with the continued decrease in sequencing costs, life sciences research is moving from beakers to bytes and increasingly relies on the analysis of large volumes of data.
Today, organizations face an exponential increase in the amount of data being created. Managing this data, coupled with the growing complexity of storage infrastructures, is creating significant challenges for IT managers. While the cost of maintaining storage infrastructures continues to rise, headcount and budgets remain fixed. What is needed is an advanced management platform that reduces the cost and complexity of storage management.
While much attention has been focused on reducing application latencies, notably by using flash memory, a suitably high-performance and scalable storage solution can significantly accelerate research and post-trade analytics on the very same servers, and can improve the quality of results by accommodating more data in modeling and analytics calculations.
On Wednesday, DataDirect Networks unveiled the SFA12K, the third generation of its Storage Fusion Architecture (SFA) platform. Like previous SFA offerings, this one is aimed at super-sized HPC machines, but it also targets the big data applications that are spreading across the Internet and infiltrating enterprise datacenters.
NetApp flexed its newly acquired supercomputing muscles this week when it announced it would be supplying one of the largest Lustre storage systems in the world for the Sequoia supercomputer, to be installed at Lawrence Livermore National Laboratory next year. NetApp’s E-Series storage, inherited through the company’s purchase of LSI’s Engenio business, will provide 55 petabytes of disk arrays for the 20-petaflop Sequoia machine.