Tag: big data analytics
The “big data” revolution is upon us, fed by the need in both the public and private sectors to quickly analyze large datasets for important patterns and trends. With big data analysis, ecommerce vendors can target customers more precisely, financial analysts can quickly spot changing market conditions, manufacturers can tune logistics planning, and the list goes on. All of them need powerful, easy-to-use analysis tools to maintain a competitive edge.
Today, organizations are facing an exponential increase in the amount of data being created. The challenge of managing this data successfully, coupled with the growing complexity of storage infrastructures, is creating significant difficulties for IT managers. While the cost of maintaining storage infrastructures continues to rise, headcounts and budgets remain fixed. What is needed is an advanced management platform that reduces the cost and complexity of storage management.
Organizations today routinely perform multi-step analyses on large volumes of diverse datasets to derive actionable information for critical decisions. These operations must be carried out in ever-shorter time spans to be of value. As a result, organizations need new high performance computing (HPC) capabilities to ensure analysis workflows run efficiently and cost-effectively. And it’s not your father’s HPC. Increasingly, what’s needed is a more commercially oriented HPC solution, one built on an enterprise-grade infrastructure.
Supercomputer maker Cray has posted a modest loss for the first quarter of 2011 and lowered the low end of its revenue expectations for the year by $20 million. In a conference call with investors, Cray CEO Peter Ungaro attributed most of this to a slowdown in government funding, as countries retreat from the spending spree of the last couple of years. Despite that, Ungaro and company are still aiming for a profitable year as they prepare to roll out new supercomputer offerings in the second half of 2011.