Tag: big data analytics

Energy Giant Vestas Harnesses HPC and Analytics for Renewables

Sep 21, 2016 |

The energy industry was an early adopter of supercomputing; in fact, energy companies operate the most powerful supercomputers in the commercial world. And although HPC in the energy sector is almost exclusively associated with seismic workloads, it also plays a critical role in renewables, reflecting the growing maturity of that vertical. The largest Read more…

SGI Signs Cisco as Reseller of SGI UV 300H SAP HANA Line

Jun 29, 2016 |

SGI continued its drive into the enterprise by signing Cisco Systems as a reseller of its shared-memory, single node UV 300H server line. The new deal, announced yesterday in a company blog, mirrors a similar reseller arrangement struck with Dell roughly one year ago. Back in February, SGI signed an OEM agreement with Hewlett Packard Read more…

IBM Launches Apache Spark Development Environment

Jun 8, 2016 |

IBM, which has bet big on Apache Spark as a kind of analytics operating system ($300 million investment), yesterday announced the first cloud-based development environment for near real-time, high-performance analytics using Apache Spark and a variety of tools from IBM and others. According to IBM, the new set of capabilities, available on IBM Bluemix Read more…

CMU and Boeing Establish Aerospace Data Analytics Lab

Oct 1, 2015 |

Carnegie Mellon University and The Boeing Company (NYSE: BA) announced plans today to establish the Boeing/Carnegie Mellon Aerospace Data Analytics Lab, a new academic research initiative that will leverage the university’s leadership in machine learning, language technologies and data analytics. This is more evidence of the collision between big data and HPC spurring academic-industry collaboration. Read more…

How Big Data Analytics Supports Tyrone’s ISP Customers in India

Jun 8, 2015 |

The big data revolution is sweeping across India, transforming engineering, science, healthcare, finance and every other aspect of business and society. In particular, one vital segment of the nation’s economy, the Internet Service Providers (ISPs), is seeking better ways to use the massive amounts of information they and their customers generate to provide enhanced services Read more…

Tulane Accelerates Discovery with Hybrid Supercomputer

Dec 16, 2014 |

The rich culture and distinctive charm of the city of New Orleans served as the backdrop for this year’s annual Supercomputing Conference (SC14). If you haven’t been before, residents of the Big Easy will urge you to visit the uptown campus of Tulane University. Renowned for its beautiful trees and landscaping, the university is also a prominent Read more…

Using In-Memory Computing to Simplify Big Data Analytics

Oct 2, 2012 |

The “big data” revolution is upon us, fed by the need in both the public and private sectors to quickly analyze large datasets for important patterns and trends. With big data analysis, ecommerce vendors can target customers more precisely, financial analysts can quickly spot changing market conditions, and manufacturers can tune logistics planning, and the list goes on. All of them need powerful, easy-to-use analysis tools to maintain a competitive edge.

Simplifying Big Data Storage Management

Sep 10, 2012 |

Today, organizations are facing an exponential increase in the amount of data being created. The ability to successfully manage this data, coupled with the growing complexity of storage infrastructures, is creating significant challenges for IT managers. While the cost of maintaining storage infrastructures continues to increase, headcount and budgets remain fixed. What is needed is an advanced management platform that reduces the cost and complexity of storage management.

Big Data, Analytics and Workflow

Sep 19, 2011 |

Organizations today routinely perform multi-step analyses on large volumes of diverse datasets to derive actionable information for critical decisions. These operations must be carried out in ever-shorter time spans to be of value. As a result, organizations need new high performance computing (HPC) capabilities to ensure analysis workflows run efficiently and cost-effectively. And it’s not your father’s HPC. Increasingly, what’s needed is a more commercially oriented HPC solution, one built on an enterprise-grade infrastructure.

Pullback on Government Spending Slows Cray Momentum

May 10, 2011 |

Supercomputer maker Cray has posted a modest loss for the first quarter of 2011 and downgraded its low-end revenue expectations for the year by $20 million. In a conference call with investors, Cray CEO Peter Ungaro blamed most of this on a slowdown in government funding, as countries retreat from the spending spree of the last couple of years. Despite that, Ungaro and company are still aiming for a profitable year as they prepare to roll out new supercomputer offerings in the second half of 2011.