October 17, 2011
Under a strategic distribution agreement, SGI will ship Cloudera's Distribution Including Apache Hadoop and the Cloudera Enterprise management suite factory-installed on SGI Hadoop clusters
PALO ALTO, Calif. and FREMONT, Calif., Oct. 17 -- Cloudera Inc., the leading provider of Apache Hadoop-based data management software and services, and SGI (NASDAQ: SGI), the trusted leader in technical computing, today jointly announced that the companies have signed an agreement for SGI to distribute Cloudera software pre-installed on SGI Hadoop clusters. SGI, which recently set a world-record performance benchmark for TeraSort data processing and analysis leveraging Cloudera's Distribution Including Apache Hadoop (CDH) and is a member of the Cloudera Connect Partner Program, will resell and offer level-one support for Cloudera software and services -- including Cloudera University training courses -- to its customers. The relationship will also enable the two companies to jointly build, sell and deploy integrated, high-performance Apache Hadoop-based commercial solutions.
Apache Hadoop is a powerful and disruptive open source technology that addresses the economic, flexibility and scalability issues surrounding massive amounts of enterprise data and enables actionable insights to be derived from structured and unstructured data sets. Hadoop, which forms the infrastructure foundation of many of the world's leading social media companies, including Facebook, LinkedIn and Twitter, has rapidly become a leading solution to the new challenges generated by Big Data.
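The programming model at Hadoop's core is MapReduce: a map phase turns raw records into key/value pairs, a shuffle groups those pairs by key, and a reduce phase aggregates each group. The sketch below illustrates that pattern with the canonical word-count example in plain Python; it is not Hadoop's actual Java API, just an assumed minimal model of the three phases.

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in an input line.
    for word in line.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Shuffle phase: group all values by key, as Hadoop does between
    # the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reducer(key, values):
    # Reduce phase: aggregate the grouped values for one key.
    return key, sum(values)

def word_count(lines):
    # Run all three phases over an in-memory "dataset" of text lines.
    pairs = [kv for line in lines for kv in mapper(line)]
    return dict(reducer(k, vs) for k, vs in shuffle(pairs).items())

print(word_count(["big data big insights", "data at scale"]))
# → {'big': 2, 'data': 2, 'insights': 1, 'at': 1, 'scale': 1}
```

In a real Hadoop cluster the same mapper and reducer logic is distributed across many nodes, with the framework handling the shuffle, fault tolerance and data locality.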
Together, SGI Hadoop clusters and Cloudera's software, services and support form a complete, end-to-end solution for enterprises deploying Apache Hadoop in performance-intensive environments. As the global leader in technical computing, SGI was among the first technology vendors to embrace and proliferate the use of Apache Hadoop in the federal and enterprise sectors, and is currently running the largest Hadoop clusters in the world. Cloudera pioneered the use of Apache Hadoop in business applications and was the first to make Hadoop enterprise-ready, delivering best-of-breed management software, support and training services. CDH is the most widely deployed Hadoop distribution in both commercial and non-commercial environments, bundling 100 percent pure open source Apache Hadoop with other leading open source components in the Hadoop stack.
"We understand the power of Hadoop. Since the technology's inception, we have successfully deployed tens of thousands of Hadoop servers to our customers," said Bill Mannel, vice president of product marketing at SGI. "Leveraging Cloudera's Distribution Including Apache Hadoop together with our SGI Hadoop Cluster, we achieved a world-record Hadoop benchmark for data processing and analysis -- 81 percent faster than the competition. CDH, combined with Cloudera's management suite, puts the promise and potential of Hadoop -- and the complete Hadoop stack -- within reach. We are pleased to work with Cloudera, and together, we are enabling our mutual customers to streamline the path to putting Hadoop to work for their businesses."
"The combination of Cloudera's market-leading software with SGI's first-class products and reputation in the HPC market enables global access and delivery capacity for Apache Hadoop in stalwart HPC verticals, including defense, intelligence, research and telecommunications," said Ed Albanese, Head of Business Development for Cloudera. "This partnership enables Cloudera to better serve a segment of customers accustomed to factory-installed products and solution-oriented delivery. We're pleased that SGI has selected Cloudera products and will offer these products as a bundled component of their proven server line."
SGI, the trusted leader in technical computing, is focused on helping customers solve their most demanding business and technology challenges. Visit sgi.com for more information.
Connect with SGI on Twitter (@sgi_corp), YouTube (youtube.com/sgicorp), and LinkedIn.
Cloudera, the leader in Apache Hadoop-based software and services, enables data-driven enterprises to easily derive business value from all their structured and unstructured data. Cloudera's Distribution Including Apache Hadoop (CDH), available to download for free at www.cloudera.com/downloads, is the most comprehensive, tested, stable and widely deployed distribution of Hadoop in commercial and non-commercial environments. For the fastest path to reliably using this completely open source technology in production for Big Data analytics and answering previously unaddressable big questions, organizations can subscribe to Cloudera Enterprise, comprising Cloudera Support and a portfolio of software including Cloudera Management Suite. Cloudera also offers consulting services, training and certification on Apache technologies. As the top contributor to the Apache open source community, with tens of thousands of nodes under management across customers in financial services, government, telecommunications, media, web, advertising, retail, energy, bioinformatics, pharma/healthcare, university research, oil and gas and gaming, Cloudera's depth of experience and commitment to sharing expertise are unrivaled. www.cloudera.com
Source: Cloudera; SGI