Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them


Tag: Hadoop

HPC and Big Data: A “Best of Both Worlds” Approach

Mar 31, 2014 |

While they may share a number of overarching challenges, data-intensive computing and high performance computing have some rather different considerations, particularly in terms of management, emphasis on performance, storage, and data movement. Still, there is plenty of room for the two areas to merge, according to Indiana University’s Dr. Geoffrey Fox. Fox and his Read more…

Shaking the HPC In-Memory Stack

Mar 3, 2014 |

Many who have been in HPC for some years will remember GridGain, the in-memory computing company that has found success at a number of commercial and academic high performance computing sites since its official launch in 2010, an effort backed by an initial $2.5 million investment, followed by another boost last year with a Read more…

Adapting Hadoop to HPC Environments

Feb 14, 2014 |

MapReduce is well known for its relative ease of use in today’s ubiquitous world of parallelism. The beauty of the model is in its ability to absolve or abstract away details of parallelism, fault tolerance, synchronization, and input management from the user. A user typically writes her algorithm using two uniform functions — a map Read more…
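For readers new to the model, a minimal sketch of those two functions follows, written against the standard Hadoop Java MapReduce API with the canonical word-count job standing in as the example. The sketch is illustrative only and is not drawn from the article; a Job driver class would normally accompany it.

    // Minimal sketch: the "map" and "reduce" functions of a word-count job,
    // using the org.apache.hadoop.mapreduce API. Illustrative, not from the article.
    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    public class WordCount {

        // The map function: emits (word, 1) for every token in an input line.
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer tokens = new StringTokenizer(value.toString());
                while (tokens.hasMoreTokens()) {
                    word.set(tokens.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // The reduce function: sums the counts emitted for each distinct word.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }
    }

The framework handles input splitting, shuffling the map output to reducers, and recovering from failed tasks, which is precisely the abstraction the article describes.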

How HPC is Hacking Hadoop

Feb 11, 2014 |

Although the trend may be quiet and distributed across only a relative few supercomputing sites, Hadoop and HPC are hopping hand-in-hand more frequently. These two technology areas aren’t necessarily made for one another; there are limitations in what Hadoop can do. But a stretch of recent research has been pushing the possibilities, especially when it comes to Read more…

Cray Advances Hadoop for HPC

Feb 4, 2014 |

In a recent blog entry, Mike Boros, Hadoop Product Marketing Manager at Cray, Inc., writes about the company’s positioning of Hadoop for scientific big data. Invoking the old adage that “when the only tool you have is a hammer, every problem begins to resemble a nail,” Boros suggests that the Law of the Instrument may be true Read more…

SC13 Spotlights Three Main Trends

Dec 18, 2013 |

Collecting my thoughts after a thrilling and enlightening Supercomputing Conference in Denver, I want to discuss some of the key trends and highlights I observed. The annual conference is always a hotbed of product announcements and this year, the 25th running, was no exception. There are many articles covering the key announcements, so I instead want Read more…

Cray Targets Oil and Gas Sector’s Big Data Needs

Sep 25, 2013 |

Supercomputer-maker Cray is helping oil and gas companies benefit from the most advanced reservoir modeling approach yet. Called Permanent Reservoir Monitoring, or PRM, the technique requires innovative data warehousing and data analysis techniques.

Accelerate Hadoop MapReduce Performance using Dedicated OrangeFS Servers

Sep 9, 2013 |

Recent tests performed at Clemson University achieved a 25 percent improvement in Apache Hadoop TeraSort run times by replacing the Hadoop Distributed File System (HDFS) with an OrangeFS configuration using dedicated servers. Key components included an extension of the MapReduce “FileSystem” class and a Java Native Interface (JNI) shim to the OrangeFS client. No modifications to Hadoop were required, and existing MapReduce jobs need no changes to utilize OrangeFS.
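That kind of swap is possible because Hadoop resolves file systems through pluggable implementations declared in its configuration. Below is a minimal sketch of how an alternative FileSystem implementation is typically wired in through Hadoop’s standard Configuration API; the class name, “ofs://” URI scheme, property key, and server address are illustrative assumptions, not the actual OrangeFS shim details.

    // Sketch: pointing Hadoop at a custom FileSystem implementation without
    // modifying Hadoop itself. Names below are hypothetical placeholders.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class OrangeFsConfigSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Map the "ofs" URI scheme to a custom FileSystem subclass
            // (hypothetical class name; normally set in core-site.xml).
            conf.set("fs.ofs.impl", "org.orangefs.hadoop.OrangeFileSystem");

            // Point the default file system at the dedicated servers, so
            // unmodified MapReduce jobs read and write through it instead of HDFS.
            // Host and port here are illustrative.
            conf.set("fs.defaultFS", "ofs://orangefs-head:3334/");

            FileSystem fs = FileSystem.get(conf);
            System.out.println("Active FileSystem: " + fs.getClass().getName());
            System.out.println("Working directory: " + fs.getWorkingDirectory());
        }
    }

Because existing jobs obtain their file system through this same configuration lookup, they run unchanged once the alternative implementation is on the classpath.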

Cray Bundles Intel’s Hadoop with CS300 Line of Supercomputers

Jun 20, 2013 |

This month, Cray will begin delivery of a new big data analytics cluster, a version of its entry-level CS300 system that has been optimized to run Intel’s Hadoop distribution. Cray says the new system will provide customers with a “turnkey” Hadoop cluster that can tackle big data problems that would be difficult to solve using commodity hardware.

Cray Cracks Commercial HPC Code

Jun 20, 2013 |

During a conversation this week with Cray CEO Peter Ungaro, we learned that the company has managed to extend its reach into the enterprise HPC market quite dramatically, at least in supercomputing business terms. With steady growth into these markets, however, the focus on hardware versus the software side of certain problems for such users is…