Traditional research and Big Data applications are increasingly run on the same HPC system as the lines between their computational requirements blur and demand for dual-use capability grows, said Bob Braham, SGI CMO. He pointed to last month’s ramp-up of the latest SGI supercomputer at the Earthquake and Volcano Information Center at the University of Tokyo’s Earthquake Research Institute (ERI) as an example of the trend.
“Beginning 10-15 years ago, as researchers ventured into earthquake prediction, large-scale computing systems were needed to both house seismic waveform databases and support an overall increase in high-performance model calculations and numerical simulations as sophisticated monitoring and prediction systems came into place. Over time, additional levels of processing power were needed with the subsequent expansion into volcanic eruption prediction,” said Braham.
After the 2011 Tohoku earthquake and tsunami in Japan, which killed almost 16,000 people, researchers widened their focus to include earthquake and volcanic disaster mitigation, which demanded more CPU performance, memory capacity, and disk space. To accommodate these growing processing demands, ERI needed to replace its existing computer system.
“At very large, data-intensive research organizations we’re seeing HPC systems being used for more than just conducting specialized research,” said Braham.
SGI Japan’s relationship with ERI spans more than 15 years, beginning in 1999 when observational volcanic research began to include data-intensive predictive simulations and monitoring of tectonic activity. “We’ve deployed or upgraded five systems during that time, enabling incrementally more sophisticated research capability,” Braham said.
The most recent upgrade pairs an SGI UV 2000 shared-memory system with an SGI ICE X distributed-memory system. The result is a joint-use computing platform for large-scale scientific and technical computing, including model calculation and simulation in leading-edge seismological and volcanological studies.
Key attributes of the new system include:
- SGI VizServer with NICE Software delivers graphics-intensive, 3D visualization to ERI’s remote user community.
- The Linux-based, large-scale SGI computing platform can be leveraged to solve a range of complex problems using parallel computing on shared- or distributed-memory systems (see the sketch after this list).
- The SGI UV 2000 symmetric multiprocessing (SMP) system provides 1,024 cores using Intel Xeon E5-4600 v2 processors coupled with 8 terabytes (TB) of cache-coherent shared memory.
- The SGI ICE X distributed-memory system provides a 144-node cluster with 3,456 cores using Intel Xeon E5-2600 v3 processors.
- High-speed connections between nodes are delivered using 4x FDR InfiniBand.
- High-capacity storage is provided by a 500TB Lustre file system, with an SGI InfiniteStorage 5100 supplying 288TB for data backup.
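To make the shared-memory/distributed-memory distinction concrete, here is a minimal hybrid MPI + OpenMP sketch in C. This is not ERI’s code; the pi-integration workload and all sizes are illustrative assumptions. MPI ranks stand in for the ICE X cluster’s distributed-memory nodes, whose messages would travel over an InfiniBand fabric, while OpenMP threads stand in for cores sharing cache-coherent memory, as on a UV 2000-style SMP node.

```c
/* Hybrid MPI + OpenMP sketch (illustrative only): each distributed-memory
 * MPI rank uses shared-memory OpenMP threads to integrate
 * pi = integral of 4/(1+x^2) over [0,1] by the midpoint rule. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    const long nsteps = 100000000L;        /* total quadrature points (placeholder size) */
    const double h = 1.0 / (double)nsteps;
    double local = 0.0;

    /* Each MPI rank owns a contiguous slice of the interval;
     * the last rank absorbs any remainder. */
    long lo = rank * (nsteps / nranks);
    long hi = (rank == nranks - 1) ? nsteps : lo + nsteps / nranks;

    /* Shared-memory parallelism within the rank: threads split the
     * loop and combine partial sums via the reduction clause. */
    #pragma omp parallel for reduction(+:local)
    for (long i = lo; i < hi; i++) {
        double x = (i + 0.5) * h;          /* midpoint of subinterval i */
        local += 4.0 / (1.0 + x * x);
    }

    /* Distributed-memory step: combine per-rank sums across nodes. */
    double pi = 0.0;
    MPI_Reduce(&local, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("pi ~= %.12f (%d ranks x %d threads)\n",
               pi * h, nranks, omp_get_max_threads());

    MPI_Finalize();
    return 0;
}
```

Built with something like `mpicc -fopenmp pi.c -o pi` and launched as `OMP_NUM_THREADS=8 mpirun -np 4 ./pi`, each of the four ranks would apply eight threads to its slice of the integral before a single MPI_Reduce combines the partial sums, mirroring how a job might span the cluster’s nodes while exploiting shared memory within each one.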
More generally, “As big data becomes more important to running an overall business operation, organizations are moving toward using HPC to support the overall enterprise,” said Braham.