Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

March 25, 2014

Texas Tech Receives NSF Grant to Develop Supercomputer Prototype

March 25 — The National Science Foundation has awarded a $500,000 grant to researchers at Texas Tech University to develop a new supercomputer prototype that could lead to more efficient data-intensive computing – and speed up the scientific discovery cycle.

Yong Chen, assistant professor of computer science and Director of Data-Intensive Scalable Computing Laboratory at Texas Tech University, is leading a team of researchers in a project titled, “Development of a Data-Intensive Scalable Computing Instrument (DISCI) for High Performance Computing.”

“High performance computers traditionally are designed for computation-intensive problems,” Chen said. “They are not a good fit for the increasingly important data-intensive applications.”

Imagine trying to use a 35-year-old computer to perform modern-day tasks, such as streaming a movie. It would be impossible because that device was designed only for computation.

“In computing technology, advancement in software has always lagged behind hardware,” said Rattikorn Hewett, department chair and professor of computer science. “Just like having a modern home running an old plumbing system, sophisticated high performance computers alone can’t perform well without advanced mechanisms for data movement and data access. Dr. Chen’s project attempts to unlock this problem with software solutions that would potentially have great impacts on anyone who uses data intensively.”

Chen says existing supercomputers exhibit similar behavior, but on a much larger scale, for many data-intensive scientific and enterprise computing problems.

“Data generation has become so cheap and so easy,” Chen said. “Almost everyone has a smartphone capable of taking pictures or video. Gene sequencers have never been so cheap. The proliferation of sensors, embedded devices and mobile devices has made generating data easier than ever before. The problem comes with data storage, retrieval and utilization.”

The team’s goal is to create a supercomputer that will enable academic departments, cross-disciplinary units and collaborators to analyze their data and put it to use with accuracy, speed and efficiency.

Data-intensive applications spend the majority of their time manipulating data rather than doing actual computing; the computing time is significantly less than the data access and movement time.

“We are delighted that Dr. Yong Chen has received this important award for Texas Tech,” said Robert Duncan, Vice President for Research. “We are in the age of massive data sets that stream from cameras and sensors, and from complex instruments that are used in everything from research to health care. This will help Texas Tech in our quest to transform huge data sets into functional knowledge.”

The team’s goal is to develop a supercomputer prototype that will make data-intensive applications run more efficiently for future generations.

“We deeply appreciate the valuable support from the Office of the Vice President for Research, Whitacre College of Engineering, Department of Computer Science, and High Performance Computing Center that made this award possible,” Chen said.

—–

Source: Texas Tech University