An ambitious astronomy effort designed to peer back to the origins of the universe and map the formation of galaxies is underpinned by an emerging memory technology that seeks to move computing resources closer to huge astronomy data sets.
The Square Kilometre Array (SKA) is an international initiative to build the world's largest radio telescope. A "precursor project" in the South African desert, called MeerKAT, consists of 64 "receptor" satellite dishes, each about 44 feet across. The array gathers and combines faint radio signals used to create images of distant galaxies.
Once combined with other sites, SKA would be capable of peering back further in time than any other Earth-based observatory. As with most advanced science projects, SKA presents unprecedented data processing challenges, with daily data volumes reaching 1 exabyte. "The data volume is becoming overwhelming," astronomer Simon Ratcliffe noted during a webcast this week.
In response, Micron Technology Inc. has come up with a processing platform called the Hybrid Memory Cube (HMC) to handle the growing data bottleneck. The memory specialist combined its fast logic process technology with new DRAM designs to boost badly needed bandwidth in its high-density memory system.
Steve Pawlowski, Micron’s vice president of advanced computing, claimed its memory platform delivers as much as a 15-fold increase in bandwidth, a capability that addresses next-generation networking and exascale computing requirements.
Applications such as SKA demonstrate “the ability to put [computing] at the edge” to access the most relevant data, Pawlowski added.
The radio telescope array uses a front-end processor to convert faint analog radio signals to digital; those signals are then processed using FPGAs. Memory resources needed to make sense of all that data can be distributed using relatively simple algorithms, according to Francois Kapp, a systems engineer at SKA South Africa. The challenge, Kapp noted, is operating the array around the clock while meeting "increasing depth and width of memory" requirements. "You can't just add more memory to increase the bandwidth," he noted, especially as FPGAs move to faster interfaces.
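The "relatively simple algorithms" Kapp mentions for distributing memory resources might resemble the following sketch. The round-robin policy, channel IDs, and bank count here are illustrative assumptions, not details of SKA's actual design:

```python
# Hypothetical sketch of distributing digitized frequency channels across
# memory banks with a simple round-robin rule. The bank count and channel
# layout are assumptions for illustration, not SKA parameters.

NUM_BANKS = 8  # assumed number of memory banks


def assign_bank(channel_id: int, num_banks: int = NUM_BANKS) -> int:
    """Map a frequency channel to a memory bank (round-robin)."""
    return channel_id % num_banks


def distribute(samples_by_channel: dict) -> dict:
    """Group each channel's samples under its assigned bank."""
    banks = {b: [] for b in range(NUM_BANKS)}
    for channel, samples in samples_by_channel.items():
        banks[assign_bank(channel)].append((channel, samples))
    return banks
```

The appeal of such a scheme is that the mapping is stateless and cheap to compute in an FPGA, but, as Kapp notes, it does nothing by itself to increase bandwidth per bank.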
Hence, the SKA project is wringing out Micron’s HMC approach as it maps the universe and seeks to determine how galaxies were formed. The resulting daily haul of data underscores what Jim Adams, former NASA deputy chief scientist, called “Big Science.”
The exascale computing requirements of projects such as SKA exceed those of previous planetary missions such as the 2015 New Horizons flyby of Pluto. Adams said it took NASA investigators a year to download all the data collected by New Horizons.
The technical challenges are similar for Earth-bound observatories. "Astronomy is becoming data science," Ratcliffe added.
Micron positions its memory platform as a “compute building block” designed to provide more bandwidth between memory and computing resources while placing processing horsepower as close as possible to data so researchers can access relevant information.
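The benefit of placing processing next to memory can be illustrated with a toy example: compute a summary statistic where the data lives and ship only the result to the host. The function names and the power-sum statistic below are our illustrative assumptions, not Micron's API:

```python
# Toy illustration of the near-data-processing idea: reduce data in place
# and move only the result. Names and the chosen statistic (sum of squared
# amplitudes) are hypothetical, not part of any Micron interface.


def power_near_memory(samples: list) -> float:
    """Sum of squared sample amplitudes, computed 'at the memory'."""
    return sum(s * s for s in samples)


def host_query(channels: dict) -> dict:
    """The host receives one float per channel instead of raw samples."""
    return {ch: power_near_memory(s) for ch, s in channels.items()}
```

Instead of moving every raw sample across the memory bus, the host transfers one number per channel, which is the kind of data-movement reduction the article describes.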
Meanwhile, researchers at the University of Heidelberg are attempting to accelerate adoption of new memory approaches like Micron's through open-source development of a configurable HMC controller that would serve as a memory interface.
Researcher Juri Schmidt noted that the German university's network-attached memory scheme was another step toward pushing computing closer to data by reducing the amount of data movement.
Micron’s Pawlowski noted that the current version of the memory platform is being used to sort and organize SKA data as another way to reduce data movement. The chipmaker is also investigating how to incorporate more logic functionality, including the use of machine learning to train new analytics models.
Computing, memory and, eventually, cloud storage could be combined with Micron's low-power process technology for energy-efficient high-performance computing. While the company for now doesn't view HMC as an all-purpose platform, it would be suited to specific applications such as SKA, Pawlowski noted.
The astronomy initiative will provide a major test for exascale computing since, according to Adams, SKA “is a time machine,” able to look back just beyond the re-ionization period after the Big Bang when galaxies began to form.