October 19, 2010
Datacenter powers research and innovation for UC San Diego and beyond
Oct. 19 -- The San Diego Supercomputer Center (SDSC) this week formally marked its 25th year, highlighting several scientific and technological accomplishments that include assisting researchers in developing new drugs for AIDS and cancer, predicting the impact of earthquakes, and determining the structures of key enzymes to increase the world's food supply.
Established by the National Science Foundation (NSF) under an agreement between UC San Diego and neighboring General Atomics, the center also looked ahead to its next 25 years, with staff and management positioning SDSC as a key resource for data-intensive computing for UC San Diego and the entire UC system, as well as for researchers throughout academia, government, and industry.
"The San Diego Supercomputer Center fills a very important and crucial role, especially in data-intensive computation for all researchers," said UC San Diego Chancellor Marye Anne Fox in commemorating the center's anniversary. As an organized research unit of UC San Diego, SDSC "has transformed how science is done throughout the world."
Fox further noted that SDSC has distinguished itself with an impressive list of scientific accomplishments in its first 25 years, at a time when the entire UC San Diego campus is celebrating its 50th anniversary as one of the nation's top research universities.
Michael Norman, SDSC's director, said that the center's future lies in the convergence of high-performance computing and what he called high-performance data, as all researchers face the daunting task of sorting through and making sense of a "data tsunami" as they conduct their research.
"We are seeing the rise of data-intensive science, and quite frankly SDSC has been pursuing this avenue of scientific inquiry for more than 10 years," Norman said, noting that in addition to a number of new systems, the center is preparing to introduce Gordon, the first data-intensive, flash memory-based supercomputing system that should rank among the top 100 systems in the world after it debuts in mid-2011.
"SDSC has a distinguished history of technical innovation, scientific discovery, and community service, and we are well positioned for data-intensive science for the next 25 years," said Norman, who was named SDSC's third director earlier this year.
SDSC opened its doors as one of the nation's first supercomputer centers, as the U.S. sought to increase its overall investment in computing to support scientific research. Sid Karin, SDSC's founder and first director -- who sent an unsolicited proposal to the NSF to create the facility -- noted that SDSC was among the first centers to provide interactive access to scientists across a wide range of domains, who soon began to realize the value that the first supercomputers could bring to their research.
However, Karin added that SDSC distinguished itself not only with supercomputing hardware that delivered levels of performance called "mind-boggling" at the time, but with its staff. "One of SDSC's greatest impacts was its vast amount of human expertise," he said.
After many years as primarily a nationally funded supercomputer center, SDSC recently strengthened its ties at the local and state levels with UC San Diego and the UC system, becoming a key resource for UC researchers while still serving the larger national scientific community.
"SDSC had to reinvent itself completely," said Frieder Seible, Dean of the UC San Diego Jacobs School of Engineering and chair of SDSC's executive committee, adding that the center's recent "UCSD-centric" approach has created a new level of collaboration that is benefiting many areas of the campus. "I am even more excited about the advances in computational science to be made in the next 25 years, many of them in ways that we cannot even imagine."
In addition to serving as a key resource for UC San Diego, SDSC has for many years been a leader in collaborations at the national and international levels. In 2001, for example, SDSC was named a founding member of the NSF's TeraGrid project, created to support scientific discovery and education through a grid-based cyberinfrastructure that currently includes 10 other supercomputer centers across the country. SDSC this year submitted a proposal to the NSF to lead the next generation of TeraGrid, to be called eXtreme Digital, or XD, which is slated to begin operations next year.
"It is integrating programs such as these that are the vehicles by which people collaborate in scientific research," said Richard Moore, SDSC's deputy director and leader of the center's national systems.
SDSC has also been preparing for the future by focusing on, and becoming a recognized leader in, data management. Almost all of today's data is generated digitally, and the amount of digital information worldwide is expected to grow 10-fold in just the next five years, according to a recent study by International Data Corporation (IDC). In response, SDSC has moved toward becoming a "one-stop" data center, expanding its resources and expertise in all areas of data management, including preservation, storage, portals, analytics, and visualization.
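To put the IDC projection in perspective, tenfold growth over five years implies a compound rate of roughly 58 percent per year. A quick back-of-the-envelope calculation, assuming smooth exponential growth, makes the point:

    # Annual growth rate implied by 10x growth over 5 years,
    # assuming smooth exponential (compound) growth.
    growth_factor = 10 ** (1 / 5)  # fifth root of 10, about 1.585
    print(f"{(growth_factor - 1) * 100:.0f}% per year")  # prints "58% per year"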
"Data is about to hit its tipping point, and we see it as the driver of the Information Age," said Fran Berman, who directed SDSC from 2001 to 2009 and co-chaired a Blue Ribbon Task Force on digital preservation, sustainability, and access. "When we think about the key challenges of our time -- global warming, the economy, health sciences – data is behind every one of them."
As an Organized Research Unit of UC San Diego, SDSC is a national leader in creating and providing cyberinfrastructure for data-intensive research. Cyberinfrastructure refers to an accessible and integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC is a founding member of TeraGrid, the nation's largest open-access scientific discovery infrastructure.