SDSC Launches Comet Supercomputer

October 23, 2015

Oct. 23 — When the San Diego Supercomputer Center (SDSC) launched its first supercomputer, a Cray X-MP/48, in late 1985, it was about as powerful as an iPhone is today. Now SDSC has formally taken the wraps off Comet, a new petascale supercomputer that is over 2 million times more powerful than that first system.

With the ability to perform almost two million billion calculations per second – nearly two petaflops – Comet is designed to transform scientific research by expanding computational access to a larger number of researchers working across a wider range of domains.

“The San Diego Supercomputer Center plays a vital role in fulfilling our vision to solve our world’s most pressing research challenges,” UC San Diego Chancellor Pradeep K. Khosla told a capacity audience at SDSC last week as the Center showcased Comet as part of its 30th anniversary celebration.

Chancellor Khosla spoke about how SDSC has become a national leader in cyberinfrastructure, providing the high-performance computing, storage, networking, and expertise needed to harness the university’s collective research efforts efficiently and seamlessly. He also emphasized SDSC’s role in advancing UC San Diego’s Strategic Plan goals and research endeavors.

“Today, UC San Diego is a $1 billion research enterprise,” noted Khosla, adding that “SDSC has been a trailblazer for academic computing.”

“We are fortunate that SDSC has been one of the national leaders – research-oriented pioneers – in building an advanced cyberinfrastructure ‘nervous system’ for the academic and scientific communities,” said UC San Diego Vice Chancellor for Research Sandra A. Brown, another featured speaker at the event.

The result of a National Science Foundation grant valued at almost $24 million including hardware and operating funds, Comet is designed to meet the needs of what is often referred to as the ‘long tail’ of science – the idea that the large number of modest-sized, computationally based research projects represents, in aggregate, a tremendous amount of research that can yield scientific advances and discovery.

“The launch of Comet marks yet another stage in SDSC’s leadership in the national cyberinfrastructure ecosystem,” said James Kurose, Assistant Director of the NSF’s Computer and Information Science and Engineering (CISE) Directorate, in remarks at the SDSC event. “Through this launch and the extraordinary computing capabilities of SDSC, the center will continue to expand the frontiers of science and engineering, allowing researchers to open new windows into phenomena as vast as the Universe and as small as nanoparticles.”

SDSC Director Michael Norman noted that SDSC’s mission has expanded over its three decades to encompass much more than advanced computation, including a host of services related to the voluminous amount of digitally based information generated daily, and systems designed to analyze, store, and share that data.

“In recent years the research community has moved into a new era of scientific endeavor based on computational science, now best described as data-intensive science,” said Norman. “The term ‘big data’ became the short-hand description for this, or, for academia, ‘data science and engineering.’ This convergence of computational science with data science and engineering rests on an inherent reliance of interdisciplinary collaborations, which is needed to solve the grand research challenges of our times.”

“Comet’s innovative design makes it ideal for supporting a broad range of research and computing modalities. Two distinctive features – support for science gateways and high-performance virtualization – will significantly expand the community of researchers with access to high-performance computing resources,” said SDSC Deputy Director Shawn Strande, who is also Comet’s program manager. “Comet is expected to reach an active research community of over 10,000 users, and is destined to become one of the most productive HPC systems available to the academic research community.”
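
To make the ‘science gateway’ idea concrete: a gateway is a web portal that prepares and submits cluster jobs on a researcher’s behalf, so domain scientists can run large computations without ever touching a command line. The minimal Python sketch below shows the kind of submission step a hypothetical gateway backend might perform; it assumes a Slurm batch scheduler (the sbatch command), and the partition name and resource figures are illustrative placeholders rather than Comet’s documented configuration.

    # Minimal sketch of a science-gateway backend's job-submission step.
    # Assumes a Slurm scheduler; the partition name and resource values
    # are hypothetical placeholders, not Comet's documented configuration.
    import subprocess
    import tempfile

    def submit_gateway_job(app_command: str, cores: int = 24) -> str:
        """Write a one-node batch script, submit it, and return sbatch's reply."""
        script = (
            "#!/bin/bash\n"
            "#SBATCH --partition=compute\n"   # hypothetical partition name
            "#SBATCH --nodes=1\n"
            f"#SBATCH --ntasks-per-node={cores}\n"
            "#SBATCH --time=01:00:00\n"
            f"{app_command}\n"
        )
        with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
            f.write(script)
            path = f.name
        result = subprocess.run(["sbatch", path], capture_output=True,
                                text=True, check=True)
        return result.stdout.strip()  # e.g. "Submitted batch job 123456"

A production gateway layers authentication, accounting, and data staging on top of a step like this; the point of the sketch is only that the researcher never interacts with the scheduler directly.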

In addition to highlighting SDSC’s major milestones in an expanded timeline covering its 30-year history, the Center also announced the launch of a new fund-raising campaign for UC San Diego, in partnership with SDSC, designed to tackle some of the grand research challenges – outlined previously in the campus’ Strategic Plan – facing the State of California and beyond. Details about this new effort will be announced at a later date.

How SDSC’s ‘Comet’ Supercomputer is Serving Science and Society

Comet is configured to help transform advanced computing by expanding access and capacity not only among research domains that typically rely on high-performance computing – such as chemistry and biophysics – but also among domains that are relatively new to supercomputing, such as genomics, finance, and the social sciences. Some of the domains already being served by Comet include:

Astrophysics: Supercomputers can greatly accelerate research into the origins of the universe.

Neurosciences, Brain Research: SDSC’s Neuroscience Gateways project will contribute to the national BRAIN initiative announced by the Obama Administration to deepen our understanding of the human brain.

Social Sciences: Sociologists and political scientists are analyzing newly accessible data sets to study censorship of the press, factors that affect participation in the political process, and the properties of social networks.

Molecular Science: Studying the properties of lipids, proteins, nucleic acids, and small molecules can advance our understanding of biophysical processes at the atomic scale, leading to new drug designs that help reduce disease.

DNA Nanostructures: Conducting nanoscale biomolecular research could lead to low-cost DNA sequencing technologies and, in turn, enable targeted drug delivery systems and help explain the molecular causes of disease.

Alternative Energy Solutions/New Materials Research: Finding new and more efficient approaches to energy harvesting, nanoporous membranes for water desalination, solar thermal fuels, and more.

Turbulent Fluid Physics: Supercomputers can create highly detailed simulations to track ocean currents or to improve industrial methods related to pollutant discharge and oil flow in pipelines.

Climate Change/Environmental Sciences: Modeling atmospheric aerosols, identified as influencing the chemical composition and radiative balance of the troposphere, has direct implications for our climate and public health.

Seismic Research/Disaster Prevention: Keys to hazard management for major earthquakes, hurricanes, and wildfires include the ability to predict a wide range of possibilities. Supercomputer-generated simulations are used to inform decision-making strategies.

The Tree of Life: Biologists construct phylogenetic trees to capture the evolutionary relationships among species and to help us better understand the functions and interactions of genes, the origin and spread of diseases, the co-evolution of hosts and parasites, and the migration of human populations.

Key Features of Comet

  • ~2 petaflops of overall peak performance – nearly two million billion calculations per second (see the back-of-the-envelope sketch following this list).
  • Dell compute nodes using next-generation Intel Xeon processors: 27 racks of compute nodes totaling 1,944 nodes, or 46,656 cores.
  • 128 GB (gigabytes) of DRAM and 320 GB of flash memory per standard compute node.
  • 72 nodes per rack with a full-bisection InfiniBand FDR interconnect within each rack, and 4:1 oversubscription across racks.
  • Additional GPU and large-memory (1.5 TB) nodes for applications such as visualization, molecular dynamics simulations, or de novo genome assembly.
  • 7 PB (petabytes) of Lustre-based high-performance storage from Aeon, plus 6 PB of durable storage for data reliability.
  • First XSEDE production system to support high-performance virtualization.
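
As a back-of-the-envelope check, the node and core counts above do reproduce the quoted ~2 petaflops. The Python sketch below assumes Haswell-class ‘next-generation’ Xeons at 2.5 GHz delivering 16 double-precision operations per core per cycle (two AVX2 fused multiply-add units); neither the clock speed nor the per-cycle figure appears in the article, so treat both as assumptions.

    # Back-of-the-envelope check of Comet's quoted peak performance.
    # Assumed (not stated in the article): 2.5 GHz cores and 16 double-
    # precision flops per core per cycle (two AVX2 FMA units).
    NODES = 1944            # 27 racks x 72 nodes
    CORES_PER_NODE = 24     # 46,656 cores / 1,944 nodes
    CLOCK_HZ = 2.5e9        # assumed core clock
    FLOPS_PER_CYCLE = 16    # assumed per-core throughput

    cores = NODES * CORES_PER_NODE
    peak_pflops = cores * CLOCK_HZ * FLOPS_PER_CYCLE / 1e15
    print(f"{cores:,} cores -> {peak_pflops:.2f} petaflops peak")
    # 46,656 cores -> 1.87 petaflops peak: "almost two million billion
    # calculations per second," consistent with the ~2 PF quoted above.

    dram_tb = NODES * 128 / 1000    # 128 GB of DRAM per standard node
    flash_tb = NODES * 320 / 1000   # 320 GB of flash per standard node
    print(f"aggregate DRAM ~{dram_tb:.0f} TB, flash ~{flash_tb:.0f} TB")
    # ~249 TB of DRAM and ~622 TB of flash across the standard nodes.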

About SDSC

As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s Comet joins the Center’s data-intensive Gordon cluster. SDSC is a partner in XSEDE (eXtreme Science and Engineering Discovery Environment), the most advanced collection of integrated digital resources and services in the world.

Source: SDSC
