SDSC Launches Comet Supercomputer

October 23, 2015

Oct. 23 — When the San Diego Supercomputer Center (SDSC) launched its first supercomputer, a Cray X-MP/48, in late 1985, it was about as powerful as an iPhone is today. Now SDSC has formally taken the wraps off Comet, a new petascale supercomputer that is over 2 million times more powerful than that first system.

With the ability to perform almost two million billion operations or calculations per second, Comet is designed to transform scientific research by expanding computational access to a larger number of researchers working across a wider range of domains.

“The San Diego Supercomputer Center plays a vital role in fulfilling our vision to solve our world’s most pressing research challenges,” UC San Diego Chancellor Pradeep K. Khosla told a capacity audience at SDSC last week as the Center showcased Comet as part of its 30th anniversary celebration.

Chancellor Khosla spoke about how SDSC has become a national leader in cyberinfrastructure, providing the high-performance computing, storage, networking, and expertise needed to harness the university’s collective research efforts efficiently and seamlessly. He also emphasized SDSC’s role in advancing UC San Diego’s Strategic Plan goals and research endeavors.

“Today, UC San Diego is a $1 billion research enterprise,” noted Khosla, adding that “SDSC has been a trail blazer for academic computing.”

“We are fortunate that SDSC has been one of the national leaders – research-oriented pioneers – in building an advanced cyberinfrastructure ‘nervous system’ for the academic and scientific communities,” said UC San Diego Vice Chancellor for Research Sandra A. Brown, another featured speaker at the event.

The result of a National Science Foundation grant valued at almost $24 million, including hardware and operating funds, Comet is designed to meet the needs of what is often referred to as the ‘long tail’ of science – the idea that the large number of modest-sized, computationally based research projects represents, in aggregate, a tremendous amount of research that can yield scientific advances and discovery.

“The launch of Comet marks yet another stage in SDSC’s leadership in the national cyberinfrastructure ecosystem,” said James Kurose, Assistant Director of the NSF’s Computer and Information Science and Engineering (CISE) Directorate, in remarks at the SDSC event. “Through this launch and the extraordinary computing capabilities of SDSC, the center will continue to expand the frontiers of science and engineering, allowing researchers to open new windows into phenomena as vast as the Universe and as small as nanoparticles.”

SDSC Director Michael Norman noted that SDSC’s mission has expanded over its three decades to encompass much more than advanced computation, including a host of services related to the voluminous amount of digitally based information generated daily, and systems designed to analyze, store, and share that data.

“In recent years the research community has moved into a new era of scientific endeavor based on computational science, now best described as data-intensive science,” said Norman. “The term ‘big data’ became the short-hand description for this, or, for academia, ‘data science and engineering.’ This convergence of computational science with data science and engineering rests on an inherent reliance of interdisciplinary collaborations, which is needed to solve the grand research challenges of our times.”

“Comet’s innovative design makes it ideal for supporting a broad range of research and computing modalities. Two distinctive features – support for science gateways and high-performance virtualization – will significantly expand the community of researchers with access to high-performance computing resources,” said SDSC Deputy Director Shawn Strande, who also is Comet’s program manager. “Comet is expected to reach an active research community of over 10,000 users, and is destined to become one of the most productive HPC systems available to the academic research community.”

In addition to highlighting SDSC’s major milestones in an expanded timeline covering its 30-year history, the Center also announced the launch of a new fund-raising campaign for UC San Diego, in partnership with SDSC, designed to tackle some of the grand research challenges facing the State of California and beyond, as outlined previously in the campus’ Strategic Plan. Details about this new effort will be announced at a later date.

How SDSC’s ‘Comet’ Supercomputer is Serving Science and Society

Comet is configured to help transform advanced computing by expanding access and capacity not only among research domains that typically rely on high-performance computing – such as chemistry and biophysics – but also among domains that are relatively new to supercomputing, such as genomics, finance, and the social sciences. Some of the domains already being served by Comet include:

Astrophysics: Supercomputers can greatly accelerate timescales for researching the origins of the universe.

Neurosciences, Brain Research: SDSC’s Neuroscience Gateways project will contribute to the national BRAIN Initiative announced by the Obama Administration to deepen our understanding of the human brain.

Social Sciences: Sociologists and political scientists are analyzing newly accessible data sets to study censorship of the press, factors that affect participation in the political process, and the properties of social networks.

Molecular Science: Studying the properties of lipids, proteins, nucleic acids, and small molecules can advance our understanding of biophysical processes at the atomic scale, leading to new drug designs and reducing disease.

DNA Nanostructures: Conducting nanoscale biomolecular research could lead to low-cost DNA sequencing technologies, and in turn create targeted drug delivery systems and help explain the molecular causes of disease.

Alternative Energy Solutions/New Materials Research: Finding new and more efficient solutions to energy harvesting, nanoporous membranes for water desalinization, solar thermal fuels, and more.

Turbulent Fluid Physics: Supercomputers can create highly detailed simulations to track ocean currents or improve industrial methods related to the discharge of pollutants or oil flow in pipelines.

Climate Change/Environmental Sciences: Modeling atmospheric aerosols, identified as influencing the chemical composition and radiative balance of the troposphere, has direct implications for our climate and public health.

Seismic Research/Disaster Prevention: Keys to hazard management for major earthquakes, hurricanes, and wildfires include the ability to predict a wide range of possibilities. Supercomputer-generated simulations are used to inform decision-making strategies.

The Tree of Life: Biologists construct phylogenetic trees to capture the evolutionary relationship between species, and help us better understand the functions and interactions of genes, the origin and spread of diseases, the co-evolution of hosts and parasites, and migration of human populations.

Key Features of Comet

  • ~2 petaflops of overall peak performance – nearly two million billion operations or calculations per second (see the back-of-the-envelope estimate after this list).
  • Dell compute nodes using next-generation Intel Xeon processors, 27 racks of compute nodes totaling 1,944 nodes or 46,656 cores.
  • 128 GB (gigabytes) of DRAM and 320 GB of flash memory per standard compute node
  • 72 nodes per rack with full bisection-bandwidth InfiniBand FDR interconnect within each rack, and 4:1 oversubscription on the cross-rack interconnect
  • Additional GPU and large-memory (1.5 Terabytes) nodes for applications such as visualization, molecular dynamics simulations, or de novo genome assembly
  • 7 PB (petabytes) of Lustre-based high-performance storage from Aeon, and 6 PB of durable storage for data reliability
  • First XSEDE production system to support high-performance virtualization
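The headline peak figure is consistent with the node and core counts listed above. The sketch below is a back-of-the-envelope check, not an official SDSC calculation: the per-core clock rate and the 16 double-precision floating-point operations per cycle (AVX2 fused multiply-add) are assumptions typical of Haswell-generation Intel Xeon processors of that era and are not stated in the announcement.

```python
# Rough peak-performance estimate for Comet's standard compute nodes.
# Node and core counts come from the feature list above; the clock rate and
# FLOPs-per-cycle figures are assumptions, not published SDSC specifications.

nodes = 1944              # 27 racks x 72 nodes per rack
cores_per_node = 24       # 46,656 cores / 1,944 nodes
clock_hz = 2.5e9          # assumed ~2.5 GHz per core
flops_per_cycle = 16      # assumed AVX2 fused multiply-add, double precision

peak_flops = nodes * cores_per_node * clock_hz * flops_per_cycle
print(f"Estimated peak: {peak_flops / 1e15:.2f} petaflops")  # ~1.87 PF, i.e. roughly 2 petaflops
```

The GPU and large-memory nodes add capability beyond this standard-node estimate, which is why the overall system figure is quoted as approximately 2 petaflops.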

About SDSC

As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s Comet joins the Center’s data-intensive Gordon cluster. SDSC is a partner in XSEDE (eXtreme Science and Engineering Discovery Environment), the most advanced collection of integrated digital resources and services in the world.

Source: SDSC
