Cancer Research: A Supercomputing Perspective

By Aaron Dubrow

May 31, 2017

Cancer, the second-leading cause of death in the U.S. after heart disease, kills more than 500,000 Americans per year, including about 2,000 children.

In 2016, then Vice President Joe Biden launched the Cancer Moonshot, saying: “I know that we can help solidify a genuine global commitment to end cancer as we know it today — and inspire a new generation of scientists to pursue new discoveries and the bounds of human endeavor.”

The importance of high performance computing (HPC) in cancer research was recognized by the Cancer Moonshot Task Force report, and by Biden and then Energy Secretary Ernest Moniz.

“Supercomputers are key to the Cancer Moonshot,” Moniz wrote. “These exceptionally high-powered machines have the potential to greatly accelerate the development of cancer therapies by finding patterns in massive datasets too large for human analysis. Supercomputers can help us better understand the complexity of cancer development, identify novel and effective treatments, and help elucidate patterns in vast and complex data sets that advance our understanding of cancer.”

With complex, non-linear signaling networks, multiscale dynamics from the quantum to the macro level, and giant, complex datasets of patient responses, cancer is quite possibly the ultimate in HPC problems.

“What could be more complicated and more important?” said J. Tinsley Oden, a computational researcher at The University of Texas at Austin applying uncertainty quantification to cancer treatment predictions. “At each step, it has the most complex features. It is really a garden of rich, important problems that are in the path of many of the developments that we’ve been working on for years.”

Infographic: TACC’s multi-domain approach to fighting cancer.

Hundreds of oncologists, biologists and computer scientists use the HPC systems at the Texas Advanced Computing Center (TACC) to understand the fundamental nature of cancer biology and to improve cancer treatments. Their work addresses a range of cancer types and treatment modalities, and spans both applied and fundamental research.

Though diverse in their specific targets, the approaches they use can be loosely grouped into seven broad methodologies: molecular simulation; bioinformatics; mathematical modeling; quantum calculation; computational treatment planning; computational trial design; and machine learning. The following sections describe and provide examples of each.

Molecular Simulations

Simulating protein and drug interactions at the molecular level enables scientists to understand the mechanics of cancer to design more effective treatments.

For Rommie Amaro, professor of Chemistry and Biochemistry at the University of California, San Diego, this means uncovering new pockets in tumor protein 53 (p53) — “the guardian of the genome” — which plays a crucial role in conserving the stability of DNA and preventing mutations.

The model of full-length p53 protein bound to DNA as a tetramer. The surface of each p53 monomer is depicted with a different color. [Courtesy: Özlem Demir, University of California, San Diego]
In approximately 50 percent of all human cancers, p53 is mutated and rendered inactive. Reactivating mutant p53 with small molecules has therefore been a long-sought anticancer therapeutic strategy.

In September 2016, writing in the journal Oncogene, Amaro reported results of the largest atomic-level simulation of p53 to date — comprising more than 1.5 million atoms. The simulations, enabled by the Stampede supercomputer at TACC, helped identify new binding sites on the surface of the protein that could potentially reactivate p53.

“When most people think about cancer research they probably don’t think about computers,” she said. “But biophysical models are getting to the point where they have a great impact on the science.”

Virtual drug screening is another important HPC application for cancer research. Shuxing Zhang, professor of experimental therapeutics at MD Anderson Cancer Center, used molecular simulations on TACC’s Lonestar5 system to screen 1,448 Food and Drug Administration-approved small molecule drugs to determine which had the molecular features needed to bind and inhibit TNIK — an enzyme that plays a key role in cell signaling in colon cancer.

Zhang discovered that mebendazole, an FDA-approved drug that fights parasites, could effectively bind to TNIK and inhibit its enzymatic activity. He reported his results in Nature Scientific Reports in September 2016.

“Such advantages render the possibility of quickly translating the discovery into a clinical setting for cancer treatment in the near future,” Zhang wrote.
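The ranking step of a virtual screen like Zhang’s can be sketched in a few lines. This is a hedged illustration only: the drug names beyond mebendazole, the docking scores, and the cutoff are all invented, and a real screen would compute binding energies with a docking engine distributed across many HPC nodes.

```python
# Hypothetical sketch of the ranking step in a structure-based virtual screen.
# Docking scores (kcal/mol; more negative = stronger predicted binding) are
# invented for illustration.

def rank_candidates(scores, cutoff=-8.0):
    """Return drugs whose predicted binding energy beats the cutoff,
    strongest binders first."""
    hits = [(drug, s) for drug, s in scores.items() if s <= cutoff]
    return sorted(hits, key=lambda pair: pair[1])

docking_scores = {          # drug name -> predicted binding energy
    "mebendazole": -9.2,    # hypothetical value
    "aspirin": -5.1,
    "metformin": -4.3,
    "albendazole": -8.4,
}

top_hits = rank_candidates(docking_scores)
print(top_hits)  # mebendazole ranks first
```

The computational burden in practice lies in producing the scores, not ranking them — each of the 1,448 candidates must be docked against the target in many poses.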

Bioinformatics

The human genome consists of three billion base pairs, so identifying single mutations by sight simply isn’t possible. For that reason, the field of bioinformatics — which uses computing and software to identify patterns and differences in biological data — has been an enormous boon for cancer researchers.
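At its simplest, variant detection is a base-by-base comparison against a reference. The toy below assumes pre-aligned sequences of equal length; real pipelines first align billions of short sequencing reads and weigh base-call quality before reporting a variant.

```python
# Minimal sketch of single-nucleotide variant (SNV) detection by comparing a
# sample sequence against a reference, assuming both are already aligned.

def find_snvs(reference, sample):
    """Return (position, ref_base, sample_base) for every mismatch."""
    assert len(reference) == len(sample)
    return [(i, r, s)
            for i, (r, s) in enumerate(zip(reference, sample))
            if r != s]

ref    = "ACGTTGCAACGT"
sample = "ACGTTGCTACGT"        # one substitution at position 7
print(find_snvs(ref, sample))  # [(7, 'A', 'T')]
```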

But bioinformatics is more than simple, one-to-one pattern matching.

A heat map showing differences in gene expression between primary tumors and cultured cell lines. Each row is a gene and each column is a tumor or cell sample. In the heat map, red indicates high expression and blue indicates low expression. NHA refers to normal human astrocytes, a star-shaped glial cell of the central nervous system. [Courtesy: Amelia Weber Hall, Iyer lab]
“When you move into multi-dimensional, time-series, or population-level studies, the algorithms can get a lot more computationally intensive,” said Matt Vaughn, TACC’s Director of Life Sciences Computing. “This requires resources like those at TACC, which help large numbers of researchers explore the complexity of cancer genomes by providing elastic, large-scale computing capability.”

For Vishy Iyer, a molecular biologist at The University of Texas at Austin (UT Austin), and his collaborators, access to TACC’s Stampede supercomputer helps them mine reams of data from The Cancer Genome Atlas to identify genetic variants and subtle correlations that affect gene expression in tumors.

“TACC has been vital to our analysis of cancer genomics data, both for providing the necessary computational power and the security needed for handling sensitive patient genomic datasets,” Iyer said.

In February 2016, Iyer and a team of researchers from UT Austin and MD Anderson Cancer Center reported in Nature Communications on a genome-wide transcriptome analysis of the two types of cells that make up the prostate gland. They identified cell-type-specific gene signatures that were associated with aggressive subtypes of prostate cancer and adverse clinical responses.

“This knowledge can be helpful in the development of more targeted therapies that seek to eliminate cancer at its origin,” Iyer said.

Using a similar methodology, Iyer and a team of researchers from UT Austin and the National Cancer Institute identified a transcription factor associated with an aggressive type of lymphoma that is highly correlated with poor therapeutic outcomes. They published their results in the Proceedings of the National Academy of Sciences in January 2016.

Whereas Iyer, an experienced HPC user, develops custom tools for his analyses, a much larger number of researchers access Stampede and comparable systems through scientific gateways. One prominent gateway is Galaxy, an open source bioinformatics platform that serves 30,000 researchers and runs more than 3,000 compute jobs a day.

Since 2014, TACC has powered the data analyses for a large percentage of Galaxy users, allowing researchers to solve tough problems in cases where their personal computer or campus cluster is not sufficient. Of those researchers, a significant subset use the site to analyze cancer genomes.

“Galaxy can be used to identify tumor mutations that drive cancer growth, find proteins that are overexpressed in a tumor, as well as for chemo-informatics and drug discovery,” said Jeremy Goecks, Assistant Professor of Biomedical Engineering and Computational Biology at Oregon Health and Science University and one of Galaxy’s principal investigators.

Goecks estimates that hundreds of researchers each year use the platform for cancer research, himself included. Because cancer patient data is closely protected, the bulk of this usage involves either publicly available cancer data, or data on cancer cell lines – immortalized cells that reproduce in the lab and are used to study how cancer reacts to different drugs or conditions.

“This is an ideal marriage of TACC having tremendous computing power with scalable architecture and Galaxy coming along and saying, we’re going to go the last mile and make sure that people who can’t normally use this hardware are able to,” Goecks said.

Mathematical Modeling

While some researchers believe bioinformatics will rapidly advance the understanding and treatment of cancer, others think a better approach is to mathematize cancer: to uncover the fundamental formulas that represent how cancer, in its varied forms, behaves.

At the Center for Computational Oncology at UT Austin, researchers are developing complex computer models to predict how cancer will progress in a specific individual.

Each factor involved in the tumor response — whether it is the speed with which chemotherapeutic drugs reach the tissue or the degree to which cells signal each other to grow — is characterized by a mathematical equation that captures its essence. These models are combined, parameterized, and initialized with patient-specific data.

In April 2017, writing in the Journal of The Royal Society Interface, Thomas Yankeelov and collaborators at UT Austin and Vanderbilt University showed that they can predict how brain tumors (gliomas) will grow in mice with greater accuracy than previous models by including factors like the mechanical forces acting on the cells and the tumor’s cellular heterogeneity.

To develop and implement their mathematically complex models, the center’s scientists use TACC’s supercomputers, which enable them to solve bigger problems than they otherwise could and reach solutions far faster.

Recently, the group has begun a clinical study to predict, after one treatment, how an individual’s cancer will progress, and use those predictions to plan the future course of treatment.

“There are not enough resources or patients to sort this problem out because there are too many variables. It would take until the end of time,” Yankeelov said. “But if you have a model that can recapitulate how tumors grow and respond to therapy, then it becomes a classic engineering optimization problem. ‘I have this much drug and this much time. What’s the best way to give it to minimize the number of tumor cells for the longest amount of time?’”
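The optimization problem Yankeelov describes can be sketched with a toy growth model. This is not the center’s actual model: the logistic growth law, the dose-proportional kill term, and every parameter below are invented for illustration, but the structure — simulate each candidate schedule, keep the one that minimizes tumor burden — is the engineering-optimization framing he describes.

```python
# Hedged sketch: logistic tumor growth with a dose-proportional kill term,
# searched over ways to split a fixed drug budget across treatment days.
# All parameters are invented for illustration.

def tumor_cells(doses, n0=1e6, k=1e9, growth=0.2, kill=0.5, days=30):
    """Simulate daily logistic growth plus drug kill; return final cell count."""
    n = n0
    for day in range(days):
        n += growth * n * (1 - n / k)        # logistic growth toward capacity k
        n -= kill * doses.get(day, 0.0) * n  # fraction killed by that day's dose
        n = max(n, 0.0)
    return n

budget = 1.0  # total drug available (arbitrary units)
schedules = {
    "single early dose": {0: budget},
    "two split doses":   {0: budget / 2, 15: budget / 2},
    "four split doses":  {d: budget / 4 for d in (0, 8, 16, 24)},
}
best = min(schedules, key=lambda name: tumor_cells(schedules[name]))
print(best, tumor_cells(schedules[best]))
```

A real patient-specific model replaces the toy growth law with coupled partial differential equations, and the three-schedule grid with a high-dimensional optimizer — which is where the supercomputer comes in.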

Computing at TACC helps Yankeelov accelerate his research. “We can solve problems in a few minutes that would take us three weeks to do using the resources at our old institution,” he said. “It’s phenomenal.”

Quantum Calculations

X-ray radiation is the most frequently used form of radiation therapy, but a new treatment is emerging that uses a beam of protons to kill cancer cells with minimal damage to surrounding tissue.

“As happens in cancer therapy, we know empirically that it works, but we don’t know why,” said Jorge A. Morales, a professor of chemistry at Texas Tech University and a leading proponent of the computational analysis of proton therapy. “To do experiments with human subjects is dangerous, so the best way is through computer simulation.”

Computational experiments can mimic the dynamics of the proton-cell interactions without causing damage to a patient and can reveal what happens when the proton beam and cells collide from start to finish, with atomic-level accuracy. Morales has been simulating proton-cell chemical reactions using quantum dynamics models on TACC’s Stampede supercomputer to investigate the fundamentals of the process.

His studies, reported in PLOS One in March 2017, as well as in Molecular Physics, and Chemical Physics Letters (2015 and 2014 respectively), have determined the basic byproducts of protons colliding with water within the cell, and with nucleotides and clusters of DNA bases – the basic units of DNA. The studies shed light on how the protons and their water radiolysis products damage DNA.

Though fundamental in nature, the insights and data that Morales’ simulations produce help researchers understand proton cancer therapy at the quantum level, and help modulate factors like dosage and beam direction.

“These simulations will bring about a unique way to understand and control proton cancer therapy that, at a very low cost, will help to drastically improve the treatment of cancer patients without risking human subjects,” Morales said.

Computational Treatment Planning

Wei Liu, a researcher at the Mayo Clinic, also studies proton therapy, but he looks at the treatment from a clinical perspective.

In comparison with current radiation procedures, proton therapy saves healthy tissue in front of and behind the tumor. It is particularly effective when irradiating tumors near sensitive organs where stray beams can be particularly damaging.

However, the pinpoint accuracy required by the proton beam, which is its greatest advantage, means that it must be precisely calibrated and that discrepancies from the ideal (whether from the device, human error, or even patient breathing) must be taken into consideration.

Writing in Medical Physics in January 2017, Liu and his collaborators showed that their “chance-constrained model” was better at sparing organs at risk than current methods.

“Each time, we try to mathematically generate a good plan,” he said. “There are 25,000 variables or more, so generating a plan that is robust to these mistakes and can still get the proper dose distribution to the tumor is a large-scale optimization problem.”
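The structure of this optimization can be shown on a deliberately tiny example. The sketch below is invented for illustration — two beamlets instead of Liu’s 25,000-plus variables, one tumor voxel, one organ-at-risk voxel, and no robustness terms — but it captures the core trade-off: deliver the prescribed dose to the tumor while minimizing dose to the organ.

```python
# Toy treatment-planning optimization: choose beamlet intensities so the tumor
# gets its prescribed dose while the organ at risk gets as little as possible.
# The influence values and prescription are invented for illustration.

# dose delivered per unit intensity: (tumor voxel, organ voxel)
influence = {
    "beamlet_a": (1.0, 0.6),   # hits tumor, grazes the organ
    "beamlet_b": (1.0, 0.1),   # hits tumor, mostly misses the organ
}
prescription = 10.0  # required tumor dose

def plan_cost(weights):
    tumor = sum(w * influence[b][0] for b, w in weights.items())
    organ = sum(w * influence[b][1] for b, w in weights.items())
    if tumor < prescription:   # infeasible: tumor underdosed
        return float("inf")
    return organ               # feasible: cost is the organ dose

# brute-force search over a coarse grid of intensities
best = min(
    ({"beamlet_a": a, "beamlet_b": b}
     for a in range(0, 11) for b in range(0, 11)),
    key=plan_cost,
)
print(best, plan_cost(best))  # favors the beamlet that spares the organ
```

With tens of thousands of beamlets and uncertainty scenarios folded in, brute force is hopeless; clinical solvers use large-scale constrained optimization, which is why the planning runs on a supercomputer.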

The researchers used the Lonestar5 supercomputer at TACC to generate treatment plans that minimize the risk and uncertainties involved in proton beam therapy.

“It’s very computationally expensive to generate a plan in a reasonable timeframe,” he continued. “Without a supercomputer, we can do nothing.”

Computational Trial Design

Another way researchers use TACC’s advanced computers is to design clinical trials that can better determine which combination of dosages will be most effective, specifically for the biological agents used in immunotherapy, which work very differently from chemotherapy and radiation.

Writing in the Journal of the Royal Statistical Society, Series C (Applied Statistics), Chunyan Cai, assistant professor of biostatistics at McGovern Medical School at The University of Texas Health Science Center at Houston (UTHealth), described her efforts using Lonestar5 to identify biologically optimal dose combinations for agents that target the PI3K/AKT/mTOR signaling pathway, which has been associated with several genetic aberrations related to the promotion of cancer.

Scanning electron micrograph of a human T lymphocyte (also called a T cell) from the immune system of a healthy donor. Immunotherapy fights cancer by supercharging the immune system’s natural defenses (including T cells) or contributing additional immune elements that can help the body kill cancer cells. HPC is helping researchers better understand how immunotherapeutic agents can be used effectively. [Courtesy: NIAID]
“Our research is motivated by a drug combination trial at MD Anderson Cancer Center for patients diagnosed with relapsed lymphoma,” Cai said. “The trial combined two novel biological agents that target two different components in the PI3K/AKT/mTOR signaling pathway.”

They investigated six different dose-toxicity and dose-efficacy scenarios and carried out 2,000 simulated trials for each of the designs.

Based on those simulations, she concluded that “the design proposed has desirable operating characteristics in identifying the biologically optimal dose combination under various patterns of dose–toxicity and dose–efficacy relationships.”
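The logic of evaluating a design by simulated trials can be sketched as follows. This is a simplified stand-in for Cai’s method, not a reproduction of it: the dose combinations, the assumed true toxicity and efficacy probabilities, the toxicity limit, and the selection rule are all invented for illustration.

```python
# Hedged sketch of design evaluation by simulated trials: assume true
# (toxicity, efficacy) probabilities per dose combination, run many virtual
# trials, and count how often the design selects a safe combination.
import random

random.seed(0)

profiles = {                      # combo -> (P(toxicity), P(efficacy)), invented
    ("low", "low"):   (0.05, 0.20),
    ("low", "high"):  (0.15, 0.40),
    ("high", "low"):  (0.20, 0.45),
    ("high", "high"): (0.45, 0.60),
}
MAX_TOX = 0.30                    # maximum acceptable toxicity rate

def run_trial(n_per_arm=30):
    """One virtual trial: observe outcomes in each arm, then pick the
    acceptable combo with the best observed efficacy."""
    observed = {}
    for combo, (p_tox, p_eff) in profiles.items():
        tox = sum(random.random() < p_tox for _ in range(n_per_arm)) / n_per_arm
        eff = sum(random.random() < p_eff for _ in range(n_per_arm)) / n_per_arm
        observed[combo] = (tox, eff)
    acceptable = [c for c, (tox, _) in observed.items() if tox <= MAX_TOX]
    return max(acceptable, key=lambda c: observed[c][1], default=None)

picks = [run_trial() for _ in range(2000)]
good = sum(p is not None and profiles[p][0] <= MAX_TOX for p in picks)
print(f"{good / len(picks):.1%} of simulated trials select a safe combination")
```

Repeating this over many assumed dose–toxicity and dose–efficacy scenarios, as Cai did, is what makes the workload large enough to need a supercomputer.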

The research is leading to new, safer and more effective ways to test combinations of immunotherapeutic agents.

Machine Learning

A final, and truly radical, way that researchers are using HPC for cancer research is through the application of machine and deep learning.

The Eberlin research group at UT Austin develops clinical applications of ambient mass spectrometry for cancer diagnosis. They create tools and techniques to assist surgeons in distinguishing between normal and cancer tissue during tumor resection operations.

To do so, they have had to develop statistical methods that can analyze and interpret large amounts of mass spectrometry data gathered from clinical samples.

Jonathan Young, a postdoctoral researcher in the group, is building machine learning classifiers to reliably predict whether a given tissue sample is cancerous or normal, and if it is indeed cancer, which specific subtype the tumor belongs to.

Young uses the Maverick system at TACC, which contains a large number of NVIDIA GPUs, to develop and implement the machine learning algorithms. “The large memory capacity of Maverick is well suited for our extensive datasets, and the parallelization capability will aid in parameter sweeps during the training of classifiers,” Young said.
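The classification task can be illustrated with a deliberately small stand-in. The nearest-centroid classifier and four-peak “spectra” below are invented for illustration — the group’s actual models and features are far richer — but the workflow is the same: learn a summary of each class from labeled samples, then assign a new spectrum to the closest class.

```python
# Minimal sketch of tissue classification from spectral features using a
# nearest-centroid classifier. Training "spectra" are invented intensity
# vectors, not real mass-spectrometry data.

def centroid(spectra):
    """Average a list of equal-length intensity vectors."""
    n = len(spectra)
    return [sum(vals) / n for vals in zip(*spectra)]

def classify(sample, centroids):
    """Assign the label whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

training = {
    "normal": [[0.9, 0.1, 0.2, 0.8], [0.8, 0.2, 0.1, 0.9]],
    "cancer": [[0.1, 0.9, 0.8, 0.2], [0.2, 0.8, 0.9, 0.1]],
}
centroids = {label: centroid(spectra) for label, spectra in training.items()}

print(classify([0.15, 0.85, 0.8, 0.2], centroids))  # → cancer
```

Training real classifiers means sweeping over model parameters across thousands of high-dimensional spectra — the parallel, GPU-heavy workload Young runs on Maverick.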

Young will present his work at the American Society for Mass Spectrometry (ASMS) Annual Conference this June.

Another example of the application of machine learning to cancer can be found in the work of Daniel Lobo, an assistant professor of biology and computer science at the University of Maryland, Baltimore County (UMBC). He is using machine learning to map out the cellular communication networks that underlie cancer, and to design methods to disrupt them.

In their January 2017 paper in Scientific Reports, Lobo and collaborators showed that machine learning can uncover the cellular networks that determine pigmentation in tadpoles and reverse-engineer never-before-seen coloration. Their work was facilitated by Stampede, which enabled the team to run billions of simulations to identify models of the cellular network and the means of altering it.

Lobo’s lab is applying the method to cancer research to determine what type of interventions might stop metastasis in its tracks without damaging other cells.

“Traditional approaches like chemotherapy attack the cells that grow the most, but leave cells that are signaling others to grow and that may be the most important,” Lobo says. “We’re using machine learning to find out the communication networks between these cells and hopefully to discover a treatment that can cause the tumor to collapse.”

“Getting a true understanding, given the complexity of the information, without some assistance from machine learning, is probably hopeless,” said Michael Levin, Lobo’s collaborator. “I think it’s inevitable that we use machine learning to enrich scientific and biomedical discovery.”

From patient-specific treatments to immunology to drug discovery, advanced computing is accelerating the basic and applied science underlying our understanding of cancer and the development and application of cancer treatments.

If scientists are the rocket in the cancer moonshot, HPC processing power is the jet fuel.

About the Author

Aaron Dubrow joined TACC in October 2007 as the Science and Technology Writer, with responsibility for reporting on the myriad research and development projects undertaken by TACC.
