Rick Stevens: Connecting Computing to Science

By Michael Feldman

March 9, 2007

Perhaps the two most important technologies of the 21st century will be information technology and biotechnology. Certainly they have become the most rapidly expanding domains of our era. Advances in devices such as microarray biochips, medical imaging and mass spectrometers have created a wealth of biological data to be analyzed. The result is that, increasingly, biological problems require large-scale computing. In a sense, life science has become a sub-domain of information science.

Expressions such as bioinformatics, computational biology and systems biology are being used to describe this new integration. And research organizations are actively exploring problems within the intersection of biology and computer science.

At the Department of Energy's (DOE) Argonne National Laboratory, the Computing and Life Sciences (CLS) directorate is attempting to synergize these two technologies to fulfill the department's mission. At Argonne, the integration of computational science with systems biology is designed to help build basic scientific knowledge, solve environmental problems related to energy production, and develop and manage new energy sources.

Heading the CLS directorate is Rick Stevens, a man who seems perfectly suited for the type of interdisciplinary work that the organization is doing. There, he is able to indulge his deep interests in algorithms, math and science, especially biological science.

As a scientist, Stevens is hard to categorize. In fact, he himself is not a great believer in distinct scientific disciplines. According to him, calling yourself a biologist, a chemist or a computer scientist is just a way people self-identify with a community. But these disciplines have become a rather artificial way to view the world. There are just people and problems, he says.

“I've always been interested in trying to connect computing to science,” says Stevens. “But I'm not that interested in computing for computing's sake.”

As a kid, Stevens was very much attracted to computing as it was portrayed on Star Trek. In the 23rd century, computers were tools for doing exciting things, like computing wormhole trajectories. In the 21st century, we'll have to be satisfied with sub-warp applications. But that still leaves plenty to do.

According to Stevens, putting biology and computing under the same lab directorate is a kind of experiment. By forging these cross-cultural relationships, they want to see whether the whole is greater than the sum of its parts. Since Stevens is personally aligned with this intersection of computing and biology, to him the challenges are not only some of the most interesting problems in the world, but also just great fun.

As one might imagine, the life of the head of a DOE lab directorate can be rather intense. It's not unusual for Stevens to be up at 5:00 AM. At that ungodly hour, he tries to pound out a little code, which he mostly writes in C, Perl, Python or Mathematica. He says he's also learning a little UPC.

“If I spend a couple of hours in the morning writing code, I'm a much more cheerful person the rest of the day,” notes Stevens.

He spends the remainder of the day managing the lab: cheerleading the staff, working with the funding agencies, and planning the direction of the lab work. He tries to reserve some time for himself to reflect on the big picture and think about the future.

But when things get onerous at the lab, Stevens retreats to his other job — as a professor of computer science at the University of Chicago. There you'll find him working with his five PhD students. With one exception they are all working on projects in computational biology.

Stevens seems to get the most out of both his roles. He says Argonne is a great place to get things done. It's a very high energy, very focused environment, and the people are extremely supportive. “We think of it as a cross between a university and a start-up company,” he says. On the other hand, he also enjoys the teaching culture and more free-wheeling atmosphere of the university. There, he's able to wander off and follow his interests, wherever they take him.

But one of the big advantages of working for the DOE is the access to big iron. As one of the department's leadership computing centers, Argonne is on a select list to receive the latest cutting-edge supercomputers. It is expected to get a 100 teraflop IBM Blue Gene machine sometime this year. In 2008, the lab is looking to deploy a 500 teraflop system. Stevens says the lab is on a trajectory to get a sustained petaflop and even beyond.

High-end capability supercomputing systems for life sciences have traditionally focused on biochemical modeling at the level of atoms and molecules. But, according to Stevens, that misses the complexity of the organism and interactions of the ecosystem. Lately, he has become interested in applying petascale power to systems biology problems. For example, modeling microbial soil habitats is a vast computational undertaking, but promises to help us understand one of the most complex and important ecosystems on the planet.
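
As a tiny illustration of the kind of computation involved, the sketch below simulates a toy soil patch: a single microbial population on a small grid, feeding on a nutrient that diffuses between cells. It is a hypothetical example written for this article — the grid size, the Monod uptake term and every rate constant are invented — and a real soil-habitat model would track many species, chemical cycles and spatial scales.

    # Toy soil-patch model: one microbial population growing on a diffusing
    # nutrient. Purely illustrative; all parameters are invented.
    import numpy as np

    N, steps = 64, 500
    nutrient = np.full((N, N), 1.0)            # nutrient concentration per cell
    microbes = np.random.rand(N, N) * 0.01     # initial biomass per cell
    uptake, biomass_yield, death, diffusion = 0.1, 0.5, 0.02, 0.05

    def laplacian(field):
        # Discrete Laplacian with periodic boundaries, used for nutrient diffusion.
        return (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)

    for _ in range(steps):
        eaten = uptake * microbes * nutrient / (0.3 + nutrient)   # Monod-style uptake
        nutrient += diffusion * laplacian(nutrient) - eaten
        microbes += biomass_yield * eaten - death * microbes

    print(f"Total biomass after {steps} steps: {microbes.sum():.2f}")

Scaling a sketch like this to realistic resolution, thousands of interacting species and full soil chemistry is what pushes such problems toward petascale machines.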

Another promising use of petascale systems involves building models of cells that incorporate genetic information. This will allow scientists to predict a cell's response to different environments and substrates, and to pose computational what-if questions to understand design tradeoffs in natural or man-made biological systems. For example, this type of application could be used to model highly efficient ethanol-producing microorganisms for different nutritional substrates.
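
A minimal sketch of that what-if workflow might look like the following: a toy batch-fermentation model with simple Monod growth kinetics, run against two candidate substrates to compare ethanol output. The organism parameters and substrate labels are made up for illustration and are not drawn from any real model.

    # Toy "what-if" screen: compare ethanol output of a hypothetical microbe
    # grown on two different substrates. All parameters are invented.

    def ferment(substrate0, mu_max, k_s, yield_biomass, yield_ethanol,
                hours=48.0, dt=0.01):
        # Integrate biomass X, substrate S and ethanol P with Monod growth.
        X, S, P = 0.1, substrate0, 0.0
        for _ in range(int(hours / dt)):
            mu = mu_max * S / (k_s + S)          # specific growth rate
            growth = mu * X * dt
            X += growth
            S = max(S - growth / yield_biomass, 0.0)
            P += yield_ethanol * growth          # growth-coupled ethanol formation
        return P

    scenarios = {"substrate A": (0.40, 0.5, 0.45, 0.40),
                 "substrate B": (0.20, 1.0, 0.40, 0.35)}
    for name, params in scenarios.items():
        print(f"{name}: ~{ferment(50.0, *params):.1f} g/L ethanol")

The genome-scale models Stevens describes would replace these hand-picked rate constants with behavior derived from the cell's genetic and metabolic networks, which is where the heavy computation comes in.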

“To understand the dynamics of how something works, you have to execute a simulation on a computer,” explains Stevens. “There's no other way to do it. So in many ways, doing theory in biology is going to be equivalent to doing these complex simulations. That's an insight that is just starting to hit lots of people.”

The computing power required to pursue some of these problems already exists today. As teraflop systems become available to more people, the opportunity for scientists to do interesting systems biology is exploding. While the hardware continues to become more accessible, building the models is the hard part.

“We don't have enough people with a background in computing and mathematics and, at the same time, with a background in biology, to actually wire these two things together,” he says. “Most bioinformatics programs are too superficial. Because of that we have a lack of models.”

The lack of expertise in computational biology may be holding back the field, but futurist and inventor Ray Kurzweil probably considers that problem just background noise. If there's anyone more bullish than Rick Stevens on the potential of computer science and biology, it's Kurzweil. His notion of the “Singularity” is the predicted outcome of merging immense computational power with human beings, precipitating “a technological change so rapid and profound it represents a rupture in the fabric of human history.” Not surprisingly, Kurzweil's prediction of a transhumanist world draws its share of controversy.

“Ray has a hugely optimistic vision of where humanity could go,” says Stevens. “What he's trying to say is that the future could be so unbelievably cool that we should all want to get there. He's thinking in terms of exponentials and trying to understand the effects of extrapolation. The question is: How good are we at predicting the outcome of complex questions where there are underlying exponential drivers, like Moore's Law?”

“Is there merit to this view of the world?” continues Stevens. “Well, probably. It's well understood that people have a hard time thinking in exponentials. This is a classical futurist viewpoint, whether it is understanding population increases, global warming, pollution or whatever. People are very bad at making predictions. They tend to overestimate in the near term and underestimate in the long term. What this means is that Ray could very well be correct that in 10 to 20 years, the convergence of these underlying technologies will enable many, many things to be different. Now, if he just simply said that, I don't think anyone would disagree.”
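
A bit of back-of-the-envelope arithmetic shows why that intuition fails. Assuming a Moore's-Law-style doubling roughly every two years — the only exponential driver named above — the gap between a linear guess and the exponential reality widens dramatically over the 10-to-20-year horizon Stevens mentions:

    # Illustrative arithmetic only: linear intuition vs. exponential growth,
    # assuming a doubling roughly every two years.
    doubling_period_years = 2
    for years in (5, 10, 20):
        exponential = 2 ** (years / doubling_period_years)
        linear = 1 + years          # the "a little more each year" intuition
        print(f"{years:>2} years: linear guess ~{linear}x, exponential ~{exponential:,.0f}x")

Over five years the two estimates nearly agree; over twenty, the exponential outruns the linear guess by roughly a factor of fifty, which is the long-term underestimation Stevens is describing.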

The fact is, Kurzweil is making a more precise prediction: achieving the Singularity in 2045. According to Stevens, that's where he gets sort of quasi-theological. What Kurzweil is essentially arguing is that the technological juggernaut will take us to this brave new world regardless of the specific technologies in play. In other words, it's not a function of Moore's Law, network bandwidth, storage capacity, bioimaging technology, microarray chips or any number of rapidly growing technologies; it's the exponential rate of technology itself.

Stevens sums it up as follows: “To solve problems you need three things — time, money and ideas. If you have two, you can compensate for the other one. Kurzweil collapses time and money because of exponential processes. What's left are the ideas.”
