Let’s Remember: The Noun is ‘Science’

By Dr. Robert Panoff, Shodor Foundation

July 1, 2005

For several weeks, I have been trying to digest the recent report by the President's Information Technology Advisory Committee (PITAC) on computational science, whose publication was punctuated by the nearly simultaneous dissolution of PITAC itself. In the same spirit that even a broken clock is right twice a day (well, at least the old analog clocks were!), maybe the President did us all a favor by making it clear that advice about information technology isn't advice that the administration wants to hear — at least not this advice. I want to spend as little energy as possible pointing out the shortcomings of a report that has fallen on deaf ears. Instead, let's open a fresh conversation among ourselves, with our colleagues, and even with our fellow citizens, about the real roadmap for computational science and its potential to transform science and mathematics education for all.

Over many years, the high performance computing community, in my opinion, has lost its focus because it has forgotten that the noun in “computational science” is “science.” As supercomputer centers matured, those whose leadership failed to keep the quality of science as their driving motivation lost some of their relevance, and some even closed. Those centers that kept science at the forefront fared better. The evaluation of the NSF advanced computational infrastructure program suggested that while there was still a need for well-run “cycle shops,” the leading-edge high performance computing centers as then constituted were, for the most part, not likely to lead scientific teams to significant advances in science. They had gotten too caught up in the boxes and wires, and not in what the boxes and wires could do. And as budget priorities have changed, there hasn't been a huge outcry even from the science community, because so much of computational science is being accomplished day in and day out at a scale that does not call for the biggest machines concentrated in a few centers.

Consider one piece of evidence that the PITAC authors missed the chance to make their case for a broader impact of computational science: the most compelling science challenges that face us (challenges that do, in fact, justify a national effort at the large end of the spectrum) were relegated to the appendices of the report. The main recommendation is to sustain software centers, not science. As Stan Lee would say, “'Nuff said.”

Dan Warner, a professor of mathematical sciences at Clemson University and one of the co-founders of Shodor, a national resource in computational science education, recently put the situation very clearly. In considering the vast oceans of data that are being generated by a variety of observational laboratories, he observed, “It isn't whether we have more chips processing the data, but whether we have more neurons. We need many more people engaged in the conduct of science, and computational science is a wonderful way to bring people into science.”

Our challenge is to see that computational science education is a most effective means of addressing a larger issue: quantitative reasoning. In simple terms, we still have to ensure our children actually grow up knowing how to compare quantities, even if the SAT no longer tests it! In a relevant context of measuring and comparing, of observing and conjecturing, students need to master fractions, decimals, percents, and ratios, and reading and interpreting graphs — not through repeated testing, but through minds-on exploration. With an administration caught up in the sloganeering of “No Child Left Behind” (which is really “No Child Allowed to Get Ahead”…don't get me started!), we have found that computational approaches to science education (the effective use of computational tools and visualization to teach the concepts of math and science) are as important to stress as, or more important than, education in “computational science” itself (teaching the process of building and testing a numerical model).

As reported in most major papers last week, it seems programming has lost its luster. Modeling the world hasn't. Our computational science classes at Shodor for middle school and high school students (http://www.shodor.org/succeed) are in full gear now, and the students learn everything from system dynamics to agent-based modeling, data analysis, and visualization. The focus, though, is not the computer but what the computer can help one learn about the world. Students want to focus more on content-driven disciplines. And that is the strength of computational science, because modern math and science are more about pattern recognition and characterization than mere symbol manipulation. The tools of computational science can open up avenues of exploration for students in ways that even direct observation can't. The observation is paramount, but the observation is made in the context of a scientific model that is implemented on the computer. The science is at the heart of computational science.
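
To make the contrast concrete, here is a minimal, purely illustrative sketch in Python of the kind of system-dynamics exercise described above: a single population “stock” stepped forward with Euler's method. It is my own toy example, not code drawn from Shodor's SUCCEED materials, and the parameter values are arbitrary.

```python
# A toy system-dynamics model (logistic population growth), illustrative only;
# it is not taken from Shodor's curriculum. The science question is what the
# curve does and why growth slows near the carrying capacity, not the code.

def logistic_growth(p0=10.0, rate=0.5, capacity=1000.0, dt=0.1, years=25):
    """Step a single population stock forward with Euler's method."""
    population, trajectory = p0, []
    for step in range(int(years / dt)):
        trajectory.append((step * dt, population))
        # Inflow to the stock: growth is throttled as population nears capacity.
        population += rate * population * (1 - population / capacity) * dt
    return trajectory

if __name__ == "__main__":
    for t, p in logistic_growth()[::25]:   # print every 2.5 simulated years
        print(f"t = {t:5.1f} years, population = {p:7.1f}")
```

Even in a toy model like this, the questions that matter are the scientific ones: what does the curve look like, and why does the growth level off?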

For several years now, I have been sharing with faculty and teachers a simplification of the process referred to as “the scientific method.” Basically, we can boil down the process of science (the acquisition of sure knowledge) to four basic questions:

  • What can I observe?
  • What can I learn from these observations?
  • How sure am I that I am right?
  • Why should I care?

A well-balanced experience with an interactive model, or an exploration and visualization of a dataset, can go a long way toward teaching the process of science, with the added benefit that more students will actually want to be scientists.

One approach to bringing computational science to the masses is to enlist the help of many in analyzing the overwhelming data being generated by a number of space- and land-based projects, from star surveys to earthquake monitoring, from census data to on-line archives of historical records. By incorporating the exploration of real data (and there is so much of it yet to be explored) into the learning of math and science, starting in the middle grades and continuing through high school and college, we can make education an adventure for the whole human race. Unfortunately, we have many math and science teachers at the elementary and middle school levels who choose to teach at this level because they “don't do math!” Significant work to incorporate models and computational tools into the math education of many students has started to show its benefits by easing some of the math anxiety and showing how the math makes sense. Some materials also show how to seamlessly incorporate these tools into existing curricula in support of standards (see: http://www.shodor.org/interactivate). For these approaches to become more widespread, it will take a wholesale change in schools of education in the pre-service preparation of math and science teachers, which means a massive change in the attitudes of faculty in the sciences and in education.
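
As one illustration of the kind of real-data exploration described above, the Python sketch below tallies earthquake counts by whole-magnitude bin; the file name earthquakes.csv and its magnitude column are hypothetical stand-ins for whatever public dataset a class actually downloads.

```python
# An illustrative sketch only: summarize a (hypothetical) earthquakes.csv file
# with one "magnitude" column, so students can see the pattern in real data.
import csv
from collections import Counter

def magnitude_histogram(path="earthquakes.csv"):
    """Count earthquakes per whole-magnitude bin and print a text histogram."""
    bins = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            bins[int(float(row["magnitude"]))] += 1
    for magnitude in sorted(bins):
        print(f"M{magnitude}: {'#' * bins[magnitude]} ({bins[magnitude]})")

if __name__ == "__main__":
    magnitude_histogram()
```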

Computational science is both content and method. Students should know the basics of the tools of computation, but also use computation to learn the basics of chemistry, biology, physics, and engineering. So many of the texts in use at all levels are wholly lacking. At the very least, they fail to accurately communicate that much of what we know in the sciences is from computational models as much as from direct observation.

We have a long, long way to go. Eric Jakobsson, returning to Illinois from his tour of duty at the National Institutes of Health, reported that, several years into the ten-year plan of the National Institute of General Medical Sciences (NIGMS), little progress has been made in opening up an education and training pathway that integrates physical science, mathematics, and computation at every level with the learning of biology in a problem-solving environment. As a result, perhaps proving that merely having a “road map” does not guarantee success, he related that we are not training American biology researchers with quantitative skills at anywhere near the rate needed to sustain, let alone advance, American biology. Some of that biology requires big iron to manage exponentially growing databases; most of biology requires computational science that uses those databases remotely.

The same can be said for chemistry. Only ten years ago, without a supercomputer, no leading chemist could do significant computational chemistry in a reasonable amount of time; now much of the chemistry can be done with standard packages without recourse to the biggest machines, and real computational chemistry can be part and parcel of every undergraduate (even high school) chemistry course. For instance, the Burroughs Wellcome Fund has recently awarded a grant for a computational chemistry server to be housed at Shodor so that North Carolina high school students will have precisely this resource and experience. True, there are significant problems that require massive computing, but there are many more problems, more relevant to the education and training of computational chemists, that don't. Thom Dunning, a renowned chemist who, as its new director, has taken up the challenge of restoring the “applications” focus of the National Center for Supercomputing Applications, has set an even more ambitious goal: incorporating computational chemistry into every undergraduate chemistry course at the University of Illinois, not just for the select few who may become computational chemists.

Even at the highest level, computational resources are limited and aren't “there yet.” For instance, it would take about a dozen years on the fastest existing supercomputer merely to initiate the computation for a drop of water at the molecular level — that is, to assign initial values for each of the spatial components of position, velocity, and acceleration for each molecule in a single drop. That doesn't mean we shouldn't try to solve large problems, it's just a measure of how far we have to go.
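
For readers who want to check that figure, here is a back-of-the-envelope version of the arithmetic in Python. The drop size and the initialization rate are round-number assumptions of mine, not numbers from the article, but under them the estimate does come out near a dozen years.

```python
# Rough check of the "dozen years" claim; the drop size and machine speed
# below are assumed round numbers, not figures given in the article.
AVOGADRO = 6.022e23
drop_grams = 0.05                          # assume a ~0.05 mL (0.05 g) drop of water
molecules = drop_grams / 18.0 * AVOGADRO   # ~1.7e21 water molecules
values_per_molecule = 9                    # x, y, z of position, velocity, acceleration
total_values = molecules * values_per_molecule

assignments_per_second = 4e13              # assume ~4e13 initializations/s on a 2005-era leader
seconds = total_values / assignments_per_second
years = seconds / 3.15e7                   # ~3.15e7 seconds per year
print(f"{total_values:.2e} values to set, roughly {years:.0f} years just to initialize")
```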

So, back to reality. If we keep thinking that computational science is only for the biggest problems, then it affects only a few, who would be given limited access to limited resources concentrated in a few national centers. If that is the only way that “real science” will get done, we will never convince a doubting Congress of the relevance of computational science the second time around, let alone an administration that may not realize that only one of the three R's actually begins with “R.” To justify an appropriate appropriation for a long-range road map, we have to have a more wide-reaching goal: computational science for everyone at all levels. And that means developing an effective computational approach to science education as well as an effective education in computational science.


HPCwire contributor Dr. Robert M. Panoff is founder and Executive Director of The Shodor Education Foundation, Inc., a non-profit education and research corporation dedicated to the reform and improvement of mathematics and science education through the appropriate incorporation of computational and communication technologies.

He has been a consultant at several national laboratories and is a frequent presenter at NSF-sponsored workshops on visualization, supercomputing, and networking. He has served on the advisory panel for the Applications of Advanced Technology program at NSF and is a founding partner of the NSF-affiliated Corporate and Foundation Alliance.

Dr. Panoff received his B.S. in physics from the University of Notre Dame and his M.A. and Ph.D. in theoretical physics from Washington University in St. Louis, undertaking both pre- and postdoctoral work at the Courant Institute of Mathematical Sciences at New York University.
