Pioneers of Modern Computer Graphics Recognized With ACM A.M. Turing Award

March 18, 2020

NEW YORK, March 18, 2020 — ACM, the Association for Computing Machinery, today named Patrick M. (Pat) Hanrahan and Edwin E. (Ed) Catmull recipients of the 2019 ACM A.M. Turing Award for fundamental contributions to 3-D computer graphics, and the revolutionary impact of these techniques on computer-generated imagery (CGI) in filmmaking and other applications. Catmull is a computer scientist and former president of Pixar and Walt Disney Animation Studios. Hanrahan, a founding employee of Pixar, is a professor in the Computer Graphics Laboratory at Stanford University.

Ed Catmull and Pat Hanrahan have fundamentally influenced the field of computer graphics through conceptual innovation and contributions to both software and hardware. Their work has had a revolutionary impact on filmmaking, leading to a new genre of entirely computer-animated feature films beginning 25 years ago with Toy Story and continuing to the present day.

Today, 3-D computer animated films represent a wildly popular genre in the $138 billion global film industry. 3-D computer imagery is also central to the booming video gaming industry, as well as the emerging virtual reality and augmented reality fields. Catmull and Hanrahan made pioneering technical contributions which remain integral to how today’s CGI imagery is developed. Additionally, their insights into programming graphics processing units (GPUs) have had implications beyond computer graphics, impacting diverse areas including data center management and artificial intelligence.

The ACM A.M. Turing Award, often referred to as the “Nobel Prize of Computing,” carries a $1 million prize, with financial support provided by Google, Inc. It is named for Alan M. Turing, the British mathematician who articulated the mathematical foundation and limits of computing. Hanrahan and Catmull will formally receive the 2019 ACM A.M. Turing Award at ACM’s annual awards banquet on Saturday, June 20, 2020 in San Francisco, California.

“CGI has transformed the way films are made and experienced, while also profoundly impacting the broader entertainment industry,” said ACM President Cherri M. Pancake. “We are especially excited to recognize Pat Hanrahan and Ed Catmull, because computer graphics is one of the largest and most dynamic communities within ACM, as evidenced by the annual ACM SIGGRAPH conference. At the same time, Catmull and Hanrahan’s contributions demonstrate that advances in one specialization of computing can have a significant influence on other areas of the field. For example, Hanrahan’s work with shading languages for GPUs has led to their use as general-purpose computing engines for a wide range of areas, including my own specialization of high-performance computing.”

“Because 3-D computer graphic imagery is now so pervasive, we often forget what the field was like just a short time ago when a video game like Pong, which consisted of a white dot bouncing between two vertical white lines, was the leading-edge technology,” said Jeff Dean, Google Senior Fellow and SVP, Google AI. “The technology keeps moving forward, yet what Hanrahan and Catmull developed decades ago remains standard practice in the field today—that’s quite impressive. It’s important to recognize scientific contributions in CGI technology and educate the public about a discipline that will impact many areas in the coming years—virtual and augmented reality, data visualization, education, medical imaging, and more.”

Background and Development of Recognized Technical Contributions

Catmull received his PhD in Computer Science from the University of Utah in 1974. His advisors included Ivan Sutherland, a father of computer graphics and the 1988 ACM A.M. Turing Award recipient. In his PhD thesis, Catmull introduced groundbreaking techniques for displaying curved patches instead of polygons, out of which arose two new techniques: Z-buffering (also described by Wolfgang Straßer at the time), which manages image depth coordinates in computer graphics, and texture mapping, in which a 2-D surface texture is wrapped around a three-dimensional object. While at Utah, Catmull also created a new method of representing a smooth surface via the specification of a coarser polygon mesh. After graduating, he collaborated with Jim Clark, who would later found Silicon Graphics and Netscape, on the Catmull-Clark subdivision surface, which is now the preeminent surface patch used in animation and special effects in movies. Catmull’s techniques have played an important role in developing photorealistic graphics and eliminating “jaggies,” the rough edges around shapes that were a hallmark of primitive computer graphics.
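
The z-buffer idea can be illustrated with a minimal sketch (hypothetical names, plain Python for clarity): each pixel keeps the depth of the nearest surface drawn so far, and a new fragment overwrites the pixel only if it is closer to the camera.

```python
# A minimal sketch of the z-buffer idea (illustrative, not Catmull's code):
# every pixel stores the depth of the nearest surface drawn so far, and a
# new fragment may overwrite a pixel only when it is closer to the camera.

WIDTH, HEIGHT = 4, 3
FAR = float("inf")

color_buffer = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]
depth_buffer = [[FAR for _ in range(WIDTH)] for _ in range(HEIGHT)]

def write_fragment(x, y, depth, color):
    """Depth test: smaller depth means closer to the camera."""
    if depth < depth_buffer[y][x]:
        depth_buffer[y][x] = depth
        color_buffer[y][x] = color

# Two overlapping "surfaces": a far one (depth 5.0) across the middle row,
# then a near fragment (depth 2.0) and a hidden fragment (depth 9.0).
for x in range(WIDTH):
    write_fragment(x, 1, 5.0, "A")
write_fragment(1, 1, 2.0, "B")   # closer: overwrites the pixel
write_fragment(2, 1, 9.0, "C")   # farther: correctly discarded

for row in color_buffer:
    print("".join(row))          # middle row prints "ABAA"
```

However the scene is drawn, the depth test guarantees that hidden surfaces never show through nearer ones, which is exactly the problem early polygon renderers struggled with.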

After the University of Utah, Catmull founded the New York Institute of Technology (NYIT) Computer Graphics Lab, one of the earliest dedicated computer graphics labs in the US. Even then, Catmull dreamed of making a computer-animated movie. He came a step closer to his goal in 1979, when George Lucas hired him; Catmull in turn hired many of the people who made the advances that pushed graphics toward photorealistic images. At Lucasfilm, Catmull and colleagues continued to develop innovations in 3-D computer graphic animation, in an industry still dominated by traditional 2-D techniques. In 1986, Steve Jobs bought Lucasfilm’s Computer Animation Division and renamed it Pixar, with Catmull as its President.

One of Catmull’s first hires at Pixar was Pat Hanrahan. Hanrahan had received a PhD in Biophysics from the University of Wisconsin-Madison in 1985 and had worked briefly at NYIT’s Computer Graphics Laboratory before joining Pixar.

Working with Catmull and other members of the Pixar team, Hanrahan was the lead architect of a new kind of graphics system, which allowed curved shapes to be rendered with realistic material properties and lighting. A key idea in this system, later named RenderMan, was the shader, a program used to shade CGI images. RenderMan’s shading functions separated light reflection behavior from the geometric shapes, computing the color, transparency, and texture at points on the shapes. The RenderMan system also incorporated the Z-buffering and subdivision surface innovations that Catmull had earlier contributed to the field.
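
That separation of concerns can be sketched in a few lines (plain Python with hypothetical names; RenderMan’s actual shading language is a C-like domain-specific language): the shader computes appearance at a single surface point and knows nothing about the geometry that produced it.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

# A "surface shader" in the spirit of RenderMan's separation of concerns:
# it computes color at one shading point from the local normal and light
# direction, knowing nothing about whether the underlying geometry is a
# sphere, a curved patch, or a polygon mesh.
def matte_shader(normal, light_dir, base_color):
    """Simple Lambertian (diffuse) shading at one surface point."""
    intensity = max(0.0, dot(normalize(normal), normalize(light_dir)))
    return tuple(channel * intensity for channel in base_color)

# Any geometry module only has to hand the shader normals (and, in a real
# system, positions, texture coordinates, etc.) at the points it renders.
print(matte_shader((0.0, 1.0, 0.0), (0.5, 1.0, 0.3), (1.0, 0.4, 0.2)))
```

Because appearance lives in the shader rather than in the geometry, artists can swap materials freely without touching the models, which is what made the approach so productive in film work.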

During his time at Pixar, Hanrahan also developed techniques for volume rendering, which allows a CGI artist to render a 2-D projection of a 3-D data set, such as a puff of smoke. In one of his most cited papers, Hanrahan, with co-author Marc Levoy, introduced light field rendering, a method for giving the viewer the sense that they are flying through scenes by generating new views from arbitrary points without depth information or feature matching. Hanrahan went on to develop techniques for portraying skin and hair using subsurface scattering, and for rendering complex lighting effects – so-called global illumination or GI – using Monte Carlo ray tracing.
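
To make the last idea concrete: Monte Carlo ray tracing estimates the integrals that govern illumination by averaging randomly sampled directions. The toy estimator below (plain Python, illustrative names, not Hanrahan’s actual algorithms) computes the diffuse irradiance at a surface point under a uniform white sky, whose exact value is π.

```python
import math
import random

# Toy Monte Carlo estimator in the spirit of Monte Carlo ray tracing:
# the light arriving at a surface point is an integral over the hemisphere
# above it, approximated here by averaging random directional samples.
# Under a uniform "sky" of radiance 1, the diffuse irradiance is exactly pi.

def sample_hemisphere():
    """Uniformly sample a direction on the unit hemisphere (z >= 0)."""
    z = random.random()                       # cos(theta), uniform in [0, 1)
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_irradiance(num_samples=100_000):
    pdf = 1.0 / (2.0 * math.pi)               # uniform hemisphere density
    total = 0.0
    for _ in range(num_samples):
        cos_theta = sample_hemisphere()[2]    # angle to the surface normal
        total += cos_theta / pdf              # sky radiance assumed 1.0
    return total / num_samples

print(estimate_irradiance())                  # approaches pi as samples grow
```

Production renderers replace the uniform sky with recursively traced light paths, but the principle is the same: more samples, less noise, and no restriction on the complexity of the lighting.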

Hanrahan published his RenderMan research in a seminal 1990 paper that was presented at ACM SIGGRAPH. It would take five more years, however, for the computing hardware to develop to a point where the full-length 3-D computer animated movie Toy Story could be produced using Hanrahan’s RenderMan system.

Under Catmull’s leadership, Pixar would make a succession of successful films using RenderMan. Pixar also licensed RenderMan to other film companies. The software has been used in 44 of the last 47 films nominated for an Academy Award in the Visual Effects category, including Avatar, Titanic, Beauty and the Beast, The Lord of the Rings trilogy, and the Star Wars prequels, among others. RenderMan remains the standard workflow for CGI visual effects.

After he left Pixar in 1989, Hanrahan held academic posts at Princeton and Stanford universities. Beginning in the 1990s, he and his students extended the RenderMan shading language to work in real time on the powerful GPUs that were beginning to enter the marketplace. The programming languages for GPUs that Hanrahan and his students developed led to commercial versions (including the OpenGL shading language) that revolutionized the writing of video games.

The prevalence and variety of shading languages that were being used on GPUs ultimately required the GPU hardware designers to develop more flexible architectures. These architectures, in turn, allowed the GPUs to be used in a variety of computing contexts, including running algorithms for high-performance computing applications, and training machine learning algorithms on massive datasets for artificial intelligence applications. In particular, Hanrahan and his students developed Brook, a language for GPUs that eventually led to NVIDIA’s CUDA.
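
The stream-programming model that Brook popularized can be sketched as follows (plain Python for illustration; Brook itself was a C-like language, and these names are hypothetical): a “kernel” is a pure function applied uniformly to every element of the input streams, which is exactly the pattern a GPU executes in parallel.

```python
# A sketch of the stream-programming model Brook popularized: a "kernel"
# is a pure function applied elementwise to input streams. Expressed this
# way, the computation has no ordering dependencies, so a GPU is free to
# run every element in parallel.

def map_kernel(kernel, *streams):
    """Apply a kernel elementwise across equal-length input streams."""
    return [kernel(*elements) for elements in zip(*streams)]

def saxpy_kernel(a, x, y):
    """The classic SAXPY operation, a*x + y, on one element at a time."""
    return a * x + y

xs = [1.0, 2.0, 3.0, 4.0]
ys = [10.0, 20.0, 30.0, 40.0]
print(map_kernel(lambda x, y: saxpy_kernel(2.0, x, y), xs, ys))
# -> [12.0, 24.0, 36.0, 48.0]
```

The same map-a-kernel-over-data pattern is the core of CUDA and of the GPU training loops used in machine learning today, which is why a shading-language lineage ended up mattering far beyond graphics.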

Catmull remained at Pixar, which was later acquired by The Walt Disney Company, for over 30 years. Under his leadership, dozens of researchers at Pixar and Disney invented and published foundational technologies (including image compositing, motion blur, and cloth simulation) that contributed to computer-animated films and to computer graphics more broadly. Both Hanrahan and Catmull have received awards from ACM SIGGRAPH, as well as from the Academy of Motion Picture Arts & Sciences, for their technical contributions.

Biographical Background

Patrick M. Hanrahan
Pat Hanrahan is the Canon USA Professor of Computer Science and Electrical Engineering in the Computer Graphics Laboratory at Stanford University. He received a Bachelor of Science degree in Nuclear Engineering (1977) and a PhD in Biophysics (1985) from the University of Wisconsin-Madison. He held positions at the New York Institute of Technology and Digital Equipment Corporation in the 1980s before serving as a Senior Scientist at Pixar (1986-1989). He later served as an Associate Professor at Princeton University (1991-1994) and Professor at Stanford University (1994-present), where he has advised more than 40 PhD students. Hanrahan co-founded Tableau Software, a data analytics company that was acquired by Salesforce in August 2019.

Hanrahan’s many honors include the 2003 ACM SIGGRAPH Steven A. Coons Award for Outstanding Creative Contributions to Computer Graphics. He is a Fellow of ACM and of the American Academy of Arts & Sciences, is a member of the National Academy of Engineering, and has been inducted into many other prestigious organizations.

Edwin E. Catmull
Ed Catmull is co-founder of Pixar Animation Studios and a former President of Pixar and Walt Disney Animation Studios. He earned Bachelor of Science degrees in Physics and Computer Science (1970) and a PhD in Computer Science (1974) from the University of Utah. During his career, Catmull was Vice President of the Computer Division of Lucasfilm Ltd., where he managed development in areas of computer graphics, video editing, video games and digital audio. He founded the Computer Graphics Lab at the New York Institute of Technology.

Catmull received the 1993 ACM SIGGRAPH Steven A. Coons Award for Outstanding Creative Contributions to Computer Graphics, and the 2006 IEEE John von Neumann Medal for fundamental contributions to computer graphics and a pioneering use of computer animation in motion pictures. He is a Fellow of ACM and of the Visual Effects Society. He is a member of the Academy of Motion Picture Arts & Sciences and of the National Academy of Engineering.

About the ACM A.M. Turing Award

The A.M. Turing Award was named for Alan M. Turing, the British mathematician who articulated the mathematical foundation and limits of computing, and who was a key contributor to the Allied cryptanalysis of the Enigma cipher during World War II. Since its inception in 1966, the Turing Award has honored the computer scientists and engineers who created the systems and underlying theoretical foundations that have propelled the information technology industry.

About ACM

ACM, the Association for Computing Machinery, is the world’s largest educational and scientific computing society, uniting computing educators, researchers and professionals to inspire dialogue, share resources and address the field’s challenges. ACM strengthens the computing profession’s collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for life-long learning, career development, and professional networking.


Source: ACM 
