Pioneers of Modern Computer Graphics Recognized With ACM A.M. Turing Award

March 18, 2020

NEW YORK, March 18, 2020 — ACM, the Association for Computing Machinery, today named Patrick M. (Pat) Hanrahan and Edwin E. (Ed) Catmull recipients of the 2019 ACM A.M. Turing Award for fundamental contributions to 3-D computer graphics, and the revolutionary impact of these techniques on computer-generated imagery (CGI) in filmmaking and other applications. Catmull is a computer scientist and former president of Pixar and Disney Animation Studios. Hanrahan, a founding employee at Pixar, is a professor in the Computer Graphics Laboratory at Stanford University.

Image courtesy of ACM.

Ed Catmull and Pat Hanrahan have fundamentally influenced the field of computer graphics through conceptual innovation and contributions to both software and hardware. Their work has had a revolutionary impact on filmmaking, leading to a new genre of entirely computer-animated feature films beginning 25 years ago with Toy Story and continuing to the present day.

Today, 3-D computer animated films represent a wildly popular genre in the $138 billion global film industry. 3-D computer imagery is also central to the booming video gaming industry, as well as the emerging virtual reality and augmented reality fields. Catmull and Hanrahan made pioneering technical contributions which remain integral to how today’s CGI imagery is developed. Additionally, their insights into programming graphics processing units (GPUs) have had implications beyond computer graphics, impacting diverse areas including data center management and artificial intelligence.

The ACM A.M. Turing Award, often referred to as the “Nobel Prize of Computing,” carries a $1 million prize, with financial support provided by Google, Inc. It is named for Alan M. Turing, the British mathematician who articulated the mathematical foundation and limits of computing. Hanrahan and Catmull will formally receive the 2019 ACM A.M. Turing Award at ACM’s annual awards banquet on Saturday, June 20, 2020 in San Francisco, California.

“CGI has transformed the way films are made and experienced, while also profoundly impacting the broader entertainment industry,” said ACM President Cherri M. Pancake. “We are especially excited to recognize Pat Hanrahan and Ed Catmull, because computer graphics is one of the largest and most dynamic communities within ACM, as evidenced by the annual ACM SIGGRAPH conference. At the same time, Catmull and Hanrahan’s contributions demonstrate that advances in one specialization of computing can have a significant influence on other areas of the field. For example, Hanrahan’s work with shading languages for GPUs has led to their use as general-purpose computing engines for a wide range of areas, including my own specialization of high-performance computing.”

“Because 3-D computer graphic imagery is now so pervasive, we often forget what the field was like just a short time ago when a video game like Pong, which consisted of a white dot bouncing between two vertical white lines, was the leading-edge technology,” said Jeff Dean, Google Senior Fellow and SVP, Google AI. “The technology keeps moving forward, yet what Hanrahan and Catmull developed decades ago remains standard practice in the field today—that’s quite impressive. It’s important to recognize scientific contributions in CGI technology and educate the public about a discipline that will impact many areas in the coming years—virtual and augmented reality, data visualization, education, medical imaging, and more.”

Background and Development of Recognized Technical Contributions

Catmull received his PhD in Computer Science from the University of Utah in 1974. His advisors included Ivan Sutherland, a father of computer graphics and the 1988 ACM A.M. Turing Award recipient. In his PhD thesis, Catmull introduced groundbreaking techniques for displaying curved patches instead of polygons, from which arose two new techniques: Z-buffering (also described by Wolfgang Straßer at the time), which manages image depth coordinates in computer graphics, and texture mapping, in which a 2-D surface texture is wrapped around a three-dimensional object. While at Utah, Catmull also created a new method of representing a smooth surface via the specification of a coarser polygon mesh. After graduating, he collaborated with Jim Clark, who would later found Silicon Graphics and Netscape, on the Catmull-Clark Subdivision Surface, which is now the preeminent surface patch used in animation and special effects in movies. Catmull’s techniques have played an important role in developing photo-real graphics, and eliminating “jaggies,” the rough edges around shapes that were a hallmark of primitive computer graphics.
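
The Z-buffer idea can be illustrated with a toy sketch (illustrative Python, not Catmull's original code): each pixel stores the depth of the nearest surface drawn so far, so overlapping shapes resolve their visibility correctly no matter what order they are drawn in.

```python
# Toy sketch of Z-buffering: a color buffer holds what the viewer sees,
# and a parallel depth buffer holds the distance of the nearest fragment
# written to each pixel so far.

WIDTH, HEIGHT = 4, 3

color_buffer = [["bg"] * WIDTH for _ in range(HEIGHT)]
depth_buffer = [[float("inf")] * WIDTH for _ in range(HEIGHT)]

def plot(x, y, z, color):
    """Write a fragment only if it is closer than what is already stored."""
    if z < depth_buffer[y][x]:
        depth_buffer[y][x] = z
        color_buffer[y][x] = color

plot(1, 1, 5.0, "red")    # far fragment lands first
plot(1, 1, 2.0, "blue")   # nearer fragment overwrites it
plot(1, 1, 9.0, "green")  # farther fragment is rejected
print(color_buffer[1][1])  # prints: blue
```

Because the depth test is purely local to each pixel, surfaces need no global sorting before drawing, which is what made the technique practical for complex scenes.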

After the University of Utah, Catmull founded the New York Institute of Technology (NYIT) Computer Graphics Lab, one of the earliest dedicated computer graphics labs in the US. Even at that time, Catmull dreamed of making a computer-animated movie. He came a step closer to his goal in 1979, when George Lucas hired him; Catmull in turn hired many of the people who made the advances that pushed graphics toward photorealistic images. At Lucasfilm, Catmull and colleagues continued to develop innovations in 3-D computer graphic animation, in an industry that was still dominated by traditional 2-D techniques. In 1986, Steve Jobs bought Lucasfilm’s Computer Animation Division and renamed it Pixar, with Catmull as its President.

One of Catmull’s first hires at Pixar was Pat Hanrahan. Hanrahan had received a PhD in Biophysics from the University of Wisconsin-Madison in 1985 and had worked briefly at NYIT’s Computer Graphics Laboratory before joining Pixar.

Working with Catmull and other members of the Pixar team, Hanrahan was the lead architect of a new kind of graphics system, which allowed curved shapes to be rendered with realistic material properties and lighting. A key idea in this system, later named RenderMan, was the shader: a small program used to shade points on a CGI image. RenderMan’s shader functions separated the light-reflection behavior from the geometric shapes, and computed the color, transparency, and texture at points on the shapes. The RenderMan system also incorporated the Z-buffering and subdivision surface innovations that Catmull had earlier contributed to the field.
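
The separation of reflection behavior from geometry can be sketched in a few lines (hypothetical Python, not RenderMan's actual shading language): the shader is a standalone function of surface orientation, light direction, and material color, evaluated at any point on any shape.

```python
# Illustrative sketch of the shader idea: the light-reflection behavior
# lives in its own function, independent of the geometry that supplies
# the surface points and normals.

def lambert_shader(normal, light_dir, base_color):
    """A simple diffuse shader: brightness depends on the angle to the light."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    intensity = max(dot, 0.0)  # surfaces facing away receive no light
    return tuple(c * intensity for c in base_color)

# The same shader can be applied to surface points of any shape:
color = lambert_shader(normal=(0.0, 0.0, 1.0),
                       light_dir=(0.0, 0.0, 1.0),
                       base_color=(0.8, 0.2, 0.2))
print(color)  # prints: (0.8, 0.2, 0.2) — fully lit, facing the light
```

Swapping in a different shader changes the material (matte, metallic, glassy) without touching the geometry, which is the flexibility that made the approach so influential.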

During his time at Pixar, Hanrahan also developed techniques for volume rendering, which allows a CGI artist to render a 2-D projection of a 3-D data set, such as a puff of smoke. In one of his most cited papers, Hanrahan, with co-author Marc Levoy, introduced light field rendering, a method for giving the viewer the sense that they are flying through scenes by generating new views from arbitrary points without depth information or feature matching. Hanrahan went on to develop techniques for portraying skin and hair using subsurface scattering, and for rendering complex lighting effects – so-called global illumination or GI – using Monte Carlo ray tracing.
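
The Monte Carlo principle behind such rendering can be illustrated with a toy example (not rendering code): an integral is estimated by averaging random samples, exactly as a ray tracer estimates incoming light by averaging the contributions of randomly chosen rays.

```python
import random

# Toy illustration of Monte Carlo integration: estimate the average of
# f(x) = x^2 over [0, 1], whose exact value is 1/3, by averaging random
# samples. A Monte Carlo ray tracer estimates light arriving at a surface
# the same way, by averaging over randomly sampled ray directions.

random.seed(42)  # fixed seed so the run is reproducible

N = 100_000
estimate = sum(random.random() ** 2 for _ in range(N)) / N
print(f"Monte Carlo estimate: {estimate:.3f} (exact: 0.333...)")
```

The estimate's error shrinks as more samples are taken, which is why early Monte Carlo renders look noisy and clean up as they compute longer.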

Hanrahan published his RenderMan research in a seminal 1990 paper that was presented at ACM SIGGRAPH. It would take five more years, however, for the computing hardware to develop to a point where the full-length 3-D computer animated movie Toy Story could be produced using Hanrahan’s RenderMan system.

Under Catmull’s leadership, Pixar would make a succession of successful films using RenderMan. Pixar also licensed RenderMan to other film companies. The software has been used in 44 of the last 47 films nominated for an Academy Award in the Visual Effects category, including Avatar, Titanic, Beauty and the Beast, The Lord of the Rings trilogy, and the Star Wars prequels, among others. RenderMan remains the standard workflow for CGI visual effects.

After he left Pixar in 1989, Hanrahan held academic posts at Princeton and Stanford universities. Beginning in the 1990s, he and his students extended the RenderMan shading language to work in real time on the powerful GPUs that were beginning to enter the marketplace. The GPU programming languages that Hanrahan and his students developed led to commercial versions (including the OpenGL shading language) that revolutionized the writing of video games.

The prevalence and variety of shading languages that were being used on GPUs ultimately required the GPU hardware designers to develop more flexible architectures. These architectures, in turn, allowed the GPUs to be used in a variety of computing contexts, including running algorithms for high-performance computing applications, and training machine learning algorithms on massive datasets for artificial intelligence applications. In particular, Hanrahan and his students developed Brook, a language for GPUs that eventually led to NVIDIA’s CUDA.
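
The stream-programming model behind Brook can be sketched as follows (illustrative Python, not Brook syntax): a "kernel" is a pure function applied independently to every element of its input streams, which is exactly the pattern a GPU can run massively in parallel, whether the elements are pixels, matrix entries, or neural-network activations.

```python
# Hedged sketch of the stream-programming model: a kernel maps
# elementwise over parallel input streams, with no dependence between
# elements — the property that lets GPU hardware process them all at once.

def map_kernel(kernel, *streams):
    """Apply a kernel elementwise across parallel input streams."""
    return [kernel(*elems) for elems in zip(*streams)]

# The same abstraction covers both graphics and general-purpose work:
brightened = map_kernel(lambda p: min(p + 40, 255), [10, 200, 250])   # image op
dot_terms  = map_kernel(lambda a, b: a * b, [1.0, 2.0], [3.0, 4.0])   # linear algebra
print(brightened, sum(dot_terms))  # prints: [50, 240, 255] 11.0
```

It was this generality, a graphics-shaped abstraction that happens to fit many data-parallel problems, that carried GPUs beyond rendering and into scientific computing and AI.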

Catmull remained at Pixar, which later became a subsidiary of The Walt Disney Company, for over 30 years. Under his leadership, dozens of researchers at Pixar and Disney Animation Studios invented and published foundational technologies (including image compositing, motion blur, and cloth simulation) that contributed to computer-animated films and to computer graphics more broadly. Both Hanrahan and Catmull have received awards from ACM SIGGRAPH, as well as from the Academy of Motion Picture Arts & Sciences, for their technical contributions.

Biographical Background

Patrick M. Hanrahan
Pat Hanrahan is the Canon Professor of Computer Science and Electrical Engineering in the Computer Graphics Laboratory at Stanford University. He received a Bachelor of Science degree in Nuclear Engineering (1977) and a PhD in Biophysics (1985) from the University of Wisconsin-Madison. He held positions at the New York Institute of Technology and Digital Equipment Corporation in the 1980s before serving as a Senior Scientist at Pixar (1986-1989). He later served as an Associate Professor at Princeton University (1991-1994) and Professor at Stanford University (1994-present), where he has advised more than 40 PhD students. Hanrahan co-founded Tableau Software, a data analytics company that was acquired by Salesforce in August 2019.

Hanrahan’s many honors include the 2003 ACM SIGGRAPH Steven A. Coons Award for Outstanding Creative Contributions to Computer Graphics. He is a Fellow of ACM and of the American Academy of Arts & Sciences. He is a member of the National Academy of Engineering, in addition to induction into many other prestigious organizations.

Edwin E. Catmull
Ed Catmull is co-founder of Pixar Animation Studios and a former President of Pixar and Walt Disney Animation Studios. He earned Bachelor of Science degrees in Physics and Computer Science (1970) and a PhD in Computer Science (1974) from the University of Utah. During his career, Catmull was Vice President of the Computer Division of Lucasfilm Ltd., where he managed development in areas of computer graphics, video editing, video games and digital audio. He founded the Computer Graphics Lab at the New York Institute of Technology.

Catmull received the 1993 ACM SIGGRAPH Steven A. Coons Award for Outstanding Creative Contributions to Computer Graphics, and the 2006 IEEE John von Neumann Medal for fundamental contributions to computer graphics and a pioneering use of computer animation in motion pictures. He is a Fellow of ACM and of the Visual Effects Society. He is a member of the Academy of Motion Picture Arts & Sciences and of the National Academy of Engineering.

About the ACM A.M. Turing Award

The A.M. Turing Award was named for Alan M. Turing, the British mathematician who articulated the mathematical foundation and limits of computing, and who was a key contributor to the Allied cryptanalysis of the Enigma cipher during World War II. Since its inception in 1966, the Turing Award has honored the computer scientists and engineers who created the systems and underlying theoretical foundations that have propelled the information technology industry.

About ACM

ACM, the Association for Computing Machinery, is the world’s largest educational and scientific computing society, uniting computing educators, researchers and professionals to inspire dialogue, share resources and address the field’s challenges. ACM strengthens the computing profession’s collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for life-long learning, career development, and professional networking.


Source: ACM 
