July 21, 2021 — Shang-Hua Teng, a University Professor and Seely G. Mudd Professor of Computer Science and Mathematics at the University of Southern California Viterbi School of Engineering, has been honored with a Symposium on Theory of Computing (STOC) Test of Time Award. Teng, together with Daniel A. Spielman of Yale University, received the award from the ACM Special Interest Group on Algorithms and Computation Theory for a paper on smoothed analysis of algorithms originally presented at the STOC conference in 2001.
In the paradigm-shifting paper, “Smoothed analysis of algorithms: why the simplex algorithm usually takes polynomial time,” Teng and Spielman use the concept of smoothed analysis to give a more realistic understanding of an algorithm’s performance, such as its running time.
The concept helps explain a long-debated phenomenon: why do some algorithms work better in practice than theory predicts? Teng and Spielman showed that for many algorithms, most notably the widely used simplex algorithm for linear programming, worst-case inputs are fragile: add a small amount of random noise to the input, as is typical of real-world data, and the algorithm is very likely to run quickly.
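The recipe behind smoothed analysis can be illustrated with a short experiment: take an algorithm's worst case over base inputs, but measure the *expected* cost after each input is randomly perturbed. The sketch below is only an illustration of that idea, not Spielman and Teng's actual proof technique, and the names `quicksort_comparisons`, `smoothed_cost`, and `sigma` are chosen for this sketch; quicksort with a fixed pivot stands in as a toy algorithm with bad worst-case inputs.

```python
import random

def quicksort_comparisons(xs):
    """Count comparisons made by quicksort with a fixed first-element
    pivot -- a toy stand-in for an algorithm whose worst case is bad."""
    if len(xs) <= 1:
        return 0
    pivot, rest = xs[0], xs[1:]
    lo = [x for x in rest if x < pivot]
    hi = [x for x in rest if x >= pivot]
    return len(rest) + quicksort_comparisons(lo) + quicksort_comparisons(hi)

def smoothed_cost(cost, base_inputs, sigma, trials=50, seed=0):
    """Estimate the smoothed cost: the worst case over base inputs of the
    average cost after Gaussian perturbation with standard deviation sigma."""
    rng = random.Random(seed)
    worst = 0.0
    for xs in base_inputs:
        avg = sum(
            cost([x + rng.gauss(0.0, sigma) for x in xs])
            for _ in range(trials)
        ) / trials
        worst = max(worst, avg)
    return worst

# An already-sorted input is a worst case for first-pivot quicksort:
sorted_input = list(range(50))
print(quicksort_comparisons(sorted_input))  # 50*49/2 = 1225 comparisons
# Perturbing the input shuffles the order, so the expected count drops:
print(smoothed_cost(quicksort_comparisons, [sorted_input], sigma=25.0))
```

The perturbation magnitude `sigma` is the knob of the analysis: the larger the noise relative to the input, the further the expected cost falls below the worst case.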
The study’s findings have been applied to practical algorithms in countless applications, including faster internet communications, deep learning, data mining, differential privacy, game theory, and personalized recommendation systems.
Over the years, some researchers from operations research, network systems, data mining, and machine learning told me that they used methods inspired by smoothed analysis in their work. Of course, practical algorithmic behaviors are far more complex than what our theory can capture, which is why we and others are continuing to look for ways to develop better theories for practice.
How did you and Professor Spielman meet?
I first met Dan in 1990 when he, then a junior at Yale, gave a seminar at CMU, where I was a PhD student; I was his student host. We reconnected in 1992 in the MIT Mathematics Department, when he arrived as a PhD student and I joined as an instructor, and we became lifelong friends.
When you were both working on this paper, did you have any idea it would have such an enormous and long-lasting impact?
Twenty years ago, like many in our field, Dan and I recognized the significance of the challenge that motivated our paper: closing the theory-practice gap for algorithms. The simplex method was often mentioned as an example where practical performance defies theoretical prediction. We believed that the theory-practice gap would continue to be a fundamental subject for computing.
We were also encouraged by the responses to our initial work from scientists and researchers who were closer to practical algorithm design and optimization than we were. Their feedback reassured us that we were taking meaningful steps toward capturing the practical behavior of algorithms.
As theoreticians, Dan and I enjoyed the conceptual formulation of smoothed analysis and the technical interplay of probability, high-dimensional geometry, and mathematical programming in our work. It is exciting to develop a theory that is relevant to some aspect of practice, and a great honor indeed to have my work recognized by my peers.
Coming back to the present day, what have you been working on recently? Has the pandemic impacted your research?
During this historic moment, I did find one area of mathematics soothing: recreational mathematics. When I was a student, I used to read Scientific American, and I always enjoyed the mathematical puzzles and games in the magazine. When I was teaching at Boston University, one of my PhD students, Kyle Burke, was super passionate about and gifted in puzzles and games. He wrote a thesis in 2009 with a cool title: “Science for Fun: New Impartial Board Games.”
Three years ago, he recommended a talented undergraduate, Matt Ferland, to be a PhD student in our department. In the Covid-era Zoom world, Matt, Kyle, and I have been studying several fundamental problems in Combinatorial Game Theory (a more studious name for recreational mathematics), including board games that incorporate quantum-inspired elements.
We also designed new board games based on mathematical and computer science problems. In a recent paper, we solved two long-standing problems in this field that had been open since the 1980s and 1990s. These results involve mathematical extensions of the word-chain games we played as kids. I have also started playing these games with my 8-year-old daughter. (One of Teng’s games is playable here.)
Source: USC Viterbi