September 17, 2009
In the wake of this week's HPC on Wall Street conference, where attendees gabbed about the wonders of high performance computing (which we dutifully reported on), it's worth remembering that all this cutting-edge technology failed to prevent the financial meltdown in the fall of 2008 and the subsequent global economic collapse. In fact, it's arguable that superfast computers and networks just sped up the process.
There's plenty of blame to go around, but some have pointed to deficiencies in the technology itself. Specifically, there has been a good deal of criticism of the software models used to calculate the financial risk of the now-discredited collateralized debt obligations (CDOs) and credit default swaps (CDSs). Back in February, I wrote about a particularly questionable mathematical formula that was widely used by quantitative analysts (aka quants) to build these models.
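For readers who want a concrete taste of what such a formula looks like, here is a minimal sketch of a Gaussian copula default-correlation calculation, the kind of construct widely used to couple default risks when pricing CDO tranches. This is my own illustrative toy, not the model from that February piece, and every number in it is invented:

# A minimal sketch (not any bank's actual model) of the Gaussian copula
# approach to joint default probability. All inputs below are invented.
from scipy.stats import norm, multivariate_normal

def joint_default_prob(p_a, p_b, rho):
    """Probability that both A and B default within the horizon, given
    their individual default probabilities and a single correlation
    parameter rho (the model's big simplifying assumption)."""
    # Map each marginal default probability onto a standard normal quantile.
    z_a = norm.ppf(p_a)
    z_b = norm.ppf(p_b)
    # Couple the two names with a bivariate normal CDF parameterized by rho.
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z_a, z_b])

# With a modest correlation the joint risk looks tame...
print(joint_default_prob(0.05, 0.05, 0.2))
# ...but when panic pushes correlations toward 1, joint defaults explode.
print(joint_default_prob(0.05, 0.05, 0.9))

The punch line is that everything hinges on rho, a single number typically estimated from calm-market history; when every borrower starts defaulting together, that assumption quietly collapses.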
With a year of hindsight to draw on, maybe an even clearer picture is emerging. A recent New York Times article attempted to explain the failure of the models in a more systematic way. The author suggests the software was doomed from the start because it didn't factor in human fallibility:
The risk models proved myopic, they say, because they were too simple-minded. They focused mainly on figures like the expected returns and the default risk of financial instruments. What they didn’t sufficiently take into account was human behavior, specifically the potential for widespread panic. When lots of investors got too scared to buy or sell, markets seized up and the models failed.
It's a cautionary tale of what happens when people obsessed with math (quants) meet people obsessed with money (speculators). Perhaps the Wall Street crowd needs to get in touch with people obsessed with people (behavioral scientists).
By the way, the latest financial instruments Wall Street has come up with are called "life settlements." They've actually been around since 2005, but have spread rapidly since then. In a nutshell, it works like this: financial institutions buy up and package lots of life insurance policies into securities, which they can then resell to investors. (Sound familiar?) The payout is made when the people who took out the policies die. The quicker the person dies, the bigger the yield.
Some interesting motivations are in play here. It puts the investors in the position of rooting for the death of the policyholders, which makes one wonder what steps they might take to increase their ROI. Also, since the value of a policy is inversely proportional to the policyholder's life expectancy, it creates a death auction mentality for the buyer and seller. Try to model that, quant-boy.
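For the morbidly curious, here's a rough back-of-the-envelope sketch of why the payout math tilts the way it does. The policy numbers are hypothetical, not from any actual deal: the buyer pays the purchase price up front, keeps paying the premiums, and collects the face value at death, so the annualized return collapses the longer the insured hangs on.

# A toy illustration (numbers invented) of a life settlement's return:
# pay the purchase price now, pay premiums each year, collect the face
# value when the insured dies. The sooner the death, the fatter the yield.

def annualized_return(price, premium, face, years_to_death):
    """Solve for the internal rate of return of the cash flows by bisection."""
    def npv(r):
        value = -price
        for t in range(1, years_to_death + 1):
            value -= premium / (1 + r) ** t          # premium paid each year
        value += face / (1 + r) ** years_to_death    # payout at death
        return value
    lo, hi = -0.99, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return mid

# Hypothetical $1M policy bought for $200K with $20K/year premiums.
for years in (2, 5, 10, 20):
    print(years, "years:", round(annualized_return(200_000, 20_000, 1_000_000, years), 3))

Run it and the yield falls from roughly triple digits if the insured dies within a couple of years to low single digits if they last two decades, which is exactly the incentive problem described above.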
In fact, though, the human element is systematically ignored in lots of models. In another Times article, Nobel Prize-winning economist Paul Krugman argues that economists failed to predict the current crisis because they ignored the fact that irrational behavior from real live people doesn't adhere to a pure free-market model. Krugman summed it up thusly:
As I see it, the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth. Until the Great Depression, most economists clung to a vision of capitalism as a perfect or nearly perfect system. That vision wasn’t sustainable in the face of mass unemployment, but as memories of the Depression faded, economists fell back in love with the old, idealized vision of an economy in which rational individuals interact in perfect markets, this time gussied up with fancy equations.
Besides financial instruments and macroeconomics, what other models could be headed for a hard landing? Well, presumably anything that involves people. One that comes to mind is climate change. There are plenty of global warming simulations out there, but I doubt any of them fully account for the interplay between the physical processes of the atmosphere, geosphere, and biosphere, and people's behavior. Throw in the variable of a carbon tax or, say, a cap-and-trade policy, and now you're talking about a real grand challenge.
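To make the point a bit more concrete, here's a deliberately crude toy, with every number and behavioral assumption mine and purely for illustration, of what it even means to couple a policy-driven human feedback into an emissions trajectory:

# A crude toy (all numbers invented) of coupling a behavioral feedback
# into an emissions trajectory: each year, a rising carbon price nudges
# emitters to cut back, which in turn changes the accumulated total.
def run_toy_scenario(years=50, carbon_price_ramp=2.0):
    emissions = 40.0        # gigatonnes CO2 per year, starting point
    concentration = 400.0   # ppm-like stand-in for the physical state
    price = 0.0             # dollars per tonne
    for _ in range(years):
        price += carbon_price_ramp
        # "Behavioral" response: about 1% of abatement per $10 of carbon price.
        emissions *= (1.0 - 0.001 * price)
        # "Physical" response: a fraction of each year's emissions accumulates.
        concentration += 0.06 * emissions
    return emissions, concentration

print(run_toy_scenario(carbon_price_ramp=0.0))   # no policy feedback
print(run_toy_scenario(carbon_price_ramp=2.0))   # modest carbon price ramp

A real grand-challenge model would replace that one-line "behavioral response" with something defensible, which is precisely the hard part.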
Obviously, trying to apply a mathematical model to human behavior is bound to be tricky, inasmuch as we have only a crude understanding of how individual people think, much less of how large numbers of them interact. But it's hard to imagine how any of these models are going to be of much practical use until we figure out how to incorporate ourselves into them.
Posted by Michael Feldman - September 17, 2009 @ 7:14 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.