August 11, 2011
Conventional wisdom informs us that innovation leads to society's well-being by fostering things like economic growth and higher living standards. It's pretty much accepted that technology advancements in industrialization, computers, medical technology, and business practices are the big drivers. Economists also claim that innovation drives a specific aspect of economic strength, called productivity.
Or at least it should. An article this week in Technology Review points out that at least one innovation measure is on the decline. Researchers have noticed that since 1973, US productivity growth has started to flatten.
Tyler Cowen, Professor of Economics at George Mason University, calls it "The Great Stagnation," which conveniently is the same title as the book he authored. Cowen and others use a measurement called total factor productivity (TFP), which according to Wikipedia "accounts for effects in total output not caused by inputs." Basically it's a metric for how efficiently economic inputs are utilized for production. The idea is that this reflects the rate of technological advancement, aka innovation.
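For the curious, TFP is typically computed as the Solow residual: the output growth left over after accounting for growth in capital and labor. A minimal sketch, assuming the standard Cobb-Douglas production function Y = A * K^alpha * L^(1-alpha) (the input figures below are invented purely for illustration):

```python
# TFP as the Solow residual under a Cobb-Douglas production function.
# alpha (capital's share of income) is conventionally set around 0.3;
# the output/capital/labor numbers are hypothetical.

def tfp(output, capital, labor, alpha=0.3):
    """Solve Y = A * K^alpha * L^(1-alpha) for A."""
    return output / (capital ** alpha * labor ** (1 - alpha))

# Hypothetical economy at two points in time: output grows 10 percent,
# but capital and labor inputs grow only a few percent.
a0 = tfp(output=100.0, capital=300.0, labor=150.0)
a1 = tfp(output=110.0, capital=315.0, labor=153.0)

growth = (a1 / a0 - 1) * 100
print(f"TFP growth: {growth:.2f}%")
```

The residual captures whatever output growth the measured inputs can't explain, which is why economists read it as a proxy for technological progress.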
The chart below tells the sad tale:
The graphic is from a recent report (PDF) compiled by The Hamilton Project that tries to make some sense of what's happening to innovation in the US. I have several problems with the report, but most of them center on the linkage between this TFP metric and innovation.
Anecdotally, having lived through both the pre-70s and post-70s, I can say with a fair amount of confidence that innovation in the latter era has been a lot more impressive than in the former. And not just innovation, but the rate of innovation.
From post-WWII to the 70s, the biggest advancements were the establishment of personal transportation in the modern automobile and the spread of television as the dominant medium. Together they allowed people and goods to move freely across the country -- at least where the roads go -- and enabled near universal access to entertainment and news from the home. Not bad.
But since the 70s we've seen the rise of personal and mobile computing, the internet, genetic sequencing (and molecular-based medicine, in general), as well as my favorite and yours, high performance computing. So today, nearly any type of information accumulated by society can be accessed and manipulated from anywhere. To me, that's more impressive than a '56 Chevy and a 19-inch black and white.
It also should be pointed out that even useful innovation is often ignored. Obviously in that case, it can't get reflected in productivity. This may be especially true when the rate of innovation is so high that it's hard for people or businesses to know when to hop aboard.
Some sectors tend to adopt technology more quickly than others. For example, manufacturing and biotech have not embraced HPC with nearly the enthusiasm of, say, academia and government research. And on the more personal level, a technology like VoIP (which, as a Skype user, I can attest is a tremendous productivity booster) has yet to be picked up en masse. The reasons for resisting new technologies can be financial, educational or cultural, but they certainly play a big part in adoption.
Then there's the more general question of whether innovation can exist independently of an economy's productivity. Some observers have noticed that the flattening of the TFP slope after 1973 coincides with the US government's abandonment of Keynesian economic policy (run deficits when the private sector cuts back, otherwise run surpluses). The implication here is that productivity is more likely to correlate to government spending habits.
On that note, it might be worthwhile to look at what the government is spending its money on. Certainly we've seen funding for defense and entitlements -- two areas unlikely to contribute much to either innovation or productivity -- increase substantially in the past four decades. Meanwhile US investments in R&D as a percent of GDP dropped from 2.2 percent in 1964 to about 1 percent today. But that in itself is no guarantee, given that R&D spending was below 1 percent in the 1950s, when TFP was doing just dandy.
Then there's the observant economist who noticed that the TFP for durable goods actually increased during the past four decades, compared to the pre-70s pace. At the same time, the TFP for non-durable goods, which includes the service sector, actually flattened out (it was never very steep to begin with). Since the service sector has grown disproportionately to the durable goods sector, the overall slope of the TFP has flattened.
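The composition effect at work there is simple arithmetic: aggregate TFP growth is roughly a share-weighted average of sector growth rates, so shifting economic weight toward a slower-growing sector flattens the overall slope even if neither sector's own pace changes. A toy sketch, with all numbers invented for illustration:

```python
# Illustration of the composition effect: aggregate TFP growth as a
# share-weighted average of sector-level TFP growth rates.
# Growth rates and sector shares below are hypothetical.

def aggregate_tfp_growth(durable_share, durable_growth=0.03, service_growth=0.005):
    """Share-weighted average of sector TFP growth rates."""
    return durable_share * durable_growth + (1 - durable_share) * service_growth

goods_heavy = aggregate_tfp_growth(durable_share=0.40)    # goods-heavy economy
service_heavy = aggregate_tfp_growth(durable_share=0.20)  # service-heavy economy

print(f"goods-heavy aggregate TFP growth:   {goods_heavy:.2%}")
print(f"service-heavy aggregate TFP growth: {service_heavy:.2%}")
```

With these made-up numbers the aggregate growth rate falls from 1.50% to 1.00% even though both sectors kept their individual paces, which is exactly the flattening the observant economist describes.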
That's not to say we shouldn't do better in the innovation arena. But I do see the problem more as one of adoption than any perceived decline in innovation itself. Again, HPC users could be viewed as a microcosm of the problem. The technology has a good track record for improving productivity, with enough case studies to choke a modest-sized library. Innovation here comes in many forms -- accelerators (GPUs and FPGAs), architectures (clusters, SMP machines, and exotics), and software (MPI, OpenMP, CUDA, OpenCL, and so on). The array of choices is overwhelming to the HPC newbie. Here, as elsewhere, understanding the technology is going to be the key to productivity.
In any case, be wary of reports that claim innovation is in trouble. Economists have a propensity to forecast doom scenarios, which is why economics is often referred to as the dismal science. They also love to uncover correlations like this, since that is the lifeblood of their field. But understanding the interplay between technology, economics and society is a daunting task, filled with variables that, frankly, no one fully understands.
Posted by Michael Feldman - August 11, 2011 @ 6:26 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.