HPC Matters is a joint blog in which contributors from the Tabor Communications team share their observations and insights into high-performance computing.
August 07, 2008
Whenever I am asked what I do for a living, I respond with "I get to hang out and talk with the smartest people in the world." It is, by the way, the coolest aspect of what I do, and while it sounds corny, every day I get to dig into new ideas. A recent conversation with a colleague, ostensibly about the future of high-performance technology and breakthrough applications, somehow morphed into a discussion of scarcity. More precisely, we began to talk about the scarcity of natural resources and how it will ultimately transform behavior, technology, politics and the global economy on a massive scale. As I thought more deeply about our discussion, it became clear to me that over the next ten years the world would experience shifts on a scale we have yet to even contemplate.
Now this may seem obvious to most of you on a number of levels. Certainly we are all, in our own little ways, making changes to our daily lives. For me, I traded in my beast of an SUV (I grew increasingly frightened people would start throwing blood on me); I drive less; I put energy-efficient lightbulbs in my home; I recycle everything -- and the list goes on. And if anyone missed world history, a lot of wars have been fought over geography and access to, or scarcity of, natural resources. But what I am referring to here is much more fundamental. We are simply running out of things. And some of the resources we do have have had the unfortunate consequence of poisoning our planet. There is only so much water, oil, mineral wealth, food, timber and ecosystem diversity to go around. An interesting data point comes from my husband, who happens to be an environmental engineer specializing in water. According to him, desalination is one of the fastest-growing practices at his firm. The future of our society will grow increasingly centered on controlling and managing natural resources -- how they are acquired, produced and used.
In the U.S. this reality poses some interesting political and social challenges. We do not have an industrial policy, and the notion of controlling and/or metering natural resources on a grand scale runs counter to our cultural and political DNA. Outside the U.S., the world looks very different, and we are seeing initiatives take shape that aim to manage these new realities.
So what does all of this mean to our community? Well, a lot. It would be banal to state the obvious -- that new data sources, applications and industries will emerge -- many of which will require HPC-scale resources, know-how and technologies. That much we can envision. Necessity will drive much of this change. What is less obvious is the rate of change and the "butterfly effect." Intellectually, we can see that everything is interconnected, but the consequence of those interrelationships is unknown. Those of us who have spent a career modeling adoption trends will have to throw old assumptions out the window. There is no precedent on which to base the future. Technology focused on advanced analytics and predictive modeling at a "systems" level will increasingly become a critical component to our decision-making process.
Over time, clean technologies will permeate everything, from the mundane, such as washing clothes and dishes, to the more complex and esoteric, such as the availability of and demand for ultra-light electric automobiles. The requirement to reduce our carbon footprint will give rise to widespread adoption of novel techniques, such as congestion pricing, to reduce the number of cars on the road and shift our behavior toward mass transit systems. It won't be enough to have an energy-efficient sticker on your refrigerator, server or car -- every component will have to be compliant. Cloud, SaaS and other internet-delivered services will take center stage -- again, adoption out of necessity.
Clearly, one blog doesn't change much on its own, but every one of us has a responsibility to do our own small part -- from the little changes in our personal lives (which may seem inconsequential, but on a mass scale become enormously significant) to opening the dialogue and raising awareness about how critical our community is to the future of our planet.
Posted by Debra Goldfarb - August 06, 2008 @ 9:00 PM, Pacific Daylight Time