December 03, 2009
Dec. 2 -- Can economics better predict how banks will react to future credit crunches and their impact on the wider economy? Breakthrough simulation software by European researchers could hold the answers to this question and more.
How will economic policies adapt in 2020 when a quarter of the EU population is over 65 and natural resources are dwindling? Can economists better predict future banking crises or economic turmoil? This week, the European Commission unveiled breakthrough research that could help answer questions like these by using economic simulation software.
Produced by the EU-funded EURACE research project, which came to a successful end in November, the software applies simulation technology also used for computer-generated images (CGI) in movies. The EURACE software platform runs on simulation technology called FLAME (Flexible Large-scale Agent Modelling Environment).
The simulation software models the interactions within large populations of different economic actors, such as households and companies, banks and borrowers, or employers and job-seekers, who trade and compete like real people.
Because each simulated agent is given individual, realistic behaviour, and the interactions among agents show how markets evolve, these massive-scale simulations can better test new policies aimed at future societal challenges.
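The mechanics are easiest to see in miniature. The toy model below is a Python sketch of the agent-based idea only; the agent classes, behavioural rules and numbers are illustrative assumptions, not the EURACE or FLAME implementation. It wires a population of household agents to a firm and lets wages and spending feed back on each other:

    import random

    # Illustrative household agent: saves a fraction of income, spends the rest.
    class Household:
        def __init__(self, wage, savings_rate):
            self.wage = wage
            self.savings = 0.0
            self.savings_rate = savings_rate

        def step(self):
            self.savings += self.wage * self.savings_rate
            return self.wage * (1 - self.savings_rate)  # consumption spending

    # Illustrative firm agent: adjusts wages with demand for its goods.
    class Firm:
        def __init__(self, wage=100.0):
            self.wage = wage

        def step(self, demand):
            # Arbitrary toy rule: raise wages when demand is high, cut when low.
            self.wage *= 1.01 if demand > 100.0 else 0.99

    # Wire a small population together and iterate.
    households = [Household(wage=100.0, savings_rate=random.uniform(0.05, 0.3))
                  for _ in range(1000)]
    firm = Firm()
    for t in range(50):
        demand = sum(h.step() for h in households) / len(households)
        firm.step(demand)
        for h in households:
            h.wage = firm.wage  # everyone employed at one firm, for simplicity
    print(f"final wage: {firm.wage:.2f}")

The point of the real platform is the same feedback loop, but with hundreds of thousands of heterogeneous agents across many markets at once.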
"This first class European research can help us make the move from the economics of pen and paper to the economics of super-computers," said Viviane Reding, EU Commissioner for Information Society and Media.
"The results of this research project will complement traditional economic statistics and assumptions about how economic actors react by enabling better testing of a policy's effects on people while still on the drawing board. I expect government researchers and national research institutes will act quickly to put this tool at the disposal of decision-makers as soon as possible," noted the Commissioner.
This simulation technology uses computer-based experiments to study the relationships between large populations of different economic actors across many interconnected markets. It is the first time this sort of technology has been applied on such a large scale using high-performance computing.
Each simulated household (or business, or bank) will make different decisions in reaction to various monetary, fiscal or pro-innovation policies: for example, whether to remain in a job or seek a new one, or how much of a wage is saved, spent or invested. This means that the impact of one policy in one market at one point in time is no longer assessed in isolation from other factors.
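As a rough illustration of how a policy parameter enters each agent's decision, the Python sketch below runs the same simulated population under two income-tax rates. The decision rule, thresholds and figures are assumptions invented for the example, not the project's calibrated behaviours:

    import random

    def household_decision(wage, offered_wage, income_tax, confidence):
        """Toy decision rule: split net income and decide on job search.

        All parameters and thresholds here are assumptions for the sketch,
        not EURACE's actual behavioural rules.
        """
        net_income = wage * (1 - income_tax)
        saved = net_income * (0.4 - 0.2 * confidence)  # save less when confident
        spent = net_income - saved
        seek_new_job = offered_wage > wage * 1.1       # switch for a 10% raise
        return spent, saved, seek_new_job

    # Compare the same population under two tax policies.
    random.seed(1)
    population = [random.gauss(100, 15) for _ in range(10000)]
    for tax in (0.20, 0.30):
        spending = sum(household_decision(w, w * random.uniform(0.9, 1.3),
                                          tax, confidence=0.5)[0]
                       for w in population)
        print(f"tax={tax:.0%}: aggregate spending={spending:,.0f}")

Because every household applies its own rule, the aggregate response to a tax change emerges from the population rather than being assumed up front.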
Predicting the unpredictable?
Traditional economics failed to predict the scale of the knock-on effect of the credit crunch on the world economy. The new software shows how banks react in different ways by looking at a wide range of factors, such as how much they must hold in reserves compared to investments, their customers' consumption, saving and investment patterns, and psychological factors like confidence in the market. It can then give policy-makers -- who want to know how fiscal and monetary reforms will affect banks and customers -- a better warning of the scale of a financial crisis's impact on the real economy. The software can also simulate the same scenario with an older demographic, to help plan for an ageing Europe, or with limited energy supplies.
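A stylised version of that bank behaviour can be sketched in a few lines of Python. The reserve ratios, confidence variable and shock size below are assumptions for illustration, not the project's calibrated model, but they show how a confidence shock translates into withdrawn credit, and how reserve requirements change the size of the swing:

    # Illustrative bank agent: lends what its reserves and confidence allow.
    # The reserve ratios, confidence variable and shock size are assumptions
    # for this sketch, not the project's calibrated model.
    class Bank:
        def __init__(self, deposits, reserve_ratio, confidence):
            self.deposits = deposits
            self.reserve_ratio = reserve_ratio  # fraction held back as reserves
            self.confidence = confidence        # 0 (panic) .. 1 (calm)

        def lending_capacity(self):
            loanable = self.deposits * (1 - self.reserve_ratio)
            return loanable * self.confidence   # lend less when confidence drops

    banks = [Bank(deposits=1e9, reserve_ratio=r, confidence=0.9)
             for r in (0.05, 0.10, 0.20)]
    before = [b.lending_capacity() for b in banks]

    for b in banks:                             # a market-wide confidence shock
        b.confidence = max(0.0, b.confidence - 0.3)

    for b, prior in zip(banks, before):
        lost = prior - b.lending_capacity()
        print(f"reserve ratio {b.reserve_ratio:.0%}: credit withdrawn "
              f"EUR {lost / 1e6:,.0f}M")

In a full simulation the withdrawn credit would in turn hit simulated firms and households, which is exactly the knock-on effect the project set out to capture.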
Designed to run on supercomputers, which allow simulation on a massive scale, yet accessible from any connected desktop PC, the software can be used by economists and policy-makers with no knowledge of computer programming. By connecting hundreds of thousands of small simulated actions and reactions across the economy, the software can give policy-makers a better and bigger picture of a policy's impact on people's lives and work.
Igniting the flame
The three-year project was carried out by economists and computer scientists from eight universities (in Italy, France, Germany, Turkey and the UK), brought together by the EU and financed from the European Commission's technology research budget.
The €2.5 million project, which started in 2006, was co-funded with €2.1 million under the Commission's Sixth Framework Programme for research. It was part of the European Commission's initiative to boost high-risk research in future and emerging information technologies.
The Commission recently called on EU countries to increase investment in high-risk research to catch up with the US, China and Japan. The Commission will lead by example, boosting its current €100 million in annual funding by 70 percent by 2013.
Source: ICT Results -- http://cordis.europa.eu/ictresults