December 19, 2011
Federal and state governments investing 1.1 million Euros
BIELEFELD, Germany, Dec. 19 -- Bielefeld University's Faculty of Physics is getting a new high-performance computer. Researchers will be using it to study strongly interacting matter in order to learn about the properties of matter as it existed in the early universe immediately after the Big Bang. The supercomputer, costing 1.1 million Euros, is being financed with federal and state government funds. On Wednesday 25 January from 4 p.m. onwards, it will be presented at an inauguration colloquium in the university lecture hall H2.
Bielefeld's physicists will be using this new high-performance computer to calculate the properties of so-called 'quarks' and 'gluons'. Quarks are considered to be the elementary constituents of all known matter. They interact through the exchange of force particles, the gluons. The physicists particularly want to find out what happens when quarks are exposed to very high temperatures or extreme densities. With their previous computer, apeNEXT, they have already determined very precisely that the behaviour of quarks changes dramatically at a temperature of 1.78 trillion degrees. Although this temperature is approximately 100,000 times higher than that at the core of our sun, it is not unnaturally high. In its early phase shortly after the Big Bang, the universe was even hotter. This was the time when the foundations were laid for the further development of the cosmos, and this is why the properties of the 'quark soup', the quark–gluon plasma, are so important for understanding the current state of the universe.
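That temperature can be cross-checked with a short unit conversion. This is only a sketch: the transition energy scale of roughly 155 MeV and the solar core temperature of roughly 15 million Kelvin are standard values from the lattice-QCD literature and astrophysics textbooks, not figures stated in the release.

```python
# Convert an assumed QCD transition energy scale to a temperature in Kelvin.
# Assumptions (not from the release): transition scale ~155 MeV,
# solar core temperature ~15 million K.
K_B_MEV_PER_K = 8.617e-11  # Boltzmann constant in MeV per Kelvin

transition_mev = 155.0                       # assumed lattice-QCD transition scale
t_kelvin = transition_mev / K_B_MEV_PER_K    # ~1.8e12 K, i.e. trillions of degrees

sun_core_k = 1.5e7                           # assumed solar core temperature, ~15 million K
ratio = t_kelvin / sun_core_k                # ~1.2e5, i.e. roughly 100,000 times hotter

print(f"transition temperature ~ {t_kelvin:.2e} K, "
      f"~{ratio:.1e} times the solar core")
```

The order of magnitude, trillions of degrees and roughly 100,000 times the solar core, matches the comparison drawn in the article.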
To study the 'beginning of the universe' experimentally, researchers are using particle accelerators to create hot, dense matter like that which prevailed in the early universe. They can do this for a short time in a small volume with the Large Hadron Collider at the European Organisation for Nuclear Research (CERN) and the Relativistic Heavy Ion Collider in Brookhaven, New York. In close cooperation with researchers at these two locations, the new Bielefeld computer will be used to study the quark–gluon plasma in detail through computer simulations.
Two companies, sysGen GmbH and NVIDIA, are working together with the University to install the high-performance computer. sysGen GmbH is a supplier of computer technology; NVIDIA is one of the world's leading manufacturers of graphics processors (GPUs). These graphics processors, which are also used in PCs and games consoles, are being connected with a network of computer processors to form a GPU cluster. A total of 400 GPUs are being installed, allowing the cluster to reach a cumulative peak performance of about 500 Teraflops, equivalent to about 10,000 normal PCs. One particular feature of the new computer is its comparatively low power consumption: it consumes about 50 times less power than a system of equivalent computing capability built from PCs.
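The quoted figures imply simple per-device numbers. The following back-of-the-envelope sketch uses only the release's own values (400 GPUs, about 500 Teraflops cumulative peak, equivalence to about 10,000 PCs); the per-GPU and per-PC figures are derived by plain division.

```python
# Back-of-the-envelope breakdown of the cluster figures from the release.
TOTAL_PEAK_FLOPS = 500e12   # ~500 Teraflops cumulative peak performance
NUM_GPUS = 400              # total GPUs installed
PC_EQUIVALENT = 10_000      # "about 10,000 normal PCs"

per_gpu = TOTAL_PEAK_FLOPS / NUM_GPUS       # ~1.25 Teraflops per GPU
per_pc = TOTAL_PEAK_FLOPS / PC_EQUIVALENT   # implies ~50 Gigaflops per PC

print(f"per GPU: {per_gpu / 1e12:.2f} Teraflops, "
      f"implied per PC: {per_pc / 1e9:.0f} Gigaflops")
```

About 1.25 Teraflops per GPU is consistent with the peak throughput of a single high-end GPU of that era, and roughly 50 Gigaflops is a plausible figure for a contemporary desktop PC.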
Edwin Laermann, Professor of Theoretical Physics at Bielefeld University, expects great opportunities from the new supercomputer: 'We are excited about the new possibilities the GPU cluster will bring to research on strongly interacting hot and dense matter at Bielefeld University'. Laermann is a member of the 'lattice gauge theory' research group that will be working with the new supercomputer. Dr. Olaf Kaczmarek reports that this high-performance computer builds on more than 15 years of experience acquired in Bielefeld in the use of special computers for quantum chromodynamics (QCD), the theory of strong interactions between quarks and gluons. 'We are pleased with the successful cooperation with the QCD support team at NVIDIA, who are providing technological support for the new high-performance computer, and with the American research colleagues in the USQCD consortium who are using similar hardware architectures for their research on strong interaction physics', says Frithjof Karsch, Professor at Bielefeld University and Brookhaven National Laboratory in the USA.
Research on strongly interacting matter is part of the Theoretical Sciences research profile at Bielefeld University, a cooperation between mathematics, theoretical physics, and mathematical economics.
For further information on the Internet, go to www2.physik.uni-bielefeld.de/lattice.html.
Source: Bielefeld University