Contributor Miha Ahronovitz traces the history of high throughput computing (HTC), noting the particularly enthusiastic response from the high energy physics world and the role of HTC in major discoveries such as that of the Higgs boson. As one of the biggest generators of data, this community was grappling with the "big data" deluge long before "big data" assumed its position as the buzzword du jour.
French-American duo wins Nobel Prize in Physics for developing methods to measure and manipulate individual quantum systems, demonstrating bizarre quantum behavior.
As participants from around the world make their way to Prague for the EGI Technical Forum, grid-enabled tools continue to facilitate global collaboration. Grid computing provides the backbone for research ranging from basic science to once-in-a-lifetime breakthroughs, such as the recent results on the elusive Higgs boson particle.
CERN’s Worldwide LHC Computing Grid is the superhighway for particle physics data.
Organization preps for 100GbE core network.
The famed Belle physics experiment, which examines why the universe is composed of matter rather than antimatter, saw its next incarnation jeopardized by server investment and maintenance costs, so the research team turned to Amazon EC2 for a solution.
Large Hadron Collider gives computing infrastructure a workout.
Online, at conferences, and in theory, manycore processors and accelerators such as GPUs and FPGAs are being viewed as the next big revolution in high performance computing. If they live up to that potential, these accelerators could someday transform how computational science is performed, delivering far more computing power with greater energy efficiency.