December 02, 2005
In a paper featured on the cover of the July 28, 2005 issue of Nature, an international group of researchers reported the first observation of geologically produced anti-neutrinos. The observation is giving scientists new insight into the interior of our planet.
While the "geo-neutrinos" were detected at the KamLAND facility in Japan, most of the data was stored on High Performance Storage System (HPSS) at the U.S. Department of Energy's National Energy Research Scientific Computing Center (NERSC) and analyzed using the PDSF cluster at NERSC. Together, these systems allowed scientists to find the scientific equivalent of a needle in a very large haystack.
KamLAND records data 24 hours a day, seven days a week. These data are shipped on tapes from the experimental site to LBNL, where they are read off the tapes and stored in HPSS at NERSC. KamLAND records about 200 GB of data each day, and HPSS currently holds more than 250 TB of KamLAND data, making KamLAND the second-largest user of NERSC's HPSS system.
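As a back-of-the-envelope check (not a calculation from the paper), the two figures quoted above imply how long a run of data taking the archive represents:

```python
# Rough check of the KamLAND archive figures quoted in the article.
daily_rate_gb = 200      # raw data recorded per day, in GB
archive_tb = 250         # total KamLAND data held in HPSS, in TB

archive_gb = archive_tb * 1000        # using decimal units, 1 TB = 1000 GB
days = archive_gb / daily_rate_gb     # days of running the archive represents
print(f"{days:.0f} days, about {days / 365:.1f} years of data taking")
# → 1250 days, about 3.4 years of data taking
```

That duration is consistent with an experiment that began taking data in 2002 and published in mid-2005.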
The KamLAND experiment, located in a mine in Japan, is a 1-kiloton liquid scintillator detector built to study anti-neutrinos coming from Japanese nuclear reactors, which are about 200 km from the detector. KamLAND was the first reactor experiment to observe the disappearance of electron anti-neutrinos between the reactors and the detector. Last year, the experiment also showed that the energy spectrum has a distortion characteristic of neutrino oscillation and measured the so-called mass splitting, a key parameter in neutrino oscillation.
During dedicated production periods at NERSC, the KamLAND data are read out of HPSS and run through the reconstruction software to convert the waveforms (essentially oscilloscope traces) of about 2,000 photo-multiplier tubes (PMTs) to physically meaningful quantities such as energy and position of the event inside the detector. This reduces the data volume by a factor of 60-100 and the reconstructed events are stored on disk for further analysis.
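The quoted 60-100x reduction factor gives a sense of the storage savings from reconstruction. A simple sketch of the implied daily volume of reconstructed events (the factors are the article's figures; the calculation is ours):

```python
raw_gb_per_day = 200  # raw data rate quoted above, in GB/day

# Reconstruction shrinks the data by a factor of 60-100, per the article.
reconstructed_gb_per_day = {f: raw_gb_per_day / f for f in (60, 100)}

for factor, gb in reconstructed_gb_per_day.items():
    print(f"reduction x{factor}: {gb:.1f} GB/day of reconstructed events")
```

So roughly 2-3 GB of physically meaningful event data per day remain on disk for analysis, versus 200 GB of raw waveforms.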
"The event reconstruction requires a lot of computing power, and with over 600 CPUs, PDSF is a great facility to run these kinds of analysis," said Patrick Decowski, an LBNL physicist who works with NERSC staff on the project. "PDSF has been essential for our measurements."
With the data on disk, specialized analysis programs run over the reconstructed events to extract the geo-neutrinos and perform the final analysis. PDSF is also used for various simulation tasks in order to better understand the background signals in the detector.
"The whole analysis is like looking for a needle in a haystack - out of more than 2 billion events, only 152 candidates were found," Decowski said. "And of these, 128 - plus or minus 13 - are background events."
Despite the poor signal-to-background ratio, these early measurements are exciting because they open up a completely new way to study the Earth's interior. Forty years ago, the late John Bahcall proposed studying neutrinos coming from the sun to understand the fusion processes inside it. The measurement of a persistent deficit in the observed neutrino flux relative to Bahcall's calculations led to the 2002 Nobel Prize for Ray Davis and to the discovery of neutrino oscillation.
Today, anti-neutrinos are being used to study the interior of the Earth, about which little is directly known. The deepest borehole ever drilled reaches less than 20 km, while the radius of the Earth is more than 6,000 km. While seismic data have been used to deduce the makeup of the Earth's three basic regions - the core, the mantle and the crust - there are no direct measurements of the chemical composition of the deeper regions.
An important measurement for understanding the Earth is that of the heat flux coming from within. These measurements show that the Earth produces somewhere between 30 and 45 TW of heat. Two important sources of heat generation are the primordial energy released by planetary accretion and the latent heat of core solidification. However, radiogenic heat (heat from radioactive decay) is also believed to play an important role in the Earth's heat balance, contributing perhaps half of the total.
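Taking the article's "perhaps half" at face value, the implied radiogenic contribution spans a wide range (our arithmetic, not a figure from the paper):

```python
total_low_tw, total_high_tw = 30, 45  # measured terrestrial heat flow range, TW
radiogenic_fraction = 0.5             # "perhaps half", per the article

radiogenic_low = total_low_tw * radiogenic_fraction
radiogenic_high = total_high_tw * radiogenic_fraction
print(f"implied radiogenic heat: {radiogenic_low} to {radiogenic_high} TW")
```

Pinning down where in that 15-22.5 TW range the true value lies is precisely what geo-neutrino measurements aim to do.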
Neutrinos can help in understanding the Earth's internal structure and heat generation. Three important elements in current Earth models - potassium, uranium and thorium - produce electron anti-neutrinos in their radioactive decay. These neutrinos (so-called geo-neutrinos) interact only very weakly with the surrounding Earth material, and almost all of them reach the surface of the Earth.
However, they do occasionally interact with ordinary matter, and a sufficiently large detector can record enough of these interactions to reveal the abundance of these isotopes. This allows scientists to study part of the Earth's composition and, most importantly, to estimate the amount of heat produced through radioactive decay. The research is a multinational effort: the Nature article represents the work of 87 authors from 14 institutions spread across four nations.
This is a reprint of an article originally published by Berkeley Lab Computing Sciences.