November 11, 2005
Scientists have long sought ways to map the Universe and explore its most violent phenomena, from mysterious gamma-ray bursts and supernovae to the black holes that inhabit active nuclei at the centers of galaxies. In their quest, some researchers are now focusing on subatomic particles called neutrinos, which show promise as valuable messengers. In contrast to other particles or light, which are absorbed, bent, or scattered in their travels, the tiny, almost massless neutrino can travel virtually unimpeded across the vast distances of space to reach the Earth. Aided by the TeraGrid network, cluster, and massive data resources at the San Diego Supercomputer Center (SDSC) at UC San Diego, physicists are developing a new kind of telescope, AMANDA-II, the Antarctic Muon and Neutrino Detector Array, to observe these neutrinos and decipher their tales about the location and inner workings of the cataclysmic events in which they originated.
Researcher Andrea Silvestri and Professor Steven Barwick of the Physics and Astronomy Department at the University of California, Irvine (UCI), along with many other scientists, are beginning to use the multipurpose AMANDA-II high-energy neutrino telescope at the South Pole to seek answers to a broad array of questions in physics and astrophysics. Observing neutrinos can shed light on fundamental problems such as the origins of cosmic rays and the search for dark matter and other exotic particles, and the telescope can also serve as a monitor for supernovae in the Milky Way.
However, the same qualities that let neutrinos travel freely across the universe also make them extremely difficult to detect. To further complicate matters, the vast majority of neutrinos that reach the Earth are produced nearby through cosmic ray collisions in the Earth's atmosphere, potentially masking the rarer, distant-origin neutrinos the scientists are seeking. How can particle astrophysicists Silvestri and Barwick filter out the mass of unwanted information from their data and tease out the tiny signal of high-energy neutrinos from far away?
Taming a Flood of Data
Scientists have steadily increased the size and effectiveness of the AMANDA telescope since it began collecting data in 1997. In AMANDA-II, the data acquisition electronics were upgraded with Transient Waveform Recorders that capture the complete waveform for each event detected. The researchers expect that several important goals will benefit by as much as a factor of 10 from the additional information gathered, including improvements in reconstructing muon cascades, the search for diffuse sources of ultra-high energy neutrinos, and the search for neutrino point sources.
However, with this progress in gathering data come new challenges, and the telescope now produces a flood of information, growing from one terabyte to 15 terabytes per year even in compressed form, about the same amount of information as in the entire printed collection of the Library of Congress, or the data on 3,500 DVDs.
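As a quick sanity check, the DVD comparison follows directly from the raw numbers. A back-of-envelope sketch in Python, assuming binary terabytes and single-layer 4.7 GB discs:

    # Back-of-envelope check of the data-volume comparison above,
    # assuming binary terabytes (TiB) and single-layer 4.7 GB DVDs.
    TOTAL_BYTES = 15 * 2**40   # 15 TiB of compressed AMANDA-II data per year
    DVD_BYTES = 4.7e9          # capacity of a single-layer DVD
    print(round(TOTAL_BYTES / DVD_BYTES))  # -> 3509, i.e. "about 3,500 DVDs"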
To analyze this immense data collection, Silvestri and Barwick turned to the large-scale data and computing capabilities of the NSF TeraGrid facility at SDSC. "The more capable the AMANDA telescope becomes, the more information we gather about neutrinos, which greatly helps our science," said Silvestri. "But this also means that to analyze all this data, we need the expertise and high-end resources of SDSC and the TeraGrid."
The first step was to transfer the 15 terabytes of raw AMANDA neutrino data over a high-speed network from UCI into a Storage Resource Broker (SRB) data archive at SDSC. Having developed the advanced SDSC SRB data management tool and installed more than one petabyte of online disk and more than six petabytes of archival storage capacity, SDSC is ideally suited to house and analyze massive data sets. And as the TeraGrid was designed to do, the high-speed network allowed the researchers to transparently access their massive data archive housed at SDSC for use on TeraGrid computational resources at various sites, speeding their research.
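The SRB client ships with Unix-style "Scommands" (Sinit, Sput, Sls, and others) for moving files into an SRB collection. A minimal, hypothetical sketch of scripting a bulk ingest with them might look like the following; the local paths and the collection name are placeholders, not the project's actual layout:

    # Hypothetical sketch of bulk data ingest into an SDSC SRB collection
    # using the SRB Scommand client tools. Paths and the collection name
    # are illustrative placeholders.
    import subprocess
    from pathlib import Path

    SRB_COLLECTION = "/home/amanda.sdsc/raw2003"  # hypothetical collection

    def ingest(local_dir: str) -> None:
        # Authenticate to the SRB server using the locally configured
        # SRB environment.
        subprocess.run(["Sinit"], check=True)
        for f in sorted(Path(local_dir).glob("*.dat")):
            # Sput copies a local file into the named SRB collection.
            subprocess.run(["Sput", str(f), SRB_COLLECTION], check=True)
        # List the collection contents to confirm the transfer.
        subprocess.run(["Sls", SRB_COLLECTION], check=True)

    if __name__ == "__main__":
        ingest("/data/amanda/2003")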
Finding a Neutrino in a Haystack
The 15 terabytes of full AMANDA-II waveform data collected over the course of 2003 contain some two billion experimental events, and the challenge the scientists faced was to identify the few neutrinos among a background of muon events millions of times more numerous. Scientists measure neutrinos by detecting muons, subatomic particles produced in the rare interaction of a neutrino with other matter.
In their analysis, the researchers processed and filtered the experimental data and reconstructed each individual event, running sophisticated likelihood-based statistical algorithms on the TeraGrid through numerous iterations over the full 15 terabytes of experimental data. This process was highly data- and compute-intensive, and only with access to SDSC's massive online disk and tape storage and some 70,000 CPU hours on the TeraGrid supercomputer were the researchers able to carry out their data analysis. Typical jobs ran on 512 processors using 1 to 2 gigabytes of memory.
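While the production reconstruction software is far more elaborate, a toy Python sketch can illustrate the shape of the final selection: each reconstructed event carries a best-fit zenith angle and a likelihood-based quality parameter, and only well-fit, upward-going tracks survive as neutrino candidates. The field names and thresholds here are illustrative, not those of the actual AMANDA analysis:

    # Toy sketch of likelihood-based event selection: keep only tracks
    # that are reconstructed as upward-going (so the muon must have come
    # from a neutrino that crossed the Earth) and that have a good track
    # fit. Thresholds are illustrative only.
    import math
    from dataclasses import dataclass

    @dataclass
    class Event:
        zenith_deg: float  # reconstructed track zenith (180 = straight up through the Earth)
        rlogl: float       # reduced log-likelihood of the track fit (smaller = better)

    def is_neutrino_candidate(ev: Event, rlogl_cut: float = 8.0) -> bool:
        upgoing = math.cos(math.radians(ev.zenith_deg)) < 0.0  # below the horizon
        well_fit = ev.rlogl < rlogl_cut
        return upgoing and well_fit

    events = [Event(165.0, 7.2), Event(40.0, 6.9), Event(120.0, 11.5)]
    candidates = [ev for ev in events if is_neutrino_candidate(ev)]
    print(f"{len(candidates)} candidate(s) out of {len(events)} events")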
Finally, the researchers succeeded in distinguishing the faint signal of 1,112 atmospheric neutrinos from the billions of extraneous events. When they compared their results to standard analyses, they found good agreement, confirming that the AMANDA instrument and the SDSC-aided data analysis can reproduce the physics results of previous methods. In particular, the angular distribution of the atmospheric neutrino sample extracted from the standard data set agreed well with the new AMANDA data, with all 1,112 neutrinos originating in the northern hemisphere and distributed across the sky in a fairly uniform way, as expected.
In validating the AMANDA instrument and analysis, the researchers also investigated whether the observed neutrinos were generated from collisions with the Earth's atmosphere, which produces a uniform spatial distribution of neutrino events across the sky, or whether some of the neutrinos were created by a source of extraterrestrial origin, which would be expected to produce a more concentrated event cluster in the direction of the source. Their statistical analysis of the data showed that the observed regions of the sky were compatible with atmospheric neutrino events, without significant event clusters that might indicate an extraterrestrial source.
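The statistical machinery behind such a cluster search can be illustrated with a simple Poisson counting test: compare the number of events seen in a small patch of sky with the number expected if the sample were spread uniformly across the northern sky. The bin size and counts below are purely illustrative, and a real search must also correct for the many sky bins examined (the trials factor):

    # Hedged sketch of a simple clustering test: is the event count in a
    # small sky bin consistent with an isotropic atmospheric background?
    from scipy.stats import poisson

    def cluster_p_value(n_observed, n_total, bin_sr, sky_sr):
        expected = n_total * bin_sr / sky_sr  # uniform-background expectation
        # Probability of seeing >= n_observed events by chance.
        return poisson.sf(n_observed - 1, expected)

    # 1,112 events over the northern sky (~2*pi steradians); suppose an
    # illustrative 0.01-sr search bin contains 9 events.
    p = cluster_p_value(9, 1112, 0.01, 2 * 3.141592653589793)
    print(f"p-value = {p:.3g}")  # a very small p would hint at a point source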
The scientists explained that developing and validating an entirely new kind of telescope such as AMANDA has been an enormous undertaking, requiring many years and the efforts of diverse groups and specialties working together. "This is a major result for the AMANDA-II neutrino telescope and broader research community," said Silvestri. "It's the first validation that we can in fact perform valid neutrino analysis with the new generation of instrument, its much larger data stream, and all the steps of our analysis, and we couldn't have done it without SDSC and the TeraGrid data and compute resources."
Moreover, since their initial analysis used only part of the complete information contained in the AMANDA-II waveform data, the researchers are now developing new software tools to exploit the full information available. The scientists expect the additional information to improve their ability to resolve even smaller differences in energy and angle. This will be crucial in their continuing search for the hard-to-detect energetic extraterrestrial neutrinos that may hold the answers to many fundamental questions about the Universe.
A New Kind of Telescope
The fulfillment of a 40-year dream, AMANDA was designed to overcome the obstacles to detecting elusive neutrinos, and it promises to give scientists a startlingly broader view of the Universe through the window of these high-energy particles. AMANDA is an ingenious new kind of "telescope" that senses neutrinos rather than the light from above that every telescope has gathered since the time of Galileo. And unlike normal telescopes, which always face upward, AMANDA can also look downward, using the bulk of the Earth to filter out the extraneous downward-moving atmospheric muons, which are about a million times more abundant, and in this way detect high-energy neutrinos in the intermediate energy range from distant parts of the Universe. Only such neutrinos are able to pass through the whole Earth after entering in the Northern Hemisphere to reach the AMANDA telescope at the South Pole.
Occasionally, one of these upward-moving high-energy neutrinos will interact with an oxygen atom in the ice near the AMANDA array, producing a muon that emits light as it travels. This light can travel long distances through the clear ice at the South Pole, which is free of competing background light, until it is picked up by the sensitive AMANDA photodetectors, which gather this indirect evidence of the passage of a neutrino. The telescope can also search for even more energetic neutrinos by looking for downward-moving neutrinos of ultra-high energies.
The AMANDA neutrino telescope continues to grow in power, and currently consists of some 700 photon detectors arranged like beads on vertical strings, lowered into 19 holes in the ice at the South Pole. The holes are distributed across a circular area, creating a cylindrical volume of ice that serves as the detector, some 120 meters in diameter and 500 meters tall, with its top about 1,500 meters below the ice cap's surface. Each photon detector module consists of a photomultiplier tube housed in a tough, pressure-resistant hollow sphere, with electrical and optical connectors attached. After a hole is bored in the ice with heated water, the string of detector modules is lowered into the water-filled hole, which then freezes solid, locking the detectors permanently in place.
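A rough scale check of that geometry, computed as the volume of a cylinder of ice 120 meters across and 500 meters tall:

    # Rough scale check of the instrumented detector volume described above.
    import math

    radius_m, height_m = 60.0, 500.0
    volume_m3 = math.pi * radius_m**2 * height_m
    print(f"{volume_m3:.3g} m^3 = {volume_m3 / 1e9:.4f} km^3")  # ~5.65e6 m^3

At roughly 0.006 cubic kilometers, that volume also puts the one-cubic-kilometer IceCube array described below into perspective.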
In the future, the research will be scaled up even further in the NSF IceCube project, a much larger one-cubic-kilometer telescope array that will produce 20 times as much data: one terabyte per day, or some 300 terabytes annually. Silvestri points out that "this will drive the need for even larger data, computational, and network resources at SDSC to better answer questions about the most energetic events in the history of the Universe."
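A quick consistency check of those figures against AMANDA-II's current output:

    # The quoted IceCube data volume versus AMANDA-II's ~15 TB per year.
    ICECUBE_TB_PER_YEAR = 300  # "one terabyte per day, or some 300 terabytes annually"
    AMANDA_TB_PER_YEAR = 15
    print(ICECUBE_TB_PER_YEAR / AMANDA_TB_PER_YEAR)  # -> 20.0, the quoted "20 times"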
A. Silvestri et al., "Performance of AMANDA-II Using Data from Transient Waveform Recorders," Proceedings of the 29th International Cosmic Ray Conference, Pune, India, August 3-10, 2005.
A. Silvestri et al., "The AMANDA Neutrino Telescope," Proceedings of the International School of Cosmic Ray Astrophysics, Erice, Italy, July 2-13, 2004.
Antarctic Muon and Neutrino Detector Array (AMANDA) - http://amanda.uci.edu/
Andrea Silvestri - http://www.ps.uci.edu/~silvestri
Steven Barwick - http://www.ps.uci.edu/physics/barwick.html