December 08, 2006
Imagine taking the entire collection of historical documents at the Smithsonian National Air and Space Museum and storing it on a single DVD. University of Central Florida Chemistry Professor Kevin D. Belfield and his team have cracked a puzzle that stumped scientists for more than a dozen years. They have developed a new technology that will allow users to record and store massive amounts of data -- the museum's entire collection or as many as 500 movies, for example -- onto a single disc or, perhaps, a small cube.
Belfield's Two-Photon 3-D Optical Data Storage system makes this possible.
"For a while, the community has been able to record data in photochromic materials in several layers," Belfield said. "The problem was that no one could figure out how to read out the data without destroying it. But we cracked it."
Think of it this way. Television viewers can record a show on a VHS tape and reuse the tape several times. But each time the same segment of the tape is recorded over, the quality diminishes as the tape wears out. Eventually, the data is lost. The same is true of recordable DVDs.
Belfield's team figured out a way to use lasers to compact large amounts of information onto a DVD while maintaining excellent quality. The information is stored permanently without the possibility of damage.
The process involves shooting two different wavelengths of light onto the recording surface. Using two lasers creates a very specific image, sharper than what current techniques can render. Depending on the color (wavelength) of the light, information is written into the disc. Because the information is so highly compacted, the disc ends up no thicker than a typical DVD.
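Why two beams permit writing deep inside a multilayer disc is standard two-photon physics; the relation below is general background, not a detail given in the article. The absorption that triggers recording requires both photons to arrive at once, so its rate scales with the product of the two beam intensities:

\[ W \propto I_1(\mathbf{r}) \, I_2(\mathbf{r}) \]

Away from the point where the two focused beams cross, that product falls off far faster than either intensity alone, so only the targeted spot in the targeted layer is written, and the layers above and below are left untouched.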
The challenge scientists faced for years was that light is also used to read the information, and the act of reading would destroy the recorded data. Belfield's team developed a way to tune the reading light to specific colors, or wavelengths, so that information a user wants to keep stays intact.
The UCF team's work was published in Advanced Materials (2006, vol. 18, pp. 2910-2914, http://dx.doi.org/10.1002/adma.200600826) and recently highlighted in Nature Photonics (www.nature.com/nphoton/reshigh/2006/1106/full/nphoton.2006.47.html). A patent is pending.
Once the technology is fine-tuned, it could be used to store historical documents or create complicated databases that could give decision-makers quick access to critical information, Belfield said.
The Blu-ray Disc Association and industry leaders in computing and media recently introduced Blu-ray Disc technology, which stores 25 gigabytes (GB) on a single layer of a disc and 50 GB on two layers. It has been called the next-generation optical disc format, and it offers high-definition quality.
Belfield's technique stores data on many layers, for a capacity of at least 1,000 GB, with high-definition quality.
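The article's capacity figures line up on the back of an envelope. Here is a minimal sketch in Python; the 2 GB-per-movie figure is an illustrative assumption, while the 25 GB-per-layer and 1,000 GB figures come from the article:

PER_LAYER_GB = 25   # capacity of a single Blu-ray layer, per the article
TARGET_GB = 1000    # capacity claimed for the multilayer technique
MOVIE_GB = 2        # assumed size of one movie (illustrative, not from the article)

layers_needed = TARGET_GB / PER_LAYER_GB
movies_stored = TARGET_GB / MOVIE_GB
print(f"{layers_needed:.0f} layers at Blu-ray density reach {TARGET_GB} GB")
print(f"{TARGET_GB} GB holds about {movies_stored:.0f} movies at {MOVIE_GB} GB each")

Forty layers at Blu-ray density reach the 1,000 GB mark, which at 2 GB per movie matches the article's figure of roughly 500 movies on a single disc.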
The UCF team has received a $270,000, three-year grant from the National Science Foundation to continue its work. The team will focus on making the technique even more efficient, partly by reducing the required laser power.
The team's work with lasers and light has other practical applications. Belfield and his colleagues in the Department of Chemistry are exploring the use of light to detect and treat certain types of cancer.
Belfield's research team is creating chemical agents that, after being injected into patients, will travel within the bloodstream to find and bind with cancer cells. Using light, doctors would then be able to see if and where a patient has cancer cells. Another agent could be injected that would then destroy the cancer cells when activated by light, without damaging other healthy cells.
Source: University of Central Florida