December 15, 2006
Low-cost MRI machines, super-fast Internet routers, and high-capacity power lines top the list of likely breakthroughs in the field of superconductivity in 2007, according to a 'Top-10' forecast list released today by Elie K. Track, Ph.D., senior partner, HYPRES Inc., a developer of superconducting microelectronics technology.
Dr. Track compiled the list of expected breakthroughs through industry research, conversations with scientific experts around the world, and his work at HYPRES. The list was developed to pull together information on the wide variety of superconductivity projects worldwide and to begin a dialog about the innovative advancements and breakthrough applications well positioned to occur next year.
"In my conversations with many respected colleagues, I continue to hear about new and exciting applications and breakthroughs that are likely to take place in 2007, largely because of the involvement of superconductor-based technologies," said Track. "I thought it would be useful to pull all these together into one list so we can truly realize and appreciate the profound impact that superconductivity will have on various industries, the scientific community, and the average person in the coming year."
Topping the list is an expected breakthrough announcement of laboratory demonstrations that can lead to an advanced, low-cost MRI machine that leverages superconducting technology. Ultimately, this will make it easier and cheaper to screen for many serious medical conditions, such as breast cancer and brain tumors. By using tiny magnetic fields, these advanced MRI machines will also work in a more open environment, easing concerns for claustrophobic patients.
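As background on the "tiny magnetic fields" claim (this calculation is context, not from the release): the frequency of the MRI signal scales linearly with field strength through the proton Larmor relation, which is why ultra-low-field systems operate in the kilohertz rather than megahertz regime and can use a more open geometry. A minimal sketch:

```python
# Proton Larmor relation: f = (gamma / 2*pi) * B,
# where gamma/2pi ~= 42.58 MHz/T is the proton gyromagnetic ratio.
GAMMA_OVER_2PI_MHZ_PER_T = 42.58

def larmor_frequency_hz(field_tesla):
    """Return the proton precession frequency (Hz) in a field of B tesla."""
    return GAMMA_OVER_2PI_MHZ_PER_T * 1e6 * field_tesla

# A conventional 1.5 T clinical scanner vs. a hypothetical 100-microtesla
# ultra-low-field system of the kind the release alludes to:
print(f"1.5 T:  {larmor_frequency_hz(1.5) / 1e6:.2f} MHz")
print(f"100 uT: {larmor_frequency_hz(100e-6) / 1e3:.2f} kHz")
```

Detecting such faint kilohertz-range signals is precisely where superconducting sensors become attractive, since they offer far greater sensitivity than conventional pickup coils at these frequencies.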
Other expected breakthroughs on the list include:
2). Ultra-high-speed Internet switches that will handle Internet traffic at a much higher level of density and complexity, leading to an information highway far faster than what we have today. The specific advancement would involve using superconducting technology to process optical signals in interconnecting circuits, leading to 100 Tbps routers.
3). High-capacity power lines that use cables made out of superconductors to efficiently carry electricity to areas that are without power infrastructure. These innovative cables carry 3-5 times more current than traditional power lines of the same size. Such a system was demonstrated in New York State in 2006, and Dr. Track expects further, more comprehensive demonstrations and implementations in 2007.
4). The demonstration of a wireless digital receiver, using superconducting electronics, outside of the laboratory. This breakthrough will ultimately lead to significantly improved wireless communication systems -- in speed, accuracy, and data capacity -- for military and commercial applications.
5). The Food and Drug Administration granting approval for use of superconducting sensors in advanced magnetic cardio-imaging machines that will be used to more effectively screen for coronary artery disease.
6). The proven design of a 10 teraflops workstation computer, to replace room-sized systems. This superconductor-charged system would have a number of applications, including greatly increasing the accuracy of weather forecasting.
7). Demonstration of a superconductor-based ship propulsion motor for the U.S. Navy, leading to dramatic savings in size, weight and power needs for future transportation systems.
8). Progress in the development of an analog quantum computer, which is expected to cut the time needed for complex mathematical computations from years to minutes.
9). The successful demonstration of the SCUBA-2 infrared camera on the James Clerk Maxwell Telescope in Hawaii, the most complex demonstration of superconducting electronics to date, which will provide an unprecedented view of the universe.
10). The adoption of an AC Josephson voltage standard device, leading to sharp improvements in the fundamental accuracy of electrical signal measurements. This would be an enormous breakthrough for the metrology community.
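For background on why a Josephson standard sharpens voltage metrology (the numbers below are illustrative context, not from the release): a junction driven by microwaves at frequency f produces exactly quantized voltage steps V = n·f/K_J, where K_J = 2e/h is the Josephson constant, so a voltage is tied directly to a frequency measurement, which can be made with extraordinary precision. A quick sketch:

```python
# Josephson relation: microwave drive at frequency f produces quantized
# voltage steps V_n = n * f / K_J, where K_J = 2e/h is the Josephson
# constant (~483597.8 GHz/V). Because frequency is the most precisely
# measurable quantity in physics, this ties the volt to a frequency.
K_J = 483597.8484e9  # Josephson constant in Hz per volt

def step_voltage(n, drive_hz):
    """Voltage of the n-th quantized step for a junction driven at drive_hz."""
    return n * drive_hz / K_J

# One junction driven at a typical 70 GHz microwave frequency yields
# roughly 145 microvolts per step; practical standards chain many
# junctions in series (the array size here is illustrative only).
v_single = step_voltage(1, 70e9)
v_array = v_single * 69120
print(f"single step: {v_single * 1e6:.1f} uV, example array: {v_array:.2f} V")
```

The practical consequence is that any laboratory with a calibrated frequency reference can realize the same voltage, which is what makes the standard "fundamental" rather than artifact-based.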
Honorable mentions: National Security Agency funding for a superconducting supercomputer, demonstration of Bell's inequalities (a fundamental advancement in quantum mechanics), and improved superconducting materials that allow superconductivity to take place at higher temperatures.