September 01, 2009
Sept. 1 -- Computers that use light to process large amounts of data faster than ever before are among the groundbreaking potential applications of a new £6 million research programme at Queen's University Belfast and Imperial College London, launched today.
The Engineering and Physical Sciences Research Council (EPSRC) is funding the two universities to establish a world-leading research programme on the fundamental science of so-called "nanoplasmonic devices."
The key components of nanoplasmonic devices are nanoscale metal structures -- more than 100 times smaller than the width of a human hair -- that guide and direct light.
The structures have been tailor-made to interact with light in an unusual and highly controlled way. This means they could one day be used to build new kinds of super-high-speed "optical computers" -- so named because they would process information using light signals, instead of the electric currents used by today's computers.
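For scale: a human hair is roughly 100 micrometres across, so "more than 100 times smaller" puts these structures below about one micrometre, smaller than the wavelength of visible light itself. The article stays qualitative, but the standard physics behind such sub-wavelength guiding is the textbook dispersion relation for a surface plasmon polariton bound to a metal/dielectric interface (a general result, not a formula specific to this project):

k_{\mathrm{SPP}} = k_0 \sqrt{\frac{\varepsilon_m \varepsilon_d}{\varepsilon_m + \varepsilon_d}}, \qquad k_0 = \frac{2\pi}{\lambda_0}

Because the metal's permittivity \varepsilon_m has a large negative real part at optical frequencies, k_{\mathrm{SPP}} exceeds k_0: the mode clings to the metal surface and can be squeezed below the free-space diffraction limit, which is what lets metal nanostructures steer light on scales far smaller than its wavelength.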
At present, the speed with which computers process information is limited by the time it takes for the information to be transferred between electronic components. Currently this information is transferred using nanoscale metallic wires that transmit the signals as an electric current.
To speed up the process, the scientists at Queen's and Imperial hope to develop a way of sending the signals along the same wires in the form of light.
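A back-of-envelope figure (an illustration, not a number from the project): a guided optical signal in such a structure travels at a sizeable fraction of the speed of light, so crossing a 10 mm chip at roughly c/2 takes

t \approx \frac{10^{-2}\ \mathrm{m}}{1.5 \times 10^{8}\ \mathrm{m/s}} \approx 67\ \mathrm{ps}

whereas resistance-capacitance-limited electrical wires get slower, not faster, as they are scaled down, because thinner wires have higher resistance.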
To achieve this, they are developing a raft of new metallic devices: tiny nanoscale light sources, nanoscale "waveguides" to guide light along a desired route, and nanoscale detectors to pick up the light signals.
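To make the waveguide piece concrete, the short Python sketch below estimates two figures of merit for a plasmonic waveguide -- the plasmon wavelength and the 1/e propagation length -- at a single gold/air interface, using the textbook dispersion relation above. The gold permittivity at a 1.55 micrometre telecom wavelength is an illustrative approximation, not a number from the Queen's/Imperial programme.

# Rough figures of merit for a surface-plasmon waveguide at a gold/air
# interface. Textbook formulas; the gold permittivity at 1550 nm is an
# illustrative approximation, not data from the Queen's/Imperial project.
import cmath
import math

wavelength = 1.55e-6           # free-space wavelength in metres (telecom band)
eps_metal = -115 + 11j         # approx. relative permittivity of gold at 1550 nm
eps_dielec = 1.0               # air

k0 = 2 * math.pi / wavelength  # free-space wavenumber

# SPP wavevector at a single metal/dielectric interface:
# k_spp = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d))
k_spp = k0 * cmath.sqrt(eps_metal * eps_dielec / (eps_metal + eps_dielec))

lambda_spp = 2 * math.pi / k_spp.real  # plasmon wavelength (m)
prop_len = 1 / (2 * k_spp.imag)        # 1/e intensity propagation length (m)

print(f"SPP wavelength:     {lambda_spp * 1e6:.3f} um")
print(f"Propagation length: {prop_len * 1e6:.0f} um")

The trade-off this exposes -- tighter confinement in the metal means shorter propagation before ohmic losses absorb the signal -- is exactly why purpose-built sources, waveguides and detectors have to be co-designed, as described above.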
Similar approaches may also help in the development of devices for faster Internet services.
Professor Anatoly Zayats, from Queen's University's Centre for Nanostructured Media, who leads the project, said: "This is basic research into how light interacts with matter on the nanoscale. But we will work together with and listen to our industrial partners to direct research in the direction that hopefully will lead to new improved products and services that everyone can buy from the shelf."
Professor Stefan Maier, who leads the research team at Imperial, added: "This is an exciting step towards developing computers that use light waves, not electrical current, to handle data and process information. In the future these optical computers will provide us with more processing power and higher speed. This will also open the door to a world of possibilities in scientific fields at the interface with the biosciences, and perhaps even in the world of personal computing."
The project is also supported by Intel, Seagate, Ericsson, Oxonica, IMEC and the National Physical Laboratory.
Source: Queen's University Belfast