FEATURES & COMMENTARY
PARIS, France — Barry James reports that after the Web comes … not the Matrix, but the Grid. And it’s being brought to you by the same Swiss lab that created the World Wide Web. Unlike Hollywood’s idea of a “matrix” supplanting today’s Internet, “the Grid” is a real-life concept for networking computer power. Its aim is to give researchers instantaneous access to amounts of data orders of magnitude greater than are available today.
Scientists in Europe and the United States are working to define the standards of the Grid, and one of the key players is CERN, the European Laboratory for Particle Physics near Geneva where the World Wide Web was born.
If the World Wide Web is analogous to a telephone network capable of handling text, sounds and images, the Grid is more like a modern electricity distribution system, delivering vast quantities of raw computing power and data where and when they are needed.
The main impact at first is expected to be on sciences that require enormous computing capabilities, such as particle physics, genome research or Earth observation.
But researchers speculate that the new network, which is expected to be laid over the existing Internet, will spawn all kinds of commercial and other applications, just as the World Wide Web has created technologies that were unimagined 15 or 20 years ago.
Several similar efforts are under way in the United States, including the National Technology Grid and NASA’s Information Power Grid. Much of the research has come out of universities and labs with supercomputing centers, like CERN.
It was at CERN that two scientists, Tim Berners-Lee and Robert Cailliau, devised the Web in 1989 because the laboratory needed a way to transmit data quickly in-house and to thousands of physicists around the world. Today it faces the same problem on a much bigger scale.
In five years, CERN plans to open its Large Hadron Collider, in which scientists will accelerate protons almost to the speed of light and smash them together to create in a microcosm the forces that existed a few billionths of a second after the Big Bang.
Four detectors will observe the shadowy traces of the collisions as some 40 million protons fly apart every second, scattering quarks and other particles. The detectors will pour out prodigious amounts of data.
Just how much data they will pour out was explained by David Williams, a CERN scientist who was division leader when the Web development was being carried out.
Each of the four LHC experiments, he said, will produce as much data as if each person in the world were making 20 simultaneous phone calls. Together, the experiments will store and analyze twice as much processed data every year as all the data presently stored in U.S. research libraries.
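For a rough sense of that scale: a standard digital telephone call carries 64 kilobits per second, and the world’s population is about six billion (both figures are assumptions for illustration, not Mr. Williams’s). Twenty calls apiece from everyone on Earth works out to 6 billion × 20 × 64,000 bits, roughly 7.7 petabits, or nearly a full petabyte, of data every second for a single experiment.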
CERN already is one of the world’s biggest scientific computing centers, and the Large Hadron Collider will increase data-handling requirements by as much as 1,000 times, Mr. Williams said.
The problem is how to deliver the data to several hundred universities around the world that will be taking part in the experiments.
Under the model now being worked out, the laboratory will send data to regional centers, which will distribute it to subregional centers that in turn will deliver it to universities.
These will make the data available to their research departments. In practice, an individual researcher will potentially have access to the computing power and data of the entire Grid.
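To make the tiered model concrete, here is a minimal sketch in Python; the language, names and structure are illustrative assumptions, not CERN’s actual design. Data enters at the top and fans out one tier at a time:

    # Toy sketch of the tiered distribution model described above:
    # CERN -> regional centers -> subregional centers -> universities.
    # All names are illustrative; this is not CERN's actual software.
    from dataclasses import dataclass, field

    @dataclass
    class Center:
        """One node in the distribution hierarchy."""
        name: str
        children: list["Center"] = field(default_factory=list)

        def distribute(self, dataset: str, depth: int = 0) -> None:
            # Each center receives the data, then forwards it down a tier.
            print("  " * depth + f"{self.name} received {dataset}")
            for child in self.children:
                child.distribute(dataset, depth + 1)

    university = Center("University physics department")
    subregional = Center("Subregional center", [university])
    regional = Center("Regional center", [subregional])
    cern = Center("CERN (source)", [regional])
    cern.distribute("LHC event data")

An individual researcher sits at a leaf of this tree but, through it, can reach data held anywhere above.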
The objective is for the Grid to enable scientists to carry out experiments without being concerned about the details of the underlying hardware or software, just as people do not think about where the power comes from when they switch on an electrical appliance.
At present, scientists solve problems by asking, “What can I do with the computers I own or control?” In the future, Mr. Williams said, they will ask, “What problem do I want to solve?” and use the Grid capacity appropriate to the task.
In effect, the Grid will consist of three layers connected by fiber-optic cables: an interface that will enable scientists to prepare their experiments, an applications level, and the underlying Grid operating system that will connect users to the distributed resources.
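One way to picture how those three layers might stack, again with invented class names and methods (the Grid’s real interfaces were still being defined at the time):

    # Minimal sketch of the three-layer picture described above.
    # The names are invented for illustration, not the Grid's actual API.

    class GridOperatingSystem:
        """Bottom layer: connects users to distributed resources."""

        def allocate(self, cpus: int) -> str:
            # A real grid would locate idle machines anywhere on the
            # network; here we just report what was requested.
            return f"{cpus} CPUs allocated from the distributed pool"

    class ApplicationLayer:
        """Middle layer: domain software running on grid resources."""

        def __init__(self, grid_os: GridOperatingSystem):
            self.grid_os = grid_os

        def run_analysis(self, job: str, cpus: int) -> str:
            resources = self.grid_os.allocate(cpus)
            return f"Running '{job}' on {resources}"

    class ScientistInterface:
        """Top layer: where researchers prepare experiments, without
        worrying about the hardware underneath."""

        def __init__(self, apps: ApplicationLayer):
            self.apps = apps

        def submit(self, experiment: str) -> str:
            # The researcher states the problem; the lower layers
            # decide where and how it actually runs.
            return self.apps.run_analysis(experiment, cpus=1000)

    print(ScientistInterface(ApplicationLayer(GridOperatingSystem()))
          .submit("proton collision reconstruction"))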
Mr. Williams said the first users would be people working on large collaborative science projects.
Exactly who would get access and on what terms is one of the problems now being worked out. As on the Web, Mr. Williams said, “end users must have easy access.”
He predicted that eventually commercial companies would rent out time or space on the Grid, although for what purposes he could not say.
“You don’t know what people will dream up,” he said.
The project, which is based entirely on open-source software such as the Linux operating system, would in effect make countless individual computers act as a single computer.
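The scatter-and-gather pattern behind that idea can be shown in miniature. This toy uses one machine’s worker processes where real grid middleware would use computers spread across the network:

    # Toy scatter-and-gather illustration of "many computers acting as
    # a single computer". The "computers" here are local worker
    # processes; this is an analogy, not the Grid's actual software.
    from concurrent.futures import ProcessPoolExecutor

    def analyze(chunk: range) -> int:
        # Stand-in for real analysis: sum a slice of the "dataset".
        return sum(chunk)

    if __name__ == "__main__":
        # Split one large job into pieces, farm them out in parallel,
        # then combine the partial results into a single answer.
        chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
        with ProcessPoolExecutor() as pool:
            total = sum(pool.map(analyze, chunks))
        print(f"Combined result from all workers: {total}")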
“I think that 10 years ago, none of us understood how important the Web was going to be, including Tim,” Mr. Williams said, “and with the Grid, we again don’t quite understand what we are talking about. But we think it will be just as important.”