Basking in Big Data
January 16, 2009

In some ways, science is suffering from too much data. Experiments and computer simulations analyzing everything from the dynamics of climate change to the precise details of protein folding can churn out billions of numbers describing these physical phenomena. Making sense of all this data remains a challenge. Recently, however, researchers at the University of California, Davis, and Lawrence Livermore National Laboratory announced that they have developed software that makes analysis and visualization of huge data sets possible without the aid of a supercomputer.