Although CERN is usually considered a physics facility, biologists harness its computational resources to conduct research into the beginnings of life.
The notion of “personal genomics” has generated a great deal of buzz over the last several years, but according to one researcher, many of the promises waiting at the “plateau of productivity” for this technology are tied to significant computational complexities.
Projects like the Sloan Digital Sky Survey have provided a wealth of cosmological data for scientists to explore in detail. However, making use of those terabytes — and generating far more data while simulating and analyzing new concepts — is exposing the bottlenecks of scientific computing at massive scale.
Languages like R and MATLAB, once largely confined to technical computing domains, are slowly finding their way into enterprises as demand for large-scale data analytics grows. That demand, coupled with recent announcements of cloud-based ways to use these languages, is opening new doors to access and use.