October 15, 2012
To the new age set, it may be the age of Aquarius, but to researchers everywhere, it's the age of data. Just about every machine nowadays doubles as a data-collection mechanism, and while some of the resulting data has more immediate, "low-hanging" value, the far greater portion must be managed, annotated, and curated to extract its potential. This is the long end of the data tail, and its stretch is vast. In an effort to wrangle this data beast, the National Science Foundation is proposing a new organization, the Institute for Empowering Long Tail Research, as part of its Scientific Software Innovation Institutes program.
Like its "missing middle" counterpart in manufacturing, much of America's research community lacks the tools needed to transform ever-growing data feeds into scientific knowledge. The Computation Institute, along with collaborators at the University of California, Los Angeles, the University of Arizona, the University of Washington, and the University of Southern California, has received a $500,000 one-year planning grant from the NSF to examine how innovative software can optimize the research process.
Ian Foster, Director of the Computation Institute, explains that "with limited resources and expertise, even simple data discovery, collection, analysis, management, and sharing tasks are difficult for small teams." In an official statement, he says that "this project represents the first significant effort to understand and articulate these researchers' needs and translate them into a coherent roadmap for future research."
The phrase "long tail" describes a statistical distribution in which a high-frequency population is followed by a low-frequency population that gradually tails off. The events at the farthest end of the tail have the lowest probability of occurrence. The concept is also used to describe the retailing strategy of selling a high number of low-demand items, in which the aggregate proceeds from less-popular items, through their sheer number, can match or exceed the profit from the bestsellers.
Amazon and Netflix are well-known for employing this kind of business strategy, but the long tail shows up in lots of places. It's the small and medium-sized businesses that are responsible for generating most of the nation's wealth, and it's the smaller labs that carry out the majority of science.
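The retail arithmetic behind the long tail can be sketched in a few lines. The numbers below are purely hypothetical, chosen only to illustrate how many small contributions can outweigh a few large ones:

```python
# A minimal, illustrative sketch of the long-tail idea: a handful of
# high-demand "head" items versus a vast number of low-demand "tail" items.
# All figures here are invented for illustration, not drawn from the article.

head_items, head_sales_each = 10, 1_000      # a few bestsellers
tail_items, tail_sales_each = 100_000, 1     # many niche items

head_total = head_items * head_sales_each    # 10,000 total sales
tail_total = tail_items * tail_sales_each    # 100,000 total sales

# In aggregate, the tail out-sells the head despite each item's low demand.
print(head_total, tail_total)  # 10000 100000
```

The same arithmetic is the project's premise for science: each "boutique" data set serves few researchers, but collectively the tail dwarfs the head.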
"For these small teams, the growing importance of cyberinfrastructure and its applications in discovery and innovation is as much problem as opportunity," notes Bill Howe, Affiliate Assistant Professor in the Department of Computer Science and Engineering at the University of Washington. The long tail of science has an unfortunate consequence, adds Howe: "modern computational methods often are not exploited, much valuable data goes unshared, and too much time is consumed by routine tasks."
The project aims to change this dynamic. Researchers from domains as varied as biology, economics, and metagenomics will work together to figure out better ways of managing data. Working at the long end of the science tail may be a time-consuming affair, but as in the business world, the returns can be well worth it, a sentiment shared by project partner Bryan Heidorn, Director of the School of Information Resources and Library Science at the University of Arizona:
"There may only be a few scientists worldwide that would want to see a particular boutique data set, but there are many thousands of these data sets. Access to these data sets can have a very substantial impact on science. In fact, it seems likely that transformative science is more likely to come from the tail than the head."
The teams are already looking at cloud-based tools like Globus Online as a way to offload the more mundane tasks, such as monitoring data transfers, and free up time for more important endeavors. As the project makes headway, the collaborators will propose a second funding round, but there is no doubt about the institute's potential, says Foster:
"We believe that a Scientific Software Innovation Institute focused on long tail research can have a transformative impact on US science," he shares. "By accelerating discovery and innovation in those small laboratories where most research occurs, we can increase total research output, strengthen the powerhouses of US research, and motivate and prepare students to participate more effectively in research careers."