On April 21, the National Endowment for the Humanities (NEH) announced something new: it would be teaming up with the U.S. Department of Energy to offer one million CPU hours on supercomputers at NERSC for use by researchers in the humanities. The effort is managed out of the NEH's new Office of Digital Humanities, created in recognition of the increasing importance of computing in what has traditionally been a very old-fashioned area of research.
According to Brett Bobley, CIO at the NEH and the director of the new Office of Digital Humanities (ODH), access to materials was the first big boon of computing for the humanities.
Although the humanities span many research disciplines, most of them share one trait: the analysis of documents, lots and lots of documents. Old newspapers, land ownership records, manuscript fragments, diaries, and war department memos are the stuff of new discovery. Unfortunately, this material was largely out of reach for scholars until the Internet revolutionized our society's ability to organize and present data. As Bobley says, "once the Web came around, you suddenly had access to tons of materials you used to have to fly around the world to see in person."
Though it was a critical breakthrough, access to these documents only creates an opportunity for discovery. To realize that opportunity, a vast mountain of material has to be filtered, sifted, collated, compared, and understood. With millions of documents instantly accessible from anywhere in the world, the challenge is simply too great to meet without the aid of computers.
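To make the scale of the problem concrete, consider even the crudest form of sifting: scanning every document in a collection for a handful of terms. The sketch below is purely illustrative, and the corpus directory and search terms in it are hypothetical, but it shows why this kind of work, where every document can be processed independently, maps so naturally onto many processors.

```python
# A toy illustration only: scan a collection of plain-text documents
# for a handful of search terms. CORPUS_DIR and SEARCH_TERMS are
# hypothetical stand-ins for a real digitized archive.
import os
from multiprocessing import Pool

CORPUS_DIR = "corpus"                   # hypothetical directory of text files
SEARCH_TERMS = ("land grant", "deed")   # hypothetical terms of interest

def scan_document(path):
    """Return (path, hit count) for the search terms in one document."""
    with open(path, encoding="utf-8", errors="ignore") as f:
        text = f.read().lower()
    return path, sum(text.count(term) for term in SEARCH_TERMS)

if __name__ == "__main__":
    paths = [os.path.join(CORPUS_DIR, name)
             for name in os.listdir(CORPUS_DIR)
             if os.path.isfile(os.path.join(CORPUS_DIR, name))]
    # Each document is scanned independently, which is exactly why
    # this kind of sifting maps so naturally onto many processors.
    with Pool() as pool:
        for path, hits in pool.imap_unordered(scan_document, paths):
            if hits:
                print(f"{hits:5d}  {path}")
```

Multiply that by millions of pages, and by analyses far richer than simple term counts, and the appeal of a supercomputer becomes obvious.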
Recognizing this shift in research methodology, the NEH, the nation's leading humanities research funding organization, created the ODH. The NEH sponsored the "Supercomputing and the Humanities" workshop in July of 2007 to explore some of the research already going on at that time, and to get a glimpse of the potential for the future. There were many presenters, including David Koller from the Institute for Advanced Technology in the Humanities at the University of Virginia, who presented results of efforts to computationally reassemble fragmentary artifacts; essentially, using a computer to put together the pieces of ancient, broken puzzles. In another example, David Bamman of Tufts University presented the Perseus Project's efforts to use computational methods for syntactic parsing of document stores to create a distilled understanding of an entire library's contents.
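For a flavor of what syntactic parsing of a document store involves, here is a minimal sketch using the open-source NLTK toolkit, assuming it and its standard tokenizer and tagger models are installed. This is not the Perseus Project's actual pipeline; the sample sentence and the chunking grammar are invented for illustration.

```python
# A hedged sketch of per-sentence syntactic analysis, in the spirit
# of the Perseus Project example above -- not their actual pipeline.
import nltk

# Fetch the standard tokenizer and tagger models if not already present.
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "The expedition recorded every deed and diary it recovered."
tokens = nltk.word_tokenize(sentence)
tagged = nltk.pos_tag(tokens)   # assign a part-of-speech tag to each word

# Chunk the tagged words into noun phrases with a simple grammar:
# an optional determiner, any adjectives, then one or more nouns.
grammar = "NP: {<DT>?<JJ>*<NN.*>+}"
chunker = nltk.RegexpParser(grammar)
print(chunker.parse(tagged))
```

Run over every sentence in a library's worth of text, even this simple analysis becomes a serious computational job, which is precisely the niche the HHPC initiative aims to fill.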
With these motivating examples, it is exciting to imagine the possibilities of moving up from desktops or small home-built clusters to large systems with thousands or tens of thousands of processors. That excitement led to the creation of the new Humanities High Performance Computing (HHPC) initiative. But how will the work actually get done?
In many cases there is a large body of algorithmic work, especially in the analysis of text, video, and voice streams, that has been sponsored by the military and intelligence communities. These algorithms can jump-start the digital transition in the humanities, turning technologies originally developed within the military-industrial complex to ploughshare uses. According to Bobley, "a surprising amount of this technology is in the public domain," and the NEH sees it as part of its mission to build the relationships that will bridge these tools into the humanities. But, as Bobley points out, a key to the success of this transition will be making the tools usable for non-computational specialists.
Making that transition, and bringing over tools with added value in usability and accessibility, is a tall order, but one that starts the way most things do: with a conversation. As Bobley sees it, a major goal for this first HHPC effort is to get the computational and humanities communities talking with each other and exploring possibilities.
The T-RACES project, a collaboration between the computational experts of the San Diego Supercomputer Center (SDSC) and the University of California Humanities Research Institute (UCHRI), is an example of the kind of collaboration Bobley has in mind. The UCHRI researchers bring expertise and domain context on redlining, the institutionalized practice of flagging minority neighborhoods as undesirable for mortgages in California in the 1930s and '40s. The SDSC researchers bring their expertise with grid-based data repositories, so that the historical documents can be made available to anyone over the Internet, along with the new context and analysis that computational tools enable the UCHRI team to provide.
Although the recent announcement focused on the NEH/DOE HHPC program, Bobley points to two grant programs that are likely to be instrumental in advancing the use of computation in this field. The first is the HHPC program itself, which will award its one million hours of CPU time on NERSC machines to a few "lighthouse" projects in chunks of 100,000 to 500,000 hours. (The application deadline is July 15 for a January 2009 project start.)
The second opportunity for researchers is the NEH's long-standing Collaborative Research Grants. These grants run for one to three years and have typically been used to bring together teams of humanities scholars to accomplish a major task. But Bobley points out that, in the context of the HHPC program, the collaborative grants could be used to build the distributed teams needed to make the best use of the supercomputer time awarded under the HHPC program. Information on both grants is available at www.neh.gov/grants.
Long-time HPC professionals looking to put a little juice back in their careers may want to take special note of all this. Bobley's vision is for the conversation his initiative starts to go both ways: "We really are interested not only in inspiring humanities scholars, but also in bringing HPC practitioners to the humanities."