January 16, 2009
International effort will encourage partnerships between humanities scholars, computer and information scientists, librarians and others
Jan. 16 -- Today, a new, international competition called the Digging into Data Challenge was announced by four leading research agencies: the Joint Information Systems Committee (JISC) from the United Kingdom, the National Endowment for the Humanities (NEH) and the National Science Foundation (NSF) from the United States, and the Social Sciences and Humanities Research Council (SSHRC) from Canada.
The Digging into Data Challenge encourages humanities and social science research using large-scale data analysis. It challenges scholars to develop international partnerships and to explore vast digital resources, including electronic repositories of books, newspapers, and photographs, in order to identify new opportunities for scholarship.
Applicants will form international teams from at least two of the participating countries. Winning teams will receive grants from two or more of the funding agencies and, one year later, will be invited to present their work at a special conference. These teams, which may be composed of scholars and scientists, will be asked to demonstrate how data mining and data analysis tools currently used in the sciences can improve humanities and social science scholarship. The hope of this competition is that these projects will serve as exemplars to the field and encourage new, international partnerships among scholars, computer scientists, information scientists, librarians and others.
"It is exciting to us to be able to foster research with outcomes of equal excitement to the humanities and computer and information science and engineering disciplines," said Haym Hirsh, director of NSF's Division of Information and Intelligent Systems. "Through this program, twenty-first century technologies will enable new modes of scholarship that complement centuries-old ways of conducting research."
"The Digging into Data Challenge brings together scientists and humanities scholars to take advantage of the digitization of millions of books, newspapers, photographs and countless other documents," said NEH Chairman Bruce Cole. "The NEH is delighted to work with JISC, NSF, and SSHRC to offer this competition and we look forward to many exciting discoveries from the analysis and study of this data."
"The Digging into Data Challenge will allow for the large-scale analysis of huge collections of diverse cultural heritage resources," said Alastair Dunning, JISC's digitization program manager. "Such forms of analysis, unthinkable before the arrival of the Internet, will help give new insights to academic inquiry."
"This exciting new joint initiative with NEH, JISC and NSF, will allow Canadian researchers to further develop sophisticated text and image mining and data visualization technologies while building international research partnerships," said Chad Gaffield, SSHRC President. "SSHRC is confident that the results will create new knowledge about humanity from the vast digital resources now becoming available."
In order to apply, interested applicants must first submit a letter of intent by March 15, 2009. Final applications will be due July 15, 2009. Further information about the competition and the application process can be found at http://www.diggingintodata.org.
About the National Endowment for the Humanities
Created in 1965 as an independent federal agency, the National Endowment for the Humanities supports learning in history, literature, philosophy, and other areas of the humanities. NEH grants enrich classroom learning, create and preserve knowledge, and bring ideas to life through public television, radio, new technologies, museum exhibitions, and programs in libraries and other community places. Additional information about the National Endowment for the Humanities and its grant programs is available on the Internet at http://www.neh.gov/.
About the Joint Information Systems Committee
The Joint Information Systems Committee (JISC) is a joint committee of the U.K. further and higher education funding bodies and is responsible for supporting the innovative use of information and communication technology (ICT) to support learning, teaching, and research. It is best known for providing a U.K. national infrastructure network, a range of support, content, and advisory services, and a portfolio of high-quality resources. Information about JISC, its services and programs can be found at http://www.jisc.ac.uk/.
About the Social Sciences and Humanities Research Council
The Social Sciences and Humanities Research Council (SSHRC) is an independent federal government agency that funds university-based research and graduate training through national peer-review competitions. SSHRC also partners with public and private sector organizations to focus research and aid the development of better policies and practices in key areas of Canada's social, cultural and economic life. More information about SSHRC is available on the Internet at http://www.sshrc.ca/.
About the National Science Foundation
The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering, with an annual budget of $6.06 billion. NSF funds reach all 50 states through grants to over 1,900 universities and institutions. Each year, NSF receives about 45,000 competitive requests for funding, and makes over 11,500 new funding awards. NSF also awards over $400 million in professional and service contracts yearly.
Source: National Science Foundation