September 05, 2008
Sept. 5 -- The Institute for Computing in Humanities, Arts and Social Science (I-CHASS / www.chass.uiuc.edu) at the University of Illinois at Urbana-Champaign has been awarded a National Endowment for the Humanities Institutes for Advanced Topics in the Digital Humanities (IATDH) grant of almost $250,000 for 2008-2009. These grants support national or multi-state training programs on approaches in humanities computing and seek to increase the number of humanities scholars using digital technology in their research.
I-CHASS will lead a collaboration with the National Center for Supercomputing Applications (NCSA / www.ncsa.uiuc.edu), the Pittsburgh Supercomputing Center (PSC / www.psc.edu), and the San Diego Supercomputer Center (SDSC / www.sdsc.edu) to foster innovation in the research and development of computational resources for humanities research groups. This partnership, called the Humanities High Performance Computing Collaboratory (HpC), will engage scholars in sustained collaboration with high-performance computing specialists in order to identify, create, and adapt computational tools and methods focusing on simulation and modeling, social networking, grid and distributed computing, data analytics, or visualization technologies. The grant will fund nine mini-residencies (three at each supercomputing center) and a two-day conference for 45 humanities participants and 15 high-performance computing specialists. The grant also will support the construction and maintenance of a virtual community for participants and the larger public that will function as an online collaboratory space.
"The Humanities High Performance Computing Collaboratory has the potential to transform humanities, arts, and social science research," said I-CHASS director Vernon Burton. "It creates the opportunity for researchers to educate themselves in high-performance computing technologies with help from nationally-recognized computing specialists while those specialists are provided with sophisticated researchers who can advise on the technological needs of humanities, arts, and social science scholarship and research. This model of collaborative engagement will foster innovation in the research and development of computational resources for humanities research groups."
"PSC is excited to be included in the opportunities that the IATDH program will make available to the humanities, arts and social sciences communities," said Laura McGinnis, manager of PSC's Data & Information Resource Services group. "We are looking forward to sharing our depth and breadth of technology with disciplines that traditionally have not made use of high-end resources."
"SDSC looks forward to working with the University of Illinois to offer Data Challenges in the Humanities, a workshop that will feature case studies that highlight data curation and preservation challenges," said Diane Baxter, director of education for SDSC. "These studies will help us explore technology solutions and community-led technology initiatives that have addressed similar challenges."
The humanities groups selected for the 2008-2009 mini-residencies are:
The SCGMA Group has been collaborating with I-CHASS, the Software Environment for the Advancement of Scholarly Research (SEASR / http://seasr.org/) project, the Center for Medieval Studies at the University of Minnesota-Twin Cities, the Program in Medieval Studies at the University of Texas-Austin, and the Communications Department at the University of California-San Diego since May 2007 to develop a new interdisciplinary scholarly community for globalizing the study of the Middle Ages. SCGMA has been working to create an online infrastructure that organizes large quantities of textual, visual, and aural resources, supporting the organization of, and research with, sources in multiple formats and languages drawn from multiple scholarly disciplines. HpC will allow SCGMA to extend its current use of high-performance technologies to encompass a more elaborate technological model.
The Humanistic Algorithms project is a collaboration between SEASR, I-CHASS, and the University of Southern California's Institute for Multimedia Literacy that focuses on creating a digital archive system in support of a digital portfolio application for faculty and students. SEASR will use data analytics to extract information from unstructured texts (such as raw textual data from websites) to produce semantic information that can be used to create meta-analyses of scholarly multimedia. From these meta-analyses, Humanistic Algorithms anticipates taking up questions such as: What are the components of scholarly multimedia? What is pedagogy in a networked world? How do we collaborate, train faculty, and teach students how to read and compose scholarly multimedia? HpC will give Humanistic Algorithms the opportunity to isolate and adapt additional high-performance computing technologies that will aid in the development of the digital portfolio application. The mini-residencies will allow the group to experiment with new technologies and chart long-term goals.
The HistorySpace Group brings together humanities scholars experimenting with information-rich virtual environments (IRVEs) that express combinations of textual, graphic, sonic, and three- and four-dimensional forms of representation, collaborating on the workflows, disciplinary conventions, protocols, and tools that will move humanities scholars from print to virtual media production. HistorySpace, in conjunction with NCSA, is constructing an elaborate network of storyboards and workflow charts that will serve as the iterative, scenario-based design method structuring the IRVE. This grant will allow HistorySpace to consult with high-performance computing specialists in order to refine their IRVE methodological structure, consider the integration and adaptation of additional high-performance computing tools, and begin construction of its first prototype IRVE.
Founded in 2004, I-CHASS charts new ground in high-performance computing and the humanities, arts, and social sciences by creating both learning environments and spaces for digital discovery. I-CHASS presents path-breaking research, computational resources, collaborative tools, and educational programming to showcase the future of the humanities, arts, and social sciences by engaging visionary scholars from across the globe to demonstrate approaches that interface advanced interdisciplinary research with high-performance computing. I-CHASS is a partnership of the University of Illinois at Urbana-Champaign, the National Center for Supercomputing Applications (NCSA), and the Illinois Informatics Initiative (I3). For more information, visit: http://www.chass.uiuc.edu.