February 05, 2013
WEST LAFAYETTE, Ind., Feb. 5 – Purdue University researchers have received a five-year, $14.5 million National Science Foundation grant to expand the university's widely used nanoHUB.org online science and engineering gateway.
The Purdue-led Cyber Platform, a part of the Network for Computational Nanotechnology (NCN), will assist researchers across the globe by developing a virtual society that shares simulation software, data and other innovative content to provide engineers and scientists with the fundamental knowledge required to advance nanoscience into nanotechnology.
"Thousands of times a day the leading researchers 'come' to Purdue through the globally unique tool of nanoHUB," Purdue President Mitch Daniels said Tuesday (Feb. 5) in announcing the grant. "The new NSF investment is an affirmation of the brilliance of nanoHUB's Purdue creators and of its worldwide scientific significance."
Annually, nearly 250,000 users in 172 countries participate in nanoHUB, an online meeting place for simulation, research, collaboration, teaching, learning and publishing. The nanoHUB provides a library of 267 simulation tools that run in the scientific computing cloud, freeing users from the limitations of locally installed software, and are used by more than 12,000 people every year.
The Internet-based initiative provides 3,000 resources from more than 1,000 authors for research and education in the areas of nanoelectronics and nanoelectromechanical systems and their application to nano-biosystems. The nanoHUB menu also includes courses, tutorials, seminars, discussions and facilities to foster nano-research collaboration, including the Birck Nanotechnology Center in Purdue's Discovery Park.
Through Cyber Platform developments and community engagement efforts, the nanoHUB is designed to expand in its next phase.
"Our long-term vision for the Cyber Platform is to use the nanoHUB as an online nano society that researchers, practitioners, educators and students depend on daily," said Purdue electrical and computer engineering professor Gerhard Klimeck, principal investigator of the Purdue-led Cyber Platform. "At the same time, we are excited about how this tool has extended into professional practice as a computational resource for a multidisciplinary culture of innovation grounded in cloud services-enabled workflows."
Joining Klimeck on the Cyber Platform team at Purdue are co-principal investigators Krishna Madhavan, Michael McLennan, Lynn Zentner and Michael Zentner. For the NSF abstract, go to http://www.nsf.gov/awardsearch/showAward?AWD_ID=1227110&HistoricalAwards=false
The nanoHUB has become the first broadly successful cloud-computing environment for research across multiple disciplines, with more than 960 citations in the scientific literature and 8,000 secondary citations; nearly one-third of those papers involve experimental data. It also has evolved well beyond online simulation for research.
From New York to London and Moscow to Madrid, more than 14,000 students in 760 formal classes at 185 institutions have used nanoHUB simulations for classroom teaching, homework and projects. The nanoHUB also provides a library of 3,000 learning materials.
"Most of these tools are adopted for formal education in six months, compared with the 3.8 years it takes for the release of new college textbook editions," Klimeck said.
NCN founding director Mark Lundstrom, the Don and Carol Scifres Distinguished Professor of Electrical and Computer Engineering at Purdue, said a key part of the Cyber Platform project is to engage an ever-larger and more diverse cyber community that shares novel, high-quality nanoscale computation and simulation research and educational resources.
"The reason we created the nanoHUB cyberinfrastructure 10 years ago was to connect those who are doing simulation with experimental collaborators," Lundstrom said. "Today, it's called cloud computing."
Based in Discovery Park, Purdue's primary interdisciplinary research complex, the Cyber Platform team hopes to accelerate the transformation of nanoscience to nanotechnology through the integration of simulation with experimental data.
Lundstrom said the project will focus on developing open-source software to stimulate data sharing and inspire and educate the next-generation workforce. A key goal is to create communities in the nano-related areas of manufacturing, informatics, and environmental health and safety.
"Despite a decade of success in nanotechnology research and education, significant gaps remain because work is still performed by isolated individuals and small groups," Lundstrom said. "This fragmentation by specialty hinders tool and data sharing across knowledge domains."
Klimeck said that by partnering with professional societies and commercial publishers, nanoHUB is changing how researchers publish simulation results through novel interactive journals that reflect a user's workflow, link directly back to their data and open doors to more collaborators.
This approach will drive new content to the nanoHUB, establishing an efficient, one-stop shopping site for high-powered, nano-related research simulations, he said. NCN also has developed processes for enabling researchers to rapidly deploy their research codes and innovative tutorials and classes on nanoHUB.
"To date, these processes have harvested research and educational results from more than 1,000 contributors worldwide," Klimeck said. "Expansion into new areas of nano research and education - including precollege education - represents a huge growth potential for nanoHUB that goes beyond simulation to embracing data management, search and exploration."
With NSF funding, Purdue launched NCN in 2002 with a five-year, $10.5 million grant to advance nanoscience toward nanotechnology via online simulations on nanoHUB.org. The NSF funding was distributed among six partner universities to seed the infrastructure creation and develop the nanoHUB content.
Today, NSF is funding Purdue for the operation and advancement of this national nanotechnology infrastructure. Two other independent NSF grants, each at a level of $3.5 million, to Purdue and the University of Illinois at Urbana-Champaign, will advance nanoelectronics and nano-bioengineering while using nanoHUB to engage a global community. The resulting NCN is now funded at a level of $21.9 million for five years.
Source: Purdue University