In a recent solicitation, the NSF laid out its need to further its scientific and engineering infrastructure with new tools that go beyond top performance. Having already delivered systems like Stampede and Blue Waters, the agency is turning an eye to solving data-intensive challenges. We spoke with the agency’s Irene Qualters and Barry Schneider about…
Despite high interest in cloud computing from the scientific community, it remains a wide-open field. Dr. Oliveira and colleagues describe how choosing the best cloud support is a step forward, but they also address the persistent need for services focused on scientific workflow execution to bridge the gap between the cloud and real science in practice.
The Open Science Data Cloud project from the University of Illinois has received additional resources from the NSF to further its goal of creating a unique cloud resource for scientific collaboration and research.
Earl Dodd argues that for the HPC cloud to gain practical acceptance as a viable decision-support tool across a wide variety of businesses and industries, it must include remote interactive 3D visualization as a fundamental component of its architecture. Without this vital functionality, the HPC cloud risks being dismissed as a technological novelty with limited commercial success. Beyond that, some persistent non-technical barriers are preventing a broad new user group from fully emerging in the HPC cloud space.
In this interview with Kate Keahey of Argonne National Laboratory, we discuss her background in distributed computing, the limitations of the grid, the challenges and benefits of cloud computing for HPC, and her view on the critical elements that the community as a whole—vendors, users, and scientists alike—will need to address as the space matures.
Microsoft’s Dan Reed recently explained the thinking behind the company’s latest push to open the clouds to scientific users who require high-performance computing but are often bound by cost constraints and the need to shed systems-maintenance overhead.
While Amazon’s cloud offering is not a fit for all scientific HPC applications, even those with smaller datasets, Belle II is one example of a project that chose to forgo upfront hardware investments in favor of a pay-as-you-go option that can scale as research demands ebb and flow.