Groundbreaking scientific research increasingly relies on computationally intensive HPC resources, and mid-level research organizations that lack the means to build an extensive HPC cluster are looking for cost-effective ways to contribute to these initiatives.
To evaluate creative methods of participating in such large scientific projects, research by Spencer Taylor at Brigham Young University examined HTCondor, open source software that harnesses the computing power of idle machines on a local network to perform jobs. In this case, it was applied to a water resource model, the Gridded Surface Subsurface Hydrologic Analyst (GSSHA), which requires computationally intensive stochastic simulations of the kind common to many scientific disciplines.
The resulting tests showed that HTCondor can be a workable alternative to acquiring additional HPC resources for mid-level research institutions. “We found that performing stochastic simulations with GSSHA using [the] HTCondor system significantly reduces overall computational time for simulations involving multiple model runs and improves modeling efficiency,” Taylor argued.
The idea behind employing HTCondor, using idle computing resources to process large amounts of data and perform intensive computations, has notably been used by researchers at Berkeley in the SETI@home project, where volunteers' home computers, when idle, form a grid that analyzes extraterrestrial radio signals. HTCondor aims to accomplish something similar, allowing mid-sized research institutions to integrate their computing base with existing HPC resources both on site and in the cloud. As noted in the research, “the goal of this project is to demonstrate an alternative model of HPC for water resource stakeholders who would benefit from an autonomous pool of free and accessible computing resources.”
The architecture diagram below shows how the HTCondor software accesses and coordinates the various resources, including on-site ‘worker computers,’ local HPC implementations, and the existing HTCondor network, which, much like the SETI@home network, is built from volunteer computers across the country.
The specific instance set up by the BYU research used a model that simulated six precipitation events with hydrometeorological data from a two-week period. A single simulation required 14 minutes on a desktop computer, and the test was set up to run 150 of those simulations, which would take about 35 hours run sequentially on a single processor.
“Because of the nature of HTCondor,” as Taylor explained in the research, “each stochastic simulation ran on a different number of processors ranging from about 80 to 140. As expected, with about 100 times the computational power of normal circumstances I was able to essentially reduce the runtime by [a] factor of 100.” In essence, by running these formerly idle processors in parallel, the BYU implementation achieved performance consistent with other localized HPC installations.
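A quick sanity check of those figures bears the claim out. This sketch uses the numbers reported above; the pool size of 100 is a rough average of the quoted 80-to-140 range:

```python
import math

# Figures from the BYU GSSHA test described above.
SINGLE_RUN_MINUTES = 14    # one stochastic GSSHA simulation on one desktop
NUM_SIMULATIONS = 150      # simulations in the batch
POOL_SIZE = 100            # rough average number of idle processors claimed

# Sequential cost on a single processor:
sequential_hours = SINGLE_RUN_MINUTES * NUM_SIMULATIONS / 60
print(sequential_hours)    # 35.0 hours, matching the article

# With ~100 independent runs executing at once, wall-clock time collapses
# to roughly two "waves" of 14-minute runs:
parallel_minutes = math.ceil(NUM_SIMULATIONS / POOL_SIZE) * SINGLE_RUN_MINUTES
print(parallel_minutes)    # 28 minutes instead of 35 hours
```

Because the 150 runs are fully independent, the speedup is close to the ideal factor of the pool size, which is what makes this class of stochastic workload such a good fit for cycle scavenging.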
As seen in the figure above and noted in the research, “it is also possible to include commercial cloud resources as part of an HTCondor pool.” The software makes it possible to control which jobs are sent to the cloud based on price.
“For example,” the research noted, “if you were using Amazon’s Elastic Compute Cloud (EC2) you could set the ‘ec2_spot_price’ variable to ‘0.011’ so that HTCondor would send jobs to the cloud only if the cost per CPU hour was $0.011 or less.” Many research institutions utilize cloud services for excess data storage and computation at peak times, so being able to incorporate those into the HTCondor system is an important consideration.
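In HTCondor, that price cap is expressed in the job's submit description using the EC2 grid universe. The fragment below is a hypothetical sketch, not taken from the BYU project: the AMI ID, instance type, and credential file paths are placeholders.

```
# Hypothetical HTCondor submit description for spot-priced EC2 jobs.
universe              = grid
grid_resource         = ec2 https://ec2.us-east-1.amazonaws.com/
executable            = run_gssha_event
ec2_access_key_id     = /home/user/ec2_access_key      # placeholder path
ec2_secret_access_key = /home/user/ec2_secret_key      # placeholder path
ec2_ami_id            = ami-XXXXXXXX                   # placeholder image
ec2_instance_type     = m1.small
# Bid on spot instances only while the hourly price is $0.011 or less:
ec2_spot_price        = 0.011
queue
```

With `ec2_spot_price` set, the job simply waits in the queue whenever the spot market rises above the bid, so cloud spending never exceeds the stated rate.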
Stochastic simulations, in which results depend on several randomized probabilistic variables, are commonplace across scientific disciplines. As such, Taylor is hopeful the approach can be applied well beyond hydrology: “Using the scripts developed in this project as a pattern, HTCondor could be used for many other applications besides GSSHA jobs.”
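The general pattern is simple enough to sketch. A batch of independent stochastic runs maps onto a single submit description that queues one job per run; the executable name and file naming scheme below are illustrative, not from the BYU scripts:

```
# Hypothetical vanilla-universe submit description: 150 independent
# stochastic runs, one queued job per parameter realization.
universe             = vanilla
executable           = run_model.sh
arguments            = realization_$(Process).in
transfer_input_files = realization_$(Process).in
output               = run_$(Process).out
error                = run_$(Process).err
log                  = batch.log
queue 150
```

HTCondor expands `$(Process)` to 0 through 149, matches each job to an idle machine in the pool, and ships inputs and results back and forth, which is all the orchestration an embarrassingly parallel Monte Carlo study needs.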