You're a commuter, your daily rush-hour ordeal made even more grueling by the hassle of unexpected merging lanes, the heady essence of asphalt, and the sign-toting, orange-clad road crew ahead. Resurfacing the road again? you think. But they just did that two years ago! Is this why my taxes are so high?
When it comes to major public construction projects, it's not just the public who wants the end product to be faster, cheaper, and better. The Federal Highway Administration estimates that a staggering $94 billion will be spent on transportation infrastructure every year for the next twenty years. Not surprisingly, state and federal transportation departments want to make sure that their significant infrastructure investments are worthwhile, and they've upped the stakes. The traditional bidding process, in which the lowest bid wins the contract, has undergone a transformation in recent years. Cost is no longer the sole factor in determining who gets the job; project duration–the amount of time that drivers will be negotiating the construction–and quality and durability are now important criteria as well.
The best of all possible worlds
Of course, tradeoffs are inevitable. An old saw in engineering and software development holds that you can't have faster, cheaper, and better–you can only have two of the three. “If you're trying to minimize the duration, you have to use overtime, and that means increasing your costs,” says Khaled El-Rayes, an assistant professor in the Department of Civil Engineering at UIUC. “If you're trying to improve quality, in many cases you have to pay more for that increase in quality.”
How can a comfortable tradeoff be reached among these conflicting objectives? That's the focus of the research that El-Rayes and his research assistant Amr Kandil are currently conducting, using NCSA machines to optimize the decision-making process. El-Rayes, who received an NSF CAREER Award for optimizing the utilization of construction resources in transportation infrastructure systems, is developing an optimization model that can determine the best tradeoff among multiple conflicting objectives. This is no simple problem: for each task involved in a large-scale construction project, there are at least three important criteria to consider–cost, duration, and quality. Plug in different combinations of possible values for each, and you can generate a large number of permutations involving different kinds of construction, equipment, and crews; the addition or omission of overtime; an off-peak work schedule; and other possible factors. With the average infrastructure project involving 600 or 700 different activities, the task of determining the optimal balance of duration, cost, and quality proves impossibly overwhelming for a human being.
Instead, El-Rayes uses a genetic algorithm-based model that allows him to generate a large number of possible construction resource utilization plans that provide a wide range of tradeoffs among project cost, duration, and quality, and to eliminate the vast majority of suboptimal plans quickly. “At the end,” he says, “what you want is a set of optimal tradeoffs which decision-makers can use to determine, according to their preferences, the best possible combination of resources.” This might mean, for example, that a longer project duration is tolerable if cost or quality is a bigger concern, or that a reduced duration is a greater priority than cost or quality.
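The winnowing step El-Rayes describes (keeping only the plans that represent genuine tradeoffs) can be illustrated with a Pareto-dominance filter. This is a generic sketch of the concept, not El-Rayes's actual model; the plan names and numbers below are invented for illustration.

```python
def dominates(a, b):
    """True if plan `a` is at least as good as `b` on every objective
    (lower cost, shorter duration, higher quality) and strictly
    better on at least one of them."""
    at_least_as_good = (a["cost"] <= b["cost"]
                        and a["duration"] <= b["duration"]
                        and a["quality"] >= b["quality"])
    strictly_better = (a["cost"] < b["cost"]
                       or a["duration"] < b["duration"]
                       or a["quality"] > b["quality"])
    return at_least_as_good and strictly_better

def pareto_front(plans):
    """Discard every plan that some other plan dominates; what
    remains is the set of optimal tradeoffs offered to decision-makers."""
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

# Hypothetical resource-utilization plans (cost in $M, duration in days):
plans = [
    {"name": "overtime crews",    "cost": 9.2, "duration": 140, "quality": 0.92},
    {"name": "standard crews",    "cost": 7.8, "duration": 180, "quality": 0.90},
    {"name": "cheap subcontract", "cost": 7.9, "duration": 200, "quality": 0.85},
]
front = pareto_front(plans)
```

Here "cheap subcontract" is eliminated because "standard crews" beats it on all three objectives, while the other two plans survive as genuine tradeoffs: cheaper and slower versus costlier and faster.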
The advantage of this optimization model is its ability to transform the traditional two-dimensional time-cost tradeoff analysis into an advanced three-dimensional time-cost-quality tradeoff analysis. Introducing that third dimension is challenging, particularly because quality is itself a difficult factor to quantify. “The cost is simply dollar value, and so it is easy to aggregate by adding it all up,” says El-Rayes. “Quality is more challenging.”
El-Rayes's model, which incorporates quality, is currently based on data from the Illinois Department of Transportation (IDOT), which keeps records, for example, on the kinds of construction crews used, along with their measured performance on quality metrics such as compressive and flexural strength for concrete pavement work. Examining this data in aggregate, El-Rayes can determine how frequently and by how much a given combination of resources exceeded IDOT-specified quality limits, allowing him to assign a quality level to that specific crew and resource combination.
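The kind of scoring this historical data makes possible can be sketched as below. Everything here is a hypothetical illustration (the scoring function, the test values, and the 3,500-psi limit are invented, not IDOT's records or El-Rayes's actual quality formula); the sketch simply combines how often tests pass a specified limit with how far, on average, they exceed it.

```python
def quality_level(measurements, specified_limit):
    """Hypothetical quality score for one crew/resource combination:
    the fraction of tests meeting the limit, weighted up by the
    average margin by which passing tests exceeded it."""
    if not measurements:
        return 0.0
    passing = [m for m in measurements if m >= specified_limit]
    pass_rate = len(passing) / len(measurements)
    avg_margin = (sum(m - specified_limit for m in passing) / len(passing)
                  if passing else 0.0)
    # Express the margin relative to the limit so the score is dimensionless.
    return pass_rate * (1.0 + avg_margin / specified_limit)

# Invented compressive-strength tests (psi) against an assumed 3,500 psi limit:
tests = [3600, 3550, 3700, 3450, 3650]
score = quality_level(tests, 3500)
```

Four of the five hypothetical tests pass, with an average margin of 125 psi, so this crew scores a bit above 0.8; a crew that passed every test by a wide margin would score above 1.0.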
In the future, El-Rayes and his research team hope to be able to add even more factors for consideration, including safety, service disruption, and environmental impact. He would also like to make the process more user-friendly by including an interactive tool that would allow users to rank solutions based on weighting factors according to their preferences.
Optimizing the optimal
While El-Rayes's model, by automatically weeding out all less-than-optimal scenarios, makes the decision-making process easier for humans, there is no getting around the fact that it still demands an enormous calculation. “If we had a project that included 700 activities, an average-sized construction process,” explains El-Rayes, “and each activity had 3 to 5 potential options–and that's conservative–it would create a solution space that is exponential in the number of activities.” It's a huge solution space, one which, El-Rayes estimates, would require around 430 hours of computation on a single processor. “Solving this problem wouldn't be feasible,” El-Rayes says. “Nobody's going to wait 430 hours for the solution.”
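The scale of that solution space is worth a back-of-the-envelope check. With the figures El-Rayes quotes (700 activities, 3 to 5 options per activity), exhaustively enumerating every combination is hopeless, which is exactly why a genetic algorithm searches the space instead:

```python
import math

# 700 activities, each with 3-5 options, per El-Rayes's conservative estimate.
activities = 700
for options in (3, 5):
    combos = options ** activities  # exact big-integer count of possible plans
    # math.log10 accepts arbitrarily large Python ints, so this won't overflow.
    print(f"{options} options per activity: ~10^{int(math.log10(combos))} plans")
```

Even the conservative end of the estimate yields on the order of 10^333 candidate plans, dwarfing any conceivable brute-force search.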
This is where NCSA comes in. Using Tungsten, El-Rayes and his research team, with the help of Nahil Sobh, who heads NCSA's Performance Engineering and Computational Methods group, are currently exploring how to parallelize the computations across multiple processors. Rather than running on a single processor in a contractor's office or a state, local, or federal transportation department, the calculations can be distributed over a number of unutilized office processors, drastically reducing the run time to the span of a weekend.
In his experiments on the NCSA Tungsten cluster, El-Rayes examined the computational time required to optimize three construction projects of different sizes: 180 activities, 360 activities, and 720 activities, analyzing each on configurations ranging from one processor up to a maximum of 50. So far, he says, parallelization has succeeded in transforming the analysis of the largest project, 720 activities, from an impractical problem requiring several weeks (430 hours) on a single computer to a feasible task that can be accomplished in 55 hours on a network of unutilized office computers over a weekend. “We don't even need 50 processors for this size project,” he says. “For bigger projects, we might benefit from an increase in the number of processors, but the improvement starts to level off after maybe 10 to 15 processors, which is a reasonable number for an office to have available over a weekend.” The problem he has chosen for these computations is a hypothetical highway construction project, but he says the optimization model would be equally applicable to other kinds of large-scale projects, such as the construction of a convention center or a bridge, which involve a greater variety of activities than highway construction does. What all large-scale projects have in common, however, is their complexity, and that is the problem El-Rayes hopes his computations will help solve.
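The leveling-off El-Rayes describes is the behavior Amdahl's law predicts: if some fraction of the work is inherently serial, adding processors eventually stops helping. The serial fraction below is an assumed figure chosen only so the numbers roughly echo the reported single-processor baseline; it is not a measurement from the study.

```python
def amdahl_speedup(processors, serial_fraction):
    """Maximum speedup on `processors` CPUs when `serial_fraction`
    of the work cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

single_cpu_hours = 430.0   # reported single-processor run time
serial_fraction = 0.10     # assumed: 10% of the work is serial

for p in (1, 5, 10, 15, 50):
    hours = single_cpu_hours / amdahl_speedup(p, serial_fraction)
    print(f"{p:2d} processors: ~{hours:5.1f} hours")
```

Under this assumed serial fraction, the run time drops steeply up to about 10 processors and then flattens: the speedup can never exceed 1 / serial_fraction (here, 10x) no matter how many processors are added, which matches the intuition that 10 to 15 office machines capture most of the benefit.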
“We want to transform an infeasible problem into a practical problem. That's what we're aiming for,” says El-Rayes.
Funding statement
This research is supported by the National Science Foundation.