Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

October 4, 2010

Weighing the Queue, Evaluating the Utility

Nicole Hemsoth

Last week at the R Systems-sponsored HPC 360 event in Champaign-Urbana, Illinois, the focus was on the manufacturing sector, with an expected emphasis on the value of modeling and simulation in driving competitiveness and growth. A secondary thread questioned how simulation-centered companies can look to utility or on-demand solutions to extend their computational resources and improve efficiency.

While a number of manufacturing companies were present, only a few were actually making use of virtualized or on-demand resources, although several were weighing their options. Among the attendees in the "investigative" category was Matt Dunbar, chief software architect for SIMULIA, the simulation brand of Dassault Systemes, which produces the finite element analysis product suite Abaqus.

Software research and development arms like SIMULIA require vast computational resources to enhance their product lines. But what happens when a company like Dassault Systemes runs out of power and cooling capacity and leaves developers waiting in long queues? And what happens when on-site resources cannot deliver the 24/7 capability needed, forcing architects to put projects on hold while they wait?

Software architects eager to move forward with research and development face a tough decision: wait in a long queue, particularly for post-processing, or consider the viability of sending at least some workloads off-site.

As Dunbar stated, "doing actual batch simulation in the cloud is reasonably straightforward, but doing 3D graphics post-processing is something that remains a question mark for us. There are a number of ways we can do that, but right now we're trying to decide how best to do that." The decision is difficult because software architects must either wait out long queues on their own workstations or accept what might be a performance hit from utility resources.

Dunbar gave an overview presentation at the HPC 360 conference in which he discussed some of the challenges the company faces as it ponders moving post-processing into the cloud in response to these growing constraints, and he spent a few moments discussing his key points with us.

In Dunbar's view, "you have to come up with performance that's equivalent to the workstation or come up with a way to handle post-processing" — a sentiment echoed by a number of other companies that rely on 3D processing to drive growth and further development.