CD-adapco and Cray Inc. recently announced that CD-adapco has purchased two additional Cray XD1 supercomputers to help boost productivity in CD-adapco's international computational fluid dynamics (CFD) consulting business. Dennis Nagy, vice president of marketing and business development at CD-adapco, and Himanshu Misra, computer aided engineering (CAE) business manager at Cray Inc., talked to HPCwire about the collaboration between their two companies and how high performance computing (HPC) is addressing today's CFD market requirements.
----------
HPCwire: Dennis, tell me a little about CD-adapco and your products.
Dennis Nagy: We're the global leader in what we call full-spectrum CAE flow and thermal simulation, otherwise known as CFD. “Full spectrum” refers to serving the broad business needs of our customers, who range from product development engineers who use our computer aided design (CAD)-embedded CFD solutions, upwards to R&D engineers who use our most sophisticated and advanced simulation suite capabilities, running on leading-edge HPC platforms. We have a 25-year track record providing CAE software and consulting services. Our STAR-CD and STAR-CCM+ CFD simulation codes are employed in a growing number of industries. The most recent compilation of fastest supercomputers shows that our flagship product, STAR-CD, is the leading CFD software in terms of installed GFLOPS of processing power within the automotive industry, and it's fourth overall in CAE behind three structural analysis applications.
HPCwire: How are CD-adapco's solutions different from others in the CFD space?
Nagy: We believe we scale better than the other CFD applications and have a much faster, more automated solution technology in the form of polyhedral meshing. Our software also offers multiple user entry points, starting with the CAD-embedded “front door to CFD” and extending all the way up to the most advanced, comprehensive suite of user-configurable toolsets. In terms of expertise, our solutions contain the deepest services component in the CAE vendor industry, based on our 25 years of experience solving actual CAE problems for our customers.
HPCwire: Recent news releases announced that CD-adapco has ordered two more Cray XD1 supercomputers for the company's consulting offices. What is the nature of your consulting business, and why did you turn to Cray to support it?
Nagy: We now have a 72-processor Cray XD1 system in our New York office running production jobs. We will be adding a 72-processor system in our London office and a 36-processor system in the Paris office. Because we take on our customers' toughest problems, we needed considerable computing power and excellent price/performance. Cray's XD1 solution was right for us. CD-adapco started as a consultancy a quarter century ago, and we've continued to grow that synergistic aspect of our business. We use our own software in the consulting business, which enables us to see things from the user perspective. As we push the practical boundaries of CFD and finite element analysis (FEA), the consulting business gives us a good overview of what advances need to be added to the software in the future and when it will be feasible to deploy them.
Himanshu Misra: We are excited to be collaborating with CD-adapco. The fact that CD-adapco will soon have 180 Cray XD1 processors for their STAR-CD workload is yet another endorsement of the overall solution. Since CD-adapco best understands the numerics involved in their STAR-CD code, their purchase of Cray XD1 systems is a strong statement in support of the price/performance characteristics, robustness and ease of use of the Cray XD1 system for large-scale CFD. We are already seeing a lot of interest from the STAR-CD user community.
HPCwire: Himanshu, how did Cray help CD-adapco adopt the systems?
Misra: We worked closely with CD-adapco to fine-tune the Cray XD1 system for their code. Our system had to pass a whole battery of rigorous benchmark tests, which showed an average 20-30% performance advantage over cluster systems at the 32-processor range. Cray has a long history in HPC that goes back to 1972 with the development of the seminal Cray-1. Last year we introduced the Cray XD1 system, which uses direct-connect system architecture and HPC-optimized Linux to remove bottlenecks and provide exceptional performance and scalability when running real-world applications like CFD. The product makes HPC available to a growing community of users. The types of MPI communication calls and the inherent message sizes associated with STAR-CD numerical modules were found to be a good natural fit for the hardware features of Cray XD1. Based on a wide range of industrial problems for its customers, CD-adapco observed the following advantages:
- Reduced turnaround time at the same processor count
- Improved throughput in a multi-user environment, as fewer processors were needed to deliver the same turnaround time, freeing up the system for other users
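The throughput point above is simple arithmetic: if each processor does more work per unit time, the same turnaround is reached with fewer processors, and the remainder are freed for other jobs. A minimal sketch, assuming near-linear scaling in this processor range (an idealization; the 25% and 30% figures below are taken from the 20-30% advantage quoted earlier, and the function name is my own):

```python
import math

def processors_for_equal_turnaround(cluster_procs, advantage=0.25):
    """Processors needed on the faster system to match the cluster's
    turnaround time, assuming near-linear parallel scaling.
    A per-processor advantage of 25% means each processor delivers
    1.25x the work rate, so fewer are needed for the same wall time."""
    return math.ceil(cluster_procs / (1.0 + advantage))

# A 32-processor cluster job could be matched by roughly:
print(processors_for_equal_turnaround(32))        # 26 processors (6 freed)
print(processors_for_equal_turnaround(32, 0.30))  # 25 at the high end of the range
```

Real schedulers and domain decompositions would not partition this cleanly, but the sketch shows why a per-processor advantage compounds into multi-user throughput gains.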
HPCwire: Dennis, could you briefly describe how the CFD market has evolved and how HPC is used for this area of CAE today?
Nagy: HPC and CAE have been linked for over two decades, starting with the first vector Crays running MSC.Nastran FEA through the minisupercomputer wave, the massively parallel machines, to the current rapid growth of cluster-based HPC. I was with MSC during those growth years of 1985 through 1997. The first CAE users were government labs and a few of the largest aerospace and automotive companies. Thirty years ago many people in the field thought only 25 Cray machines would ever be used for FEA worldwide, but there were more than 150 machines in operation by the mid-1990s. Today, most major original equipment manufacturers in the automotive, aerospace, and turbomachinery industries use HPC for CFD simulations.
Misra: CFD analyses today not only require very large models, but increasingly incorporate complex physics. Multiphase and free surface flows, as well as many traditional moving-grid analyses for things like engines and pumps, require hefty CFD computations. CD-adapco is using their Cray XD1 systems to run a number of very challenging CFD analyses that engineers would not have attempted just a few years ago because of the long turnaround times on earlier systems. For example, one analysis is for a solid oxide fuel cell stack. Each plate in the stack is modeled in detail with regard to flow, heat transfer, and all the electrochemistry, which requires over 10 million cells. And this isn't considered a super-size model. A solution for the whole stack can be converged in less than a day on 30 Cray XD1 system processors.
Nagy: We are also running several total vehicle models for major truck and auto manufacturers that require high cell density to accurately simulate aerodynamics and engine compartment thermal management. These models contain about 20 million cells. The flow is resolved from the front grille all the way back to the wake of the vehicle, which can extend 50 meters behind a truck trailer. We need quick turnaround to support the design process. The Cray XD1 has enough muscle to do that. Despite the size of the model, the analysis fits on as few as 14 processors. On 20 processors, solutions are available in a day. We also run many in-cylinder analyses to assess new engine technology. A typical “breathing” study that involves intake and compression for these large models can run overnight on a Cray XD1 machine on as few as 10 processors.
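To put those model sizes in perspective, a quick sketch of the per-processor workload they imply. This is illustrative arithmetic only, assuming even partitioning, which real domain decomposition only approximates, and the function name is my own:

```python
def cells_per_processor(total_cells, procs):
    """Average partition size under an idealized even decomposition."""
    return total_cells // procs

# 20-million-cell vehicle model on as few as 14 processors:
print(cells_per_processor(20_000_000, 14))  # 1,428,571 cells per partition

# 10-million-cell fuel cell stack on 30 processors:
print(cells_per_processor(10_000_000, 30))  # 333,333 cells per partition
```

Partitions well over a million cells per processor are what make the per-node memory capacity and interconnect bandwidth of the system matter as much as raw clock speed.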
HPCwire: How do you both plan to address the software scalability issues identified in the recent IDC ISV report?
Misra: Cray systems are designed specifically to deal with scalability.
Nagy: Fortunately, CFD as we implement it scales fairly well, and CD-adapco has been at the forefront of exploiting this characteristic within our CFD solvers STAR-CD and STAR-CCM+. We have enough large customers asking us for cluster-optimized software that we have taken this opportunity seriously for quite a while already. We already get 136x speed-up (wall clock time) on a reasonably tuned 160-processor system, and we believe there is more efficiency to be gained by further tuning. None of our customers has yet asked for systems with many more nodes than that, but when they do, we will be there with attractive scaling.
HPCwire: What are some of the major challenges ISVs face today in trying to meet the requirements of HPC users in the CFD sphere?
Nagy: Price/performance is one challenge, especially in conjunction with application scalability. CFD applications scale well in general, and STAR-CD has shown some impressive results recently on 100-node clusters. Another challenge for ISVs is ensuring the installability and reliability of the fully configured solution. In the past, ISVs only had to test their software on a relatively small number of HPC systems from a few established global vendors. But with the popularity of economical cluster systems based on separately obtainable components, we're facing a situation in which our software may run reliably on one regional OEM's cluster but not on another OEM's cluster, even when the two systems are based on the same type of processors, interconnect hardware/software, MPI interfaces, and operating system.
HPCwire: How are you addressing these challenges?
Nagy: We're working closely with leading component suppliers and major global OEMs to tune our software to their architectures and exploit the advances embedded in those architectures. We also certify reliability on the most popular standardized platforms or “recipes” of OEM-configurable systems.
Misra: There is no dearth of hardware OEMs, but the challenge is to identify key hardware players that have a demonstrated passion and sustained commitment to technical HPC. Additionally, as a product, the Cray XD1 offers a clean solution in which ISVs can dynamically link their 64-bit executables against the Cray XD1's native MPI libraries. This automatically ensures that the application will exploit the performance advantages offered by the Cray XD1's RapidArray interconnect with very little extra effort.
HPCwire: Thanks for speaking with us, gentlemen. I'm sure the insight into the CFD industry will be very helpful.