21st century market dynamics put a great deal of pressure on manufacturers to operate differently. Driving forces are many but include the need to satisfy customer demands quickly and to deal with energy constraints and environmental concerns. On the flip side, the growth of network technology in tandem with service-oriented architectures can have a transformative effect by providing real-time insight into manufacturing processes. This is the basis of smart manufacturing, which applies networked information-based technologies throughout the manufacturing and supply chain enterprise to achieve increased efficiency, productivity, competitive advantage, and ultimately better ROI.
When it comes to the role of HPC in manufacturing, much of the focus has been given to virtual design and prototyping, using computer modeling and simulation for product design and improvement. Smart manufacturing cuts a wider path, leveraging data and information to enable proactive and intelligent manufacturing decisions.
To find out more about this emerging paradigm, HPCwire spoke with Jim Davis, CIO of UCLA and cofounder of the Smart Manufacturing Leadership Coalition (SMLC), an organization that is driving standards in processes and developing the nation’s first open smart manufacturing platform.
“We still interface with the design side,” Davis said, “but our emphasis is on the real-time nature of manufacturing. We’re interested in real-time data, the real-time use of computation/analytics, the orchestration of the software into actionable forms that are interfacing with automation and control, or with real-time decision-making or with real-time events at the supply-chain level. So the notion of time, real-time, actionable tasks and decision-making are what distinguishes smart manufacturing from the design chain.”
Davis goes on to explain that when his group looked at a number of different industry segments, a common theme emerged: companies needed much better access to computation and analytics. They needed to be able to scale IT infrastructure and needed better connectors to interface with automation and control or factory platforms, but at the same time they needed to be able to merge data and orchestrate it for broader kinds of metrics extending across offices, supply chains, or operations.
“This took us down the path of platform technology, and led to the development of our Smart Manufacturing Platform,” said Davis. “We’ve been looking at a whole set of services that allows the computation analytics and so forth to be accessed at scale for real-time actionable use.”
“At the platform level, there is quite a bit of overlap with the design, and in fact the design models make really good sense in the manufacturing space and vice versa, but design is distinctly different from manufacturing, the actual delivery.”
Last year, SMLC won a Department of Energy contract to develop the nation’s first open smart manufacturing technology platform for collaborative industrial networked information applications. The first two test beds funded by the $10 million award are at a General Dynamics Army Munitions plant to optimize heat treating furnaces and at a Praxair Hydrogen Processing plant to optimize steam methane reforming furnaces. The test bed project technologies stand to reduce annual generation of CO2 emissions by 69 million tons, and waste heat by 1.3 quads, or approximately 1.3 percent of total US energy use.
In the case of the steam methane reforming furnace, Davis explains that better managing the furnace and its energy use is a good fit for a high-fidelity computational fluid dynamics model. Understanding flow and heat distribution characteristics within the furnace has been difficult because the harshness of the furnace environment tends to preclude sensor placement. Now project participants are installing infrared cameras around the furnace that allow the internals of the furnace to be measured and visualized in real time. They then bring that data together with other measurements in a computational fluid dynamics model to predict the overall heat distribution, optimize it, and update a control model that is running the plant.
“We use a computational-fluid dynamics model to predict and update parameters in a control model and that allows us to run this in real time,” Davis explained. “There’s a substantial energy savings by using the high-fidelity model.”
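The pattern Davis describes, a slow high-fidelity model periodically re-estimating parameters that a fast control model then uses between updates, can be sketched in simplified form. This is an illustrative sketch only, not SMLC's actual code; the function names, the effective heat-transfer coefficient, and the numeric relationships are all hypothetical stand-ins.

```python
def high_fidelity_predict(fuel_rate, wall_temps):
    """Stand-in for the CFD run: estimate an effective heat-transfer
    coefficient from detailed furnace measurements (hypothetical formula)."""
    avg_wall = sum(wall_temps) / len(wall_temps)
    return 0.8 + 0.0001 * (avg_wall - 900.0) - 0.01 * fuel_rate

def control_model_setpoint(target_heat, heat_transfer_coeff):
    """Fast control-model step: choose a fuel rate to hit the target
    heat duty given the latest coefficient estimate."""
    return target_heat / heat_transfer_coeff

# One update cycle: sensor/camera data -> high-fidelity estimate -> control update.
wall_temps = [905.0, 912.0, 898.0, 901.0]   # hypothetical IR camera readings
coeff = high_fidelity_predict(fuel_rate=2.0, wall_temps=wall_temps)
setpoint = control_model_setpoint(target_heat=10.0, heat_transfer_coeff=coeff)
```

The point of the split is that the expensive model runs only occasionally on the cluster, while the cheap control model runs continuously in the plant with freshly calibrated parameters.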
The team is doing the model development on a 12,000-core UCLA cluster. While the application isn't optimized to use all the cores, there are sufficient computational resources that compute times dropped from days and weeks to just hours, a range that is tractable from a process standpoint.
SMLC is also working with another company that fabricates metal parts; its plant involves heating and forging steps, heat treatment steps, followed by shaping and machining steps. By using modeling to achieve the right metallurgical properties, the company saves on machining maintenance, machine time, and machine utilities. Doing it this way also saves energy in the heat treatment process.
“This is an example of a discrete process where we’re actually able to save electricity and gas/fuel-based energy in substantial ways and at the same time improve the production and quality of the product,” said Davis.
SMLC analyzed across industries, across manufacturing structures, and across problems, and put together a set of requirements which were used to spec the Smart Manufacturing Platform. The platform is based on the services infrastructure developed by Nimbis Services, the originator of the cloud-based technical computing marketplace. SMLC added a unique workflow-as-a-service layer that allows companies to select and put together different components ranging from ‘how do I collect data?’ to ‘how do I analyze it?’ and finally ‘how do I interface it back with the plant?’ Put another way, the workflow-as-a-service layer arranges a series of pieces of code into an organized format that can be put into actionable use.
SMLC and its partners are now in the process of building out the platform against specific test beds in automotive, food, ammunition, gas, refining, chemicals, and pharmaceuticals. The prototype contains a vertical stack of all the services – the computational and storage layers, the cloud management layers, and the workflow-as-a-service layer – and it has the ability to bring those environments together. The next step is building out capabilities and robustness within each of the layers, so for the cloud management layer, for example, they will be implementing OpenStack.
“We’re seeing a set of tools that are relatively invariant across companies,” said Davis. “These tools have to do with access to computational resources, the ability to spin up and down instances of computation. The platform is basically architecting out those invariant elements and leaving a layer called the smart manufacturing marketplace, a layer in which companies can come in and select the different components they need that are specific to their own uses and missions.”