Amid a landscape of shortened timelines and more complex products, companies are increasingly using simulation to digitally verify that a product meets requirements and performs as designed. Yet, engineers and analysts often find themselves working in silos, making contributions that are not in sync. Design engineers use computer-aided design (CAD) tools to design, while analysts rely on simulation solutions to verify and validate those designs. Because these tools aren’t connected to one another, challenges can arise across the entire product development lifecycle.
Product Design Using Modeling and Simulation
Design engineers transform their ideas into 3D designs using CAD solutions. The resulting model is a digital mockup of the real product. Engineers can use the model to visualize the product’s components. They can also gain an understanding of the product’s movement and performance via tools built into modern CAD solutions. CAD enables engineers to explore multiple design options at this phase of development, helping them identify the most suitable design to pursue.
Analysts use simulation solutions to digitally verify whether the design meets the product requirements. Since requirements can span multiple physics, many different analysts may use a host of different tools for verification. Typically, this group of analysts will work to verify all product requirements.
Design engineers take the input from simulation results to modify the design. This can fuel an iterative design process, where engineers can use those findings to explore more options and gain new insights into how design changes may impact performance.
This post will discuss the fundamental challenges of this disconnected design-and-simulation loop.
Traditional Approach Followed by Analysts
Traditionally, organizations follow a standard process when a design is ready to be digitally tested and validated. First, simulation analysts will seek out the latest design model from the design engineer. The analysts will then simplify that model for their own simulation purposes. They don’t need all the geometry information that’s detailed in that model to perform their simulation. Instead, they abstract that more elaborate design into a more basic simulation model that they can use to verify requirements.
Next, the analysts will apply real-world operating conditions, including loads and boundary conditions, to the newer, simplified model.
Then comes meshing. This step divides the simplified geometry into discrete elements, producing a discretized model that a solver can operate on. Many analysts rely on automated tools to complete this step.
Finally, using specific solvers, analysts will solve the model numerically to produce simulation results. They then interpret those results using visualization tools.
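The four steps above can be sketched as a simple pipeline. The following Python is a hypothetical, heavily simplified stand-in: the class names, functions, and numbers are illustrative assumptions, not the API of any real CAD or simulation tool.

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for CAD and simulation artifacts; real tools
# expose far richer representations than these toy classes.

@dataclass
class DesignModel:
    name: str
    feature_count: int  # detailed CAD features (fillets, holes, etc.)

@dataclass
class SimulationModel:
    source: str
    feature_count: int  # the abstracted geometry retains fewer features
    loads: list = field(default_factory=list)
    boundary_conditions: list = field(default_factory=list)
    mesh_elements: int = 0

def simplify(design: DesignModel) -> SimulationModel:
    """Step 1: abstract the detailed CAD geometry into a basic simulation model."""
    return SimulationModel(source=design.name,
                           feature_count=design.feature_count // 10)

def apply_conditions(model: SimulationModel, loads, bcs) -> SimulationModel:
    """Step 2: attach real-world operating conditions to the simplified model."""
    model.loads.extend(loads)
    model.boundary_conditions.extend(bcs)
    return model

def mesh(model: SimulationModel, target_elements: int = 50_000) -> SimulationModel:
    """Step 3: discretize the geometry so a solver can compute a numerical solution."""
    model.mesh_elements = target_elements
    return model

def solve(model: SimulationModel) -> dict:
    """Step 4: run the solver and return results for visualization (stubbed here)."""
    return {"model": model.source, "max_stress_mpa": 182.0}

# The traditional pipeline, end to end, for one design candidate:
design = DesignModel(name="bracket_rev3", feature_count=240)
sim = mesh(apply_conditions(simplify(design),
                            loads=["500 N vertical"],
                            bcs=["fixed base"]))
results = solve(sim)
```

The point of the sketch is structural: every stage depends on the output of the one before it, so any change to the upstream design model invalidates the entire chain.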
Iteration is the Nature of Design
Of course, designs rarely hit every mark on the first try. Simulation will uncover flaws that must be addressed. At that point, engineers will explore alternate designs to fix those issues, and each alternative must again be verified through simulation to confirm that requirements are met and performance holds. Because design engineers and analysts use different, unintegrated tools, the analyst must repeat the entire simulation sequence listed above for each new design. That consumes significant extra time and resources, all because the stakeholder groups use disconnected toolsets.
This cycle will be repeated many times over if there are multiple physics simulations for a single design. Given today’s market for smart, connected products, there’s a high likelihood that requirements spanning multiple physics will have to be verified. In a typical example, running a simulation on a complex product will require various analysts who run their own simulations for structural, fluid flow, thermal, and electromagnetic requirements. All analysts will have their own simulation setups consisting of simplifications, abstractions, loads, boundary conditions, mesh, and solvers. That’s a lot of work to verify and validate each design iteration.
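The multiplication of effort described above is easy to quantify. This is a hypothetical back-of-the-envelope illustration; the discipline list and step names are assumptions drawn from this article, not data from any real program.

```python
# Every design iteration repeats the full simulation setup for every
# physics discipline, so total effort multiplies rather than adds.

disciplines = ["structural", "fluid flow", "thermal", "electromagnetic"]
setup_steps = ["simplify", "apply loads/BCs", "mesh", "solve", "interpret"]

def verification_tasks(design_iterations: int) -> int:
    """Total manual setup steps across all disciplines and iterations."""
    return design_iterations * len(disciplines) * len(setup_steps)

# Five design iterations across four physics disciplines, five steps each:
total = verification_tasks(5)
print(total)  # → 100
```

With only five iterations, a four-physics product already implies a hundred manual setup steps, which is why disconnected toolsets scale so poorly.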
Managing Simulation Artifacts
The traditional approach also lacks a single source of truth for the design model and simulation model. Design and simulation artifacts are likely housed across a multitude of desktops, shared drives, and various data management systems. It’s hard just to know when to update a simulation model, and far more difficult to automate the propagation of any design changes.
And since these efforts are siloed between design and simulation, the team needs to manage a plethora of files produced by the different simulations. Each simulation has a corresponding design model, simplified simulation model, mesh, solver options, and simulation results. That’s a lot of files to keep up with! And the number multiplies rapidly as design teams explore new design alternatives. Tracking and managing them becomes more and more difficult, even before you add in simulation automation workflows, which can execute hundreds or even thousands of analyses.
Design engineers contend with daunting challenges when creating today’s complex products. CAD models and simulation analyses can help teams arrive at an optimal design. But when engineers and analysts use these tools separately, in an unintegrated fashion, their work becomes more difficult. They experience longer timelines, increased costs, and inaccurate or incomplete data.
Iteration is the nature of design. It follows that design teams that use connected tools can more easily iterate, feed simulation data back into the design process, and manage all the resulting simulation artifacts.