Chad Jackson

Simulation Reconciliation, the Series

It took me a while, but I reached a conclusion. More simulation integration is needed. It’s needed in a bad way.

You see, there are a ton of good simulation tools being put to use out there. Over the course of the last year, I’ve been digging and digging into them. Some are being used to democratize simulation. Some are being used at the bleeding edge of new domains, like IoT. Each, in its own way, is an attempt to predict product behavior as a means to ‘get it right the first time’, far ahead of building prototypes, launching mass production and delivering final products. And while that’s a good thing, there are emerging issues. Issues that are pretty problematic. But here I am, getting ahead of myself.

To make sure we’re all on the same page, let’s scope this out.

Getting Our Arms Around It

Broadly speaking, simulation is used in various engineering silos as part of product development. Here’s the rundown.

  • Mechanical Hardware Performance: Of all the simulation types on this list, this is likely the most familiar. Here you’ll find different methods (Finite Element Analysis, the Boundary Element Method, Computational Fluid Dynamics, etc.) that look at mechanical structures, statics and dynamics, thermodynamics, fluid dynamics and far more. Take note, however, that this increasingly includes test performance. There are lots of great tools that capture granular results from actual physical tests. Here, simulation is bleeding over into test. And that’s good.
  • Electrical Hardware Performance: The development of Printed Circuit Boards (PCBs), custom processors and Systems on Chips (SoCs) is riddled with challenges. As manufacturers continue their efforts to make electronics smaller and run on less power, they must validate that their designs work as intended. Simulations are used here to predict electromagnetic interference, signal loss and a host of other issues.
  • On-Product Software Performance: A major challenge here is ensuring that the software you are developing will run on the off-the-shelf or custom electronics you will be using. The whole Model Based Development process is tailored to progressively verify that fact at various steps in the process. Note that these models are split into a controls side (the software) and the plant side (physical simulation of the product).
  • Systems Performance: Findings from one of our earlier studies show just how important first-time success on system tests is to development. The good news is that system modeling and simulation, frequently as part of Model Based System Engineering, seems to be on the rise. These 0D and 1D simulations, which include equation-based representations of hardware and software, allow system engineers to get an idea of performance very early while developing the product’s architecture.
  • IoT Performance: These types of simulation don’t necessarily fit our traditional concept of performance. A failure here won’t break the product in terms of broken hardware or a software fault. This type of verification, however, is crucial. These simulations allow engineers to virtually mock up the sensors on the product that are connected to off-product databases, analytics and software. By varying those sensors with emulation capabilities, the manufacturer can test whether they are collecting the right data from the product and how their off-product software should react.
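To make the controls/plant split and the idea of a 0D, equation-based model a bit more concrete, here’s a minimal sketch in Python. Everything in it is hypothetical and invented for illustration: the plant side is a lumped first-order thermal model of a heated enclosure, and the controls side is a simple bang-bang thermostat standing in for the on-product software. None of the names or parameter values come from a real product.

```python
# Hypothetical sketch of a Model Based Development-style split:
# a "plant" model (physics of the product) and a "controller"
# (the on-product software), co-simulated in a simple loop.
# All parameter values are illustrative only.

def plant_step(temp, heater_on, dt=0.1):
    """Plant side: lumped first-order thermal model.
    dT/dt = (q - K * (T - T_AMB)) / C
    """
    T_AMB = 20.0    # ambient temperature, deg C
    C = 50.0        # thermal capacitance, J per deg C
    K = 1.5         # heat-loss coefficient, W per deg C
    Q_HEATER = 120.0  # heater power when on, W
    q = Q_HEATER if heater_on else 0.0
    dT = (q - K * (temp - T_AMB)) / C
    return temp + dT * dt

def controller(temp, setpoint=65.0, band=1.0):
    """Controls side: bang-bang thermostat, heater on below the band."""
    return temp < setpoint - band

def simulate(t_end=600.0, dt=0.1):
    """Run the closed loop from ambient and return the final temperature."""
    temp = 20.0
    for _ in range(int(t_end / dt)):
        heater_on = controller(temp)
        temp = plant_step(temp, heater_on, dt)
    return temp

final_temp = simulate()
```

Even a toy loop like this shows the dependency the post is about: the controller team needs some plant model to test against, and if the real hardware drifts away from that model, the simulation quietly stops meaning much.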

The Problem

So what’s the issue here? Ultimately, the problem is the outcome of a sequence of development realities.

  1. Individual teams are responsible for completing their scope of work on short schedules. Everyone is focused on what is going on in their silo. Everyone is under tremendous pressure to get it done on time.
  2. Different teams need representations of each other’s work. Model Based Development requires some representation of the plant side of things. Systems models need some idealization of hardware and software. IoT emulations need some representation of much of the product. Few teams can afford to truly work in a silo anymore.
  3. Each team creates its own representation of the others’ work. Why? Because no team can afford to develop representations for every other team. Left to their own devices, each team will create whatever is needed to make their simulation work.
  4. Most designs are works in progress. That means most of these items are changing, sometimes drastically, over the course of the design cycle. That’s the nature of design. On top of this, notifying other teams of changes is an onerous task.
  5. One team’s representation of another team’s work quickly becomes outdated. An outdated representation is an inaccurate one, and that inaccuracy undermines the effectiveness of every simulation built on it.


Make no mistake. These various simulations are valuable to each team in the silo. However, many of these simulations are tied to some representation of another team. If it is out-of-date, then the effectiveness of those simulations is drastically undermined.


As a next step, I plan on publishing a series of posts on the interconnections between these types of simulations. So stay tuned.

Have thoughts on this topic? I’d love to hear from you in the comments below. Sound off!
