Increased Complexity, Increased Costs?

What’s the story with simulation and verification for circuit boards? This is a really interesting topic because boards have gotten more complicated in recent years. We’re trying to fit more components onto them and get more compute power out of them. They’re generating more heat, and the nets are more elaborate.

Here’s the issue with increased board complexity. You can design the board, then go into prototyping and testing to see if it actually works as intended. But if it doesn’t, re-spins are costly in both money and time. Each re-spin eats into your development budget, the delay can last two or three weeks, and then there are added expenses once the new prototype is back in-house for more testing.

The Digital Solution

So the idea with simulation and verification during board design is that you gain insight into performance early. If there are any issues, you find out about them digitally rather than on a physical prototype. You can make adjustments and fix the problem in the design instead of having to go through a re-spin.

On one hand, this helps you avoid errors that are very costly in development. On the other, you can use that feedback from the checks and analyses to make better design decisions. That gives you room to experiment: you don’t have to go with the first feasible board configuration. You can move components around and see whether the change improves performance or not.
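To make that kind of digital experimentation concrete, here’s a minimal sketch in Python of scoring a few candidate component placements against simple estimates (critical-net length and how closely the hottest parts crowd each other) before committing to a layout. The component names, candidate positions, and scoring weights are all illustrative assumptions, not the output or API of any real EDA tool.

```python
from dataclasses import dataclass
from itertools import permutations
from math import hypot

# Hypothetical, simplified model: each component is a point with a rough
# power-dissipation estimate; "critical nets" are pairs of components whose
# connection length we want to keep short.
@dataclass(frozen=True)
class Component:
    name: str
    power_w: float  # rough dissipation estimate in watts

COMPONENTS = [
    Component("SoC", 6.0),
    Component("PMIC", 2.5),
    Component("DDR", 1.5),
    Component("WiFi", 0.8),
]

CRITICAL_NETS = [("SoC", "DDR"), ("SoC", "PMIC")]   # keep these connections short
CANDIDATE_SITES = [(10, 10), (10, 40), (40, 10), (40, 40)]  # positions in mm

def score(placement: dict[str, tuple[float, float]]) -> float:
    """Lower is better: penalize long critical nets and hot parts placed close together."""
    net_length = sum(
        hypot(placement[a][0] - placement[b][0], placement[a][1] - placement[b][1])
        for a, b in CRITICAL_NETS
    )
    # Crude thermal-proximity penalty: hot components crowding each other.
    thermal = 0.0
    for i, c1 in enumerate(COMPONENTS):
        for c2 in COMPONENTS[i + 1:]:
            d = hypot(placement[c1.name][0] - placement[c2.name][0],
                      placement[c1.name][1] - placement[c2.name][1])
            thermal += (c1.power_w * c2.power_w) / max(d, 1.0)
    return net_length + thermal

# Try every assignment of components to candidate sites and keep the best one.
best = min(
    (dict(zip((c.name for c in COMPONENTS), sites))
     for sites in permutations(CANDIDATE_SITES, len(COMPONENTS))),
    key=score,
)
print("Best placement found:", best, "score:", round(score(best), 2))
```

The point isn’t the specific scoring function; it’s that once performance can be estimated digitally, trying a different configuration costs seconds instead of a board re-spin.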

Those are the two main ways to use board simulation and verification. As for the types of analyses we’re talking about here, there’s power integrity, signal integrity, and all sorts of design rule checks. We’re also talking about thermal analysis to check for overheating. All of those fall under the umbrella of simulation and verification for circuit boards.
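To give a flavor of what one of those checks actually computes, here’s a small, self-contained Python sketch of a first-order power-integrity estimate: the IR drop along a copper trace feeding a component, using the resistivity of copper and the trace geometry. The trace dimensions, current draw, and allowable-drop budget are assumptions for illustration, not values from any specific design or tool.

```python
# First-order IR-drop estimate for a power trace (illustrative values only).
RHO_COPPER = 1.68e-8  # ohm·m, resistivity of copper at roughly 20 °C

def trace_resistance(length_mm: float, width_mm: float, thickness_um: float) -> float:
    """Resistance of a rectangular copper trace: R = rho * L / (w * t)."""
    length_m = length_mm / 1e3
    width_m = width_mm / 1e3
    thickness_m = thickness_um / 1e6
    return RHO_COPPER * length_m / (width_m * thickness_m)

# Assumed example: 80 mm long, 0.5 mm wide, 1 oz copper (~35 µm), carrying 2 A.
r = trace_resistance(length_mm=80, width_mm=0.5, thickness_um=35)
current_a = 2.0
drop_v = current_a * r

MAX_DROP_V = 0.05  # assumed budget: 50 mV of allowable drop on this rail
print(f"Trace resistance: {r*1000:.1f} mΩ, IR drop at {current_a} A: {drop_v*1000:.1f} mV")
print("PASS" if drop_v <= MAX_DROP_V else "FAIL: widen the trace or add a plane")
```

With these example numbers the check fails, which is exactly the kind of result you’d rather see on screen during layout than discover as a sagging supply rail on a finished prototype.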