Is PDM a requisite part of PLM?
That’s the question we took up at a recent roundtable discussion at this year’s COFES. You see, in the past year, a number of PLM-ish products have been launched that don’t include PDM capabilities. You have Autodesk’s PLM360, which can optionally be integrated with their Vault product (here’s my review). You also have Kenesto, an ad-hoc process routing system that lets you attach files but doesn’t do day-to-day PDM (here’s my take on their product as well). Then you have Nuage, which is far more focused on the social technology side of PLM than PDM (and here’s my review of this one too). And of course, you have Arena Solutions and Aras, which rely on 3rd party offerings for PDM.
So what’s the argument for and against integrated PDM and PLM offerings? Let’s take a look.
The Arguments Back Then
Back in December of 2010, I published a post titled PDM-less PLM: Pragmatic or Problematic? In it, I laid out some early arguments on both sides. I went back, took a look at them and had to admit that not a whole lot had changed. To start, here are the arguments against the separation of PDM and PLM.
Traditional thinking is that PLM and PDM should be integrated. Why? The idea is that while executing a process with a PLM system, stakeholders will need to reference and review product data that resides in the PDM system. An integration between these two systems enables the PLM system to always point at the right version of the product data. As a result, decisions in the process are always made against the correct product data. If done manually, decisions may be made against the incorrect version of product data, leading to erroneous decisions. This same core concept extends to reporting and analytics that may be based on product data. It also extends to generating a traceable audit trail for liability or legal reasons.
And here are the arguments for the separation of PDM and PLM.
The fly in the ointment with the above argument is how PLM systems are deployed today. Very few companies go with a big bang deployment of PLM. It’s just too expensive, and it takes too long before a return on investment is achieved. Instead, many manufacturers use a phased approach composed of intermittent projects that each show an ROI on an individual basis. Where’s the problem? The deployment of PDM across multiple sites, multiple continents or an extensive supply chain requires a significant investment in terms of time and money. As a PDM deployment must often be completed prior to the start of the PLM deployment, it can be years before the PLM system addresses process issues.
If you’re not familiar with this concept of granularity in the IT ecosystem for product development, you can catch up on it in my post titled Point Solutions, Integrated Solutions and the Granularity Value Proposition.
The Arguments Now
Fast forward a year and a half, and there has been some progress or maturation of the arguments on both sides.
During the COFES discussion, one executive from a software provider took an interesting position. They said that you should really look at a PDM system simply as an extension of the CAD application. Whichever CAD application you choose, you should also use that same PDM system to manage its CAD data. They went on to say that if you have multiple CAD applications, then you should have multiple PDM systems, one for each CAD application. The implication, as a result, is that you will need something separate to manage all of those processes anyway. So why not separate it?
Another related concept was how you integrate an IT ecosystem that is fragmented due to a granular approach. Most everyone in that session agreed that a ‘light’ integration made sense. But we never did dive down to really come to a consensus around what that ‘light’ integration would look like. However, I would suggest that it needs to be integrated enough to point people to the right data spread across the fragmented IT ecosystem. Interestingly enough, solutions like Inforbix (my review is here) and alcove9 could provide such capabilities. But to be honest, they would really need to be integrated into something like PLM360, Kenesto or Nuage.
What are the other arguments against taking such a granular approach, with PDM separated from PLM? We had some representatives from software providers that offer such PDM-PLM integrated solutions. But they weren’t vocal. I don’t think that the ‘you need to access the right data during the process’ argument has faded in terms of strength. All one needs to do is point to the harm that a decision made off the wrong data can do to a product development schedule. I’m not sure, however, that it has matured beyond that.
Summary and Questions
Here’s the recap.
- A number of PLM-ish systems that do not offer PDM capabilities have emerged in the market, including Autodesk’s PLM360, Kenesto and Nuage.
- The argument against the separation of PDM from PLM is that decisions in a process might be based on the wrong data, which leads to downstream errors.
- The argument for the separation of PDM from PLM is that a more granular approach needs to be taken to the IT ecosystem supporting product development in general. All encompassing solutions take too long to deploy and are too costly.
- Folks at COFES agreed that in a granular IT ecosystem, there needs to be a ‘light’ integration between systems, although it wasn’t clear what form that should take. In my opinion, the key is getting people to the right data.
Like I said before, it’s been a year and a half since the last discussion. What’s your position on this topic? Is it dangerous, maybe too dangerous, for manufacturers to take seriously? Is it a long overdue concept? What do you think some critical aspects of a ‘light’ integration are? Sound off. Let us know what you think.
Take care. Talk soon. And thanks for reading.