Wow. It hasn’t even been a year.

You see, there’s a lede for this post. Back in December, I published a guest post with Desktop Engineering titled Iterating Toward Iron Man’s Design Process. It was a fanciful little post imagining how design could look a whole lot more like what we saw in the 2008 Iron Man movie. In it, we saw Tony Stark manipulating a virtual design with his fingers while talking to Jarvis, his AI assistant. When I wrote it, I thought we were kinda far away, but that it was worth considering. Today? Well, I’m starting to think there is real opportunity here. And whoever figures it out will break the industry wide open. But before we get to that, let’s talk tech.

The Tech

In as little as ten months, the landscape of enabling technology for this concept has changed dramatically. Let’s look at each piece in turn.


If you look at the Iron Man movie, Tony Stark didn’t sit down in front of a flat screen to design. No. He had designs projected right in front of him. How’s that translate into today? Augmented Reality.

I gotta say, this technology has taken some big steps forward in the last year. The latest version of PTC’s Vuforia has some astounding capabilities. It is starting to recognize shapes. It is recognizing flat planes. Essentially, it is starting to recognize its environment. Now, it is very early. But these are the kinds of steps that are crucial. In the future, engineers might not be sitting at a desk designing, but standing in the machine shop with an augmented reality overlay.

Still unsure about viability here? Well, recently some big players have entered this space. Google now offers ARCore, Apple now has ARKit, and Microsoft has its Windows Mixed Reality platform. These are all technology platforms upon which apps can be built. This is no joke.
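To make that concrete: a big part of what these platforms do under the hood is fit geometric primitives to sensor data, like finding a tabletop in a depth scan. Here’s a toy sketch, in plain NumPy, of detecting a flat plane in a noisy point cloud. The setup and numbers are invented for illustration; real AR SDKs do this continuously, fused with camera tracking.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).

    points: (N, 3) array of 3D samples, e.g. from a depth sensor.
    """
    centroid = points.mean(axis=0)
    # SVD of the centered points; the right singular vector with the
    # smallest singular value is the direction of least variance,
    # i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal

# Toy "scan" of a tabletop at height z = 1.0 with a little sensor noise.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = np.full((200, 1), 1.0) + rng.normal(0, 0.005, size=(200, 1))
points = np.hstack([xy, z])

centroid, normal = fit_plane(points)
print(abs(normal[2]))  # close to 1.0: the fitted plane faces straight up
```

That’s the whole trick, repeated thousands of times a second against live sensor data, and it’s why “recognizing flat planes” was the first environmental feature these SDKs shipped.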


So how would tomorrow’s engineer interact with this kind of system?

From a touch and feel perspective, the answer is haptics. This interface lets users use their hands to interact with things in a virtual space. To draw the parallel again, that’s exactly what we saw Tony Stark do in the first Iron Man movies. The Geomagic suite from 3D Systems has some capabilities along these lines. CyberGlove has something similar as well. This isn’t some imaginary technology. It is just in its early stages of adoption now. The biggest and best example of this is in your pocket: your smart device. Using your finger to control that device is now the primary interface.

A touch and feel interface, however, is unlikely to be the primary one. Voice control could be a major component as well. We’ve already seen it with Alexa and Siri. In fact, more developers are shifting from building apps to building skills for voice assistants. If you think about it, a voice interface is just plain simpler and easier for the user. Going forward, you’re going to see a lot more tools with voice control. It makes obvious sense in this context.
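As a toy illustration of the idea, here’s roughly what the very first layer of a design assistant might do: pull an operation and a parameter out of a transcribed utterance. The command grammar here is completely made up; a real assistant would sit on a proper natural-language platform rather than a couple of regexes.

```python
import re

# Hypothetical grammar: a few CAD verbs and a millimeter parameter that
# a design assistant might extract from transcribed speech. The command
# names and patterns are invented for this sketch.
COMMANDS = {
    "fillet": re.compile(r"fillet .*?(\d+(?:\.\d+)?)\s*mm"),
    "shell":  re.compile(r"shell .*?(\d+(?:\.\d+)?)\s*mm"),
}

def parse_command(utterance):
    """Return (operation, value_mm), or (None, None) if unrecognized."""
    text = utterance.lower()
    for op, pattern in COMMANDS.items():
        match = pattern.search(text)
        if match:
            return op, float(match.group(1))
    return None, None

print(parse_command("Jarvis, fillet the top edges to 3 mm"))
# ('fillet', 3.0)
```

The hard parts, of course, are everything after this step: resolving “the top edges” against the actual model and executing the operation in the CAD kernel.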


So is an engineer going to be building up 3D models piece-by-piece in this kind of system? Maybe. But it is more likely that an engineer would be giving commands to their digital assistant to complete designs. So what technology could complete that picture?

Generative Design.

Obviously, this technology has gotten a lot of buzz in the past year or two. Autodesk was one of the first to really introduce the idea on a broad scale. Dassault Systèmes and Siemens PLM both now have capabilities there. But today’s technology is primarily based on topology optimization. I expect we’ll be seeing some new methods applied in this space soon. Picture it: you give a command to your voice-controlled assistant and watch in AR as Generative Design produces alternative designs.
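For a rough feel of what’s underneath: topology optimization treats the design space as a grid of material “densities” and keeps the elements that carry load while discarding the rest. This is a deliberately crude sketch with a fabricated utilization field, not any vendor’s actual algorithm, which iterates against real finite-element results.

```python
import numpy as np

def toy_topology_optimize(utilization, keep_fraction):
    """Crude stand-in for topology optimization: keep only the
    most-utilized fraction of a density grid.

    utilization: 2D array of per-element load scores (fabricated here;
    a real optimizer would compute these from FEA each iteration).
    Returns a 0/1 density mask.
    """
    threshold = np.quantile(utilization, 1.0 - keep_fraction)
    return (utilization >= threshold).astype(int)

# Fabricated utilization field: load concentrated along a diagonal path,
# as if a force travels corner-to-corner through the part.
n = 8
i, j = np.indices((n, n))
utilization = np.exp(-((i - j) ** 2) / 4.0)

density = toy_topology_optimize(utilization, keep_fraction=0.3)
print(density.sum(), "of", n * n, "elements kept")
```

The surviving elements trace the load path, which is why generatively designed parts end up with those organic, bone-like shapes.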

Starting Points

Now, this idea of an engineer primarily relying on a voice-controlled assistant leveraging Generative Design assumes you already have one thing: an existing 3D model. This is where 3D scanning will likely come into play. Reverse engineering has been around for a long time. Yet, in this context, it takes on new meaning. It becomes the starting point for automation. Creaform and 3D Systems both have pretty good technologies in this space. They could, and should, be key components of this kind of system.

Now, what is interesting here is the ability to work with the Mesh Geometry that results from 3D scanning. As you may, or may not, know, scanning an object doesn’t get you the smooth boundary representation geometry that you get from Parametric or Direct Modeling. In fact, Mesh Geometry can’t really be manipulated by those technologies. Luckily enough, Facet Modeling, the technology that can modify Mesh Geometry, has taken some big steps forward recently.
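To see why Mesh Geometry is a different animal: a scan hands you nothing but vertices and triangles, so even a basic question like “which way does this surface face?” gets answered facet by facet rather than from a smooth parametric surface. A minimal sketch with an invented two-triangle mesh:

```python
import numpy as np

# A scanned part arrives as facets: a vertex list plus triangle indices,
# with no parametric features to edit. This toy mesh is a unit square
# split into two triangles.
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 0.0]])
faces = np.array([[0, 1, 2],
                  [0, 2, 3]])

def face_normals_and_areas(vertices, faces):
    """Per-triangle unit normals and areas via the cross product."""
    a, b, c = (vertices[faces[:, k]] for k in range(3))
    cross = np.cross(b - a, c - a)
    areas = 0.5 * np.linalg.norm(cross, axis=1)
    normals = cross / np.linalg.norm(cross, axis=1, keepdims=True)
    return normals, areas

normals, areas = face_normals_and_areas(vertices, faces)
print(areas.sum())  # 1.0: total surface area of the unit square
```

Facet Modeling tools operate at exactly this level, pushing and stitching triangles directly, which is why they can edit scan data that a feature-based modeler can’t touch.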

Who’s gonna do it?

That’s the key question, right? Let’s look at the components.

  • CAD Applications: Having access to these capabilities is table stakes. Companies that have them now include PTC, Autodesk, Siemens PLM, Dassault Systèmes, 3D Systems, ANSYS (SpaceClaim), Onshape and many more. But, to be honest, licensing one of these kernels is pretty easy. A startup could do this easily enough.
  • Augmented Reality: PTC obviously has technology and expertise here, although they seem to be pretty focused on the IoT and Service spaces right now. Someone could easily enough leverage Google’s (ARCore), Apple’s (ARKit) or Microsoft’s (Windows Mixed Reality) tech here.
  • Haptics: As mentioned earlier, 3D Systems has some technology here as well as some other smaller players.
  • Voice-Controlled Assistant: Google, Apple, Amazon and others are developing these kinds of capabilities as platforms that could be leveraged by others. The question is: how open are these platforms? I can’t answer that myself.
  • Generative Design: Topology Optimization is there. It is functioning today. You see it in Autodesk’s, Dassault Systèmes’ and Siemens PLM’s offerings right now.
  • 3D Scanning: Creaform and 3D Systems have good capabilities here. However, I haven’t seen them go mainstream yet.
  • Facet Modeling: Creaform has tools here. However, Siemens PLM has made big strides recently in this area.

So are we going to see a big vendor jump in here? Mmmmm. Perhaps. However, I don’t see many of them chomping at the bit to invest a lot in mechanical design. Those that are investing are doing so in an evolutionary way. So where does that leave us?


When SpaceClaim launched, they fundamentally changed the market. Their positioning, which focused on enabling the engineer to explore more designs, triggered big changes. Because their positioning was underpinned with Direct Modeling, and they successfully got some big companies to purchase their product, the larger folks went out and acquired similar technologies.

Ultimately, I think the opportunity is there for a company to build something like Jarvis. And if they do? Watch out. It could seriously disrupt the industry.

Closing Thoughts

While we wait on this system to come to fruition, let’s be productive. What do we call this thing?

AR-Assisted Design?

The Engineering Agent?


Well, maybe coming up with another acronym isn’t that productive anyway.