What’s Your IoT Strategy?

Connectivity. Smart software. Machine learning.

Today, the landscape of the Internet of Things is awash with new and exciting technology that can empower miraculous product capabilities. Collecting and analyzing the right data can yield insights that might transform a company. Building the right intelligence and automation can transform an industry. The visions are grandiose. Yet, most companies today struggle with a short and simple question.

How do we get there?

Most people know the basic technical steps. You have to collect the data. You have to analyze the data. You have to act on the data. All of that is easy enough to understand… at a high level. Getting down into the details, however, is not so easy. In fact, most companies don’t follow a straight line from here to there. There is a lot of meandering, quite a few mistakes, and bucketloads of learning.

Collecting the data is often one of the simplest yet most technical aspects of an IoT initiative. You can instrument a product with tons of sensors. You can stream those readings to the cloud. Likewise, taking action can also be pretty easy. Once you have some correlation between sensor readings and some event, catastrophic or marvelous, most know what needs to come next: avoid it or repeat it. But the data analysis bit? That can be terribly difficult. That’s where you’re trying to find the needle in the haystack. So how do you go about that?
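To make the "collect and stream" step concrete, here is a minimal Python sketch of the pattern: sample a few channels, buffer readings locally, and serialize a batch for upload. The channel names and batch size are illustrative assumptions, and the actual upload transport (MQTT, HTTP, a vendor SDK) is deliberately left out.

```python
import json
import random
import time

def read_sensors():
    """Simulate one sampling pass across a few instrumented channels.
    Real hardware drivers would replace this; the channel names here
    are made up for illustration."""
    return {
        "timestamp": time.time(),
        "temperature_c": random.gauss(40.0, 2.0),
        "vibration_g": abs(random.gauss(0.1, 0.05)),
        "current_a": random.gauss(5.0, 0.3),
    }

def collect_batch(n_samples):
    """Buffer readings locally before shipping, so a brief network
    hiccup doesn't drop data."""
    return [read_sensors() for _ in range(n_samples)]

def to_payload(batch):
    """Serialize a batch for whatever cloud ingest endpoint the
    project uses -- the transport itself is not shown."""
    return json.dumps(batch)

batch = collect_batch(10)
payload = to_payload(batch)
print(f"buffered {len(batch)} readings, payload {len(payload)} bytes")
```

The point of the sketch is how little is involved on the collection side: read, buffer, serialize, ship. The hard part comes later.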

In my time working with manufacturers, I’ve seen successful companies take one of two approaches to analyzing data. The two approaches share a common trait: controlled scope and focus. As you may or may not know, getting overwhelmed with big data can be a project killer. So let’s look at each in turn to see how these strategies keep things under control.

Boiling the Ocean… in R&D

The first successful approach I’ve seen employed is ambitious in some ways, yet focused in others. Here, companies instrument a product or prototype extensively. However, they don’t do this in a production environment, at least not initially. They put this instrumented product through its paces as they would a prototype in testing. They expose it to a wide variety of cases and collect everything.

Once they have some critical mass of data, they don’t try to analyze the data themselves. Frankly, there’s just way too much of it. They turn machine learning software loose on it. They might feed it their initial hypotheses about what they think are the potential correlations. But by and large, they are looking for the software to bring them the key findings.
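What "turning software loose on the data" means, at its simplest, is scoring every captured channel against the event of interest and letting the rankings surface candidates the team hadn’t hypothesized. The sketch below uses plain Pearson correlation as a stand-in for whatever a real machine-learning package computes; the channel names and synthetic data are assumptions for illustration.

```python
import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation -- a simple stand-in for the
    scoring a machine-learning tool would actually perform."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_channels(channels, event):
    """Score every captured channel against the event and return
    the strongest candidates first."""
    scores = {name: abs(pearson(vals, event)) for name, vals in channels.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Synthetic test-bench data: one channel secretly drives the event.
random.seed(0)
n = 200
driver = [random.random() for _ in range(n)]
channels = {
    "bearing_temp": driver,                               # the real signal
    "ambient_temp": [random.random() for _ in range(n)],  # noise
    "line_voltage": [random.random() for _ in range(n)],  # noise
}
event = [1.0 if d > 0.8 else 0.0 for d in driver]
print(rank_channels(channels, event))
```

Feeding in initial hypotheses just means checking whether the channels the team expected actually land near the top of that ranking.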

Companies in this mode are looking to learn. They are in the midst of discovery, and they know it. They kick off this kind of effort, but keep it under wraps in an R&D department. They aren’t looking to disrupt current development projects. Findings from this kind of project will be applied in future projects that likely have not even been started yet. However, many different findings from this kind of effort could have wide-ranging impacts across the company. There is a lot of potential here.

So, in some ways the scope is big. There is broad instrumentation. Data from a lot of different sensors are captured. They are looking to learn. But in other ways, this effort is contained. It is only in R&D, often a small group. The impact of findings is limited to future projects.

Proving or Disproving a Hypothesis

A very different approach comes in the form of an organization that is looking to achieve something very specific. They have a particular event, something with some kind of key business impact associated with it, that they want to be able to predict. Furthermore, there is a technical team that has a few ideas on what sensor measurements could be correlated to that event. In general, these organizations aren’t looking to learn in a broad sense. They are looking to prove or disprove a hypothesis.

In this case, there is very limited and specific instrumentation. They are only looking to capture the data that will let them verify their idea or not. Because of that, there is a limited amount of data. As a result, the data analysis is simpler. There’s no need for machine learning here. Dashboards can be set up manually, relatively easily, to monitor the right streams.
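When the hypothesis is specific, the dashboard rule often reduces to a comparison. As a sketch, suppose the team’s hypothesis is that failures follow vibration spikes above some level; the threshold and readings below are invented for illustration.

```python
def check_hypothesis(readings, threshold):
    """Flag the indices where a reading exceeds the hypothesized
    threshold -- the kind of rule a manually built dashboard
    monitors. No machine learning required."""
    return [i for i, r in enumerate(readings) if r > threshold]

# Illustrative vibration stream (g) and an assumed 0.25 g threshold.
vibration = [0.08, 0.09, 0.31, 0.10, 0.45]
alerts = check_hypothesis(vibration, threshold=0.25)
print(alerts)  # indices 2 and 4 exceed the threshold
```

If the flagged readings line up with the business-impacting event, the hypothesis is supported; if they don’t, it has been cheaply disproven, which is still a useful answer.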

Findings from these kinds of efforts can be applied relatively quickly, depending on the complexity of the development project. Furthermore, they can provide value to the company quickly as well. Yet, their value is highly dependent on that single hypothesis.

Overall, the scope here is small. There is limited instrumentation. The analysis is very focused. The turnaround on value here is quick, depending on the success of the hypothesis. Either way, the scope is contained.

Takeaways

Now, there is no reason that a company couldn’t pursue both of these kinds of projects. The first might even feed into the second. In fact, a company might run multiple projects of each type. But in both cases, the scope is defined and controlled in a way that the company can make it actionable.

IoT efforts can be terribly complex. But breaking them down into piecemeal projects makes them far simpler.

Chad Jackson is an Industry Analyst at Lifecycle Insights and publisher of the engineering-matters blog. With more than 15 years of industry experience, Chad covers career, managerial and technology topics in engineering. For more details, visit his profile.