Insights & Data Blog

Opinions expressed on this blog reflect the writer’s views and not the position of the Capgemini Group

The 9 challenges of an industrial IoT implementation (Part 2)

In Part 1 of this blog, we addressed the critical question of where to start, as well as aspects relating to people and organization, which are so critical in any major transformation and so often overlooked.

In this second part, we examine the functional challenges of an IoT implementation and provide some actionable recommendations: (Big) Data challenges, the challenge of turning data into insights, and how to manage the scope of the project between agility and integration.

 

1. Data (by Avinash Vaidya, Manufacturing & Auto solutions, North America)

There are about 30M seconds per year. If you have 400 machines and capture 4 data points per machine every second (for instance temperature, humidity, vibration and pressure), you will have captured nearly 100 billion data points after 2 years: a volume so large that it requires careful thinking and strategizing to manage. In my view, one of the key factors determining business success for any IoT implementation will be data strategy, as it will be a key part of any IoT-related business case. Here I want to provide recommendations to optimize data strategy and associated costs, but also offer a transparent view of the maturity of this discipline and its uncertainties, and my thoughts on how those can be mitigated.
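The arithmetic is easy to check with a quick back-of-the-envelope script (the 16 bytes per reading used for the storage estimate is an illustrative assumption, not a measured figure):

```python
# Back-of-the-envelope estimate of raw IoT data volume.
SECONDS_PER_YEAR = 30_000_000  # ~365 * 24 * 3600, rounded down as in the text
machines = 400
points_per_second = 4          # e.g. temperature, humidity, vibration, pressure

points_per_year = machines * points_per_second * SECONDS_PER_YEAR
print(f"{points_per_year:,} data points per year")   # 48,000,000,000
print(f"{2 * points_per_year:,} after two years")    # 96,000,000,000

# At an assumed 16 bytes per reading (timestamp + value + identifiers):
bytes_total = 2 * points_per_year * 16
print(f"~{bytes_total / 1e12:.1f} TB of raw storage over two years")
```

Even before compression or indexing overhead, the raw stream runs to terabytes, which is why the considerations below matter.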

 

Here are some useful considerations for data strategy:

·        Data points: A general recommendation for keeping the amount of data under control is to remain focused on the business goal to be achieved and only gather data supporting that goal. That said, there is limited precedent for understanding which data points are typically related to which outcomes, so it will take gathering some history of all types of data and running predictive models against it to get an initial understanding of which data points are useful.

·        Along the same lines, it is still unclear what the right frequency is for capturing and storing the information. The temptation to go real-time or near real-time is generally very high, but not always justified. As a very high-level rule of thumb: if it is not safety-critical, real-time or near real-time can generally be avoided. In most business environments, sensor-based “things” can send out continuous streams of data and operating parameters, but viewed from the perspective of the final actionable insight, much of it is more “noise” than “value”. Making the right choice here reduces the complexity of the solution architecture, the build and, most importantly, the analytics that enable the end goals.

·        It may be possible to process some of the data at the edge, that is, without sending it to and storing it in the central database, in the same way that reflexes allow humans to take quick action without neural information passing through the brain.

·        New cognitive technologies open up new possibilities, for instance video monitoring of machines, which enables monitoring of aspects that sensors can’t capture.

·        Data doesn’t need to be stored forever. Depending on the business situation being addressed, retention can become extremely important or can simply be handled under the “data archival” point. Where data must be preserved on-device, this aspect becomes highly critical: it may dictate selecting devices with different capabilities, or an architecture that supports real-time or near real-time extraction and storage mirroring what the device captures, but with a higher capacity than the device itself. As mentioned earlier, it will also shape the archival strategy for implementations.

·        Storing only “abnormal” values will greatly limit the amount of data stored. This requires defining what the range of normal is.
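To make the last two ideas concrete (processing at the edge, and keeping only abnormal values), here is a minimal sketch of a gateway-side "deadband" filter. The function name, the tuple shape and the 20-80 °C band are assumptions for the example, not part of any particular IoT platform:

```python
def deadband_filter(readings, low, high):
    """Keep only readings that fall outside the configured 'normal' band.

    readings: iterable of (timestamp, value) pairs
    low/high: the value range considered normal for this sensor
    (Illustrative sketch; a real edge gateway would also batch,
    compress, and tag readings with device metadata.)
    """
    return [(ts, v) for ts, v in readings if not (low <= v <= high)]

# Example: a temperature sensor whose normal band is 20-80 degrees C
samples = [(0, 72.0), (1, 74.5), (2, 91.2), (3, 75.0), (4, 18.4)]
abnormal = deadband_filter(samples, low=20.0, high=80.0)
print(abnormal)  # [(2, 91.2), (4, 18.4)]
```

Only two of the five readings would travel to the central database; the rest are discarded at the edge, which is exactly the "reflex" behavior described above.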

 

That said, even with all possible planning and hypotheses, there will still be a lot of uncertainty: IoT solutions are still maturing, and there is admittedly much unknown about which data will prove useful, and with it uncertainty about the appropriate infrastructure. Cloud solutions offer a great way to make the infrastructure scale with the amount of data stored. As a global system integrator, we are talking to the largest Cloud vendors to understand pricing models that support IoT implementations. They too are working through the uncertainties, based on lessons learned with their first customers. What is clear at this point is that vendors are working with the clear goal of creating competitive offerings to support IoT through the Cloud.

 

2. Insight (by Laurent Perea, Industry 4.0 Consulting Services, France)

 

This is one of the most critical parts of your project: running the right analytics to predict and prevent failures, anticipate spare-part needs, launch relevant alerts, self-correct machine parameters, and so on. The main challenge (and success factor) will be to navigate the marketplace and find the right vendor for your company. There is plenty of choice, including large players such as SAP, IBM and Microsoft alongside pure plays like Predix, Pivotal, IP Leanware and Dataiku.

In making your selection, it’s important to bring together a set of tools that limits the need for data scientists, who are a scarce and expensive resource. Some data scientists will be needed, but the aim should be to have them set up analytics models in such a way that they can run continuously, generating outputs that your production engineers can use.
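As a deliberately simple, hypothetical stand-in for such a packaged model, consider a rolling z-score anomaly scorer: a data scientist tunes two parameters once (window size and threshold), and the model then runs continuously on the sensor stream, flagging readings for production engineers. Real vendor tools are far more sophisticated; this only illustrates the "set up once, run continuously" pattern:

```python
from collections import deque
from statistics import mean, pstdev

class RollingAnomalyScorer:
    """Flags readings that deviate sharply from a rolling baseline.

    Hypothetical sketch: `window` and `threshold` are the knobs a
    data scientist would tune once; thereafter the model scores the
    live stream without further intervention.
    """
    def __init__(self, window=60, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def score(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        if len(self.history) >= 2:
            mu, sigma = mean(self.history), pstdev(self.history)
            anomalous = sigma > 0 and abs(value - mu) > self.threshold * sigma
        else:
            anomalous = False  # not enough history to judge yet
        self.history.append(value)
        return anomalous

# A steady stream of vibration readings, then a sudden spike
scorer = RollingAnomalyScorer(window=10, threshold=3.0)
stream = [50.0, 50.2, 49.9, 50.1, 50.0, 50.3, 49.8, 95.0]
flags = [scorer.score(v) for v in stream]
print(flags)  # only the spike at 95.0 is flagged
```

The point is the division of labor: the scarce expert sets the model up; the model itself, not the expert, sits in the continuous loop.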

 

3. Scope (by Mike Dennis, Natural Resources and Chemicals, North America)

 

Balancing speed to benefit against the broader company IT architecture remains one of the biggest challenges of successful IoT implementations. Too much focus on speed, and the company ends up with islands of automation. Too much structure, and innovation suffers. The leaders adopting these capabilities set clear guidelines for innovation: focus on business benefits, alignment on architecture, teams driven to specific outcomes, and willingness to cannibalize the existing business.

 

This is where the agile approach is most useful. First, set the context for the initiative: How do I adjust my production to meet rapidly changing demand? How do I keep my people safe and connected in real time? How do I manage a complex supply chain to meet my changing business? Then establish small, nimble teams to drive outcomes in their areas in weeks, not years.

 

With the strategic and technical frameworks set, the never-ending battles over outcomes and platform are avoided. Business results are imperative; technology tools less so.

 

Teams focus on developing new business opportunities or resolving operational issues, prioritized by business value and delivered using agile methodologies. New capabilities and their associated results can then be realized at the speed of the business, without being hindered by the overhead of large-scale technology implementations.

 

 

In part 3 of this blog, we will cover technical challenges of IoT implementations relating to IoT standards and security.

About the author

Anne Aussems
