The manufacturing industry is among the most complex industries in terms of the variety and depth of its products. Industry classification bodies such as the North American Industry Classification System (NAICS) and the International Standard Industrial Classification (ISIC) divide it into two major segments based on production processes: discrete manufacturing and process manufacturing. These are further segmented by product offering, such as automotive, hi-tech, aerospace, chemicals, pharmaceuticals, and metals. Although this overview is not exhaustive, it conveys the level of complexity the term “manufacturing” implies, let alone the challenges.
The collection and comprehensive evaluation of data from different sources – production equipment and systems as well as enterprise- and customer-management systems – are becoming standard practice to support real-time decision making. Big Data, with its four “V” components – volume, velocity, variety, and veracity – is increasingly popular, along with its counterpart, analytics.
Let’s scrutinize the top four challenges, and the potential solutions the cyber-physical world offers, against the backdrop of Industry 4.0, which has been revolutionizing the manufacturing world since its introduction at the Hannover Messe industrial fair in 2011.
- Optimize production and enhance efficiency
Process manufacturers have accumulated large volumes of data in historians over decades. But this data is mostly underutilized, because cumbersome access makes it slow to extract actionable insights.
Machine logs contain data on asset performance. Internet of Things (IoT) also adds a new dimension with connected assets and sensors. This data is potentially of great value to manufacturers. Data analytics can help plant personnel to quickly capture, cleanse, and analyze machine data and reveal insights that can help them improve performance.
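To make the capture-cleanse-analyze loop concrete, here is a minimal sketch in Python. The sensor readings, the plausible-value range, and the cycle-time target are all hypothetical; a real pipeline would pull these from historians or IoT platforms rather than an in-memory list.

```python
"""Minimal sketch: cleanse machine-log readings and surface a performance
insight. All values and thresholds here are hypothetical."""
from statistics import mean

# Hypothetical raw machine log: cycle times in seconds, with missing readings
# (None) and an obvious sensor glitch (180.0).
raw_cycle_times = [42.1, 41.8, None, 43.0, 42.5, 180.0, 41.9, 42.3, None, 44.2]

# Cleanse: drop missing readings, then discard values outside an
# engineering-plausible range (assumed 30-60 s for this machine).
readings = [r for r in raw_cycle_times if r is not None]
cleaned = [r for r in readings if 30.0 <= r <= 60.0]

# Analyze: compare average cycle time against the design target.
TARGET_S = 42.0  # hypothetical target cycle time
avg = mean(cleaned)
drift_pct = (avg - TARGET_S) / TARGET_S * 100
print(f"{len(raw_cycle_times) - len(cleaned)} readings discarded")
print(f"average cycle time {avg:.1f}s ({drift_pct:+.1f}% vs target)")
```

Even this trivial cleanse-then-aggregate step is often the difference between data sitting in a historian and an insight a plant engineer can act on.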
McKinsey’s use case on a biopharmaceutical manufacturer shows how big data analytics identified specific process targets; modifying those processes improved vaccine yield and saves $5 to $10 million annually.
- Predict machine failure and reduce downtime
In the asset-intensive manufacturing industry, equipment breakdowns and scheduled maintenance are a regular feature. According to Forbes, big data analytics can reduce breakdowns by as much as 26 percent and unscheduled downtime by as much as 23 percent. In automotive manufacturing, robotic arms on assembly lines perform tasks such as welding parts, gluing, and cabling. According to Nielsen research, downtime costs the auto industry approximately $22,000 per minute.
Capgemini offers a use case of predictive maintenance for a German automotive manufacturer. Frequent changes to robot-driven welding programs were causing recurrent failures in chassis welding, and glue leakage in the filling and applicator heads of greasing robots and dosers was another major concern. A big data and analytics solution, with a predictive model that raised alerts roughly one to two days in advance, saved about 500 minutes per week of operational downtime across more than 600 robots on the assembly lines.
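The early-warning logic behind such a model can be sketched very simply: fit a trend to a degradation signal and alert when the projected failure point falls inside the one-to-two-day horizon. This is a hypothetical illustration of the general idea, not Capgemini’s actual model; the wear metric, failure threshold, and readings are all invented.

```python
"""Minimal predictive-maintenance sketch: least-squares trend on a wear
metric, extrapolated to a failure threshold. All values are hypothetical."""

FAILURE_THRESHOLD = 100.0  # wear level at which the applicator head fails
ALERT_HORIZON_H = 48.0     # alert 1-2 days (48 h) before projected failure

def hours_to_failure(samples):
    """samples: list of (hour, wear_metric) pairs. Fit a least-squares line
    and return hours until it crosses FAILURE_THRESHOLD."""
    n = len(samples)
    sx = sum(t for t, _ in samples)
    sy = sum(w for _, w in samples)
    sxx = sum(t * t for t, _ in samples)
    sxy = sum(t * w for t, w in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    if slope <= 0:
        return float("inf")  # not degrading; no failure projected
    return (FAILURE_THRESHOLD - intercept) / slope - samples[-1][0]

# Hypothetical hourly wear readings from one welding-robot applicator head.
history = [(h, 60.0 + 0.5 * h) for h in range(0, 48, 4)]
eta = hours_to_failure(history)
if eta < ALERT_HORIZON_H:
    print(f"ALERT: projected failure in {eta:.0f} h - schedule maintenance")
```

Production systems replace the linear fit with far richer models, but the payoff mechanism is the same: turning a maintenance surprise into a scheduled, low-cost intervention.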
- Optimize the supply chain
The massive spike in supply chain data has become a huge challenge for enterprises. This data is compiled from many sources: ERP systems within the enterprise, suppliers’ business systems, order and shipment information, weblogs capturing customer shopping patterns, logistics and GPS data, RFID readers, mobile devices, and social channels.
Forward-looking organizations are leveraging big data to get a 360-degree view of the customer in order to better predict customer needs, understand personal preferences, and create a unique brand experience. From supply network planning through procurement to end-to-end execution, every function of the supply chain is reaping the benefits of big data analytics.
- Enhance product quality and cut manufacturing cost
Intel has been harnessing big data in its processor manufacturing for some time. The chipmaker must test every chip that comes off its production line, which normally means running each chip through 19,000 tests. Using big data for predictive analytics, Intel was able to significantly cut test time by reducing the number of tests required for quality assurance and focusing on the most informative ones. The result was a saving of a whopping $3 million in manufacturing costs for a single line of Intel Core processors.
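The core idea behind this kind of test reduction can be sketched as follows: mine historical pass/fail records for tests whose outcome is almost always implied by another test, then predict rather than run them. This is a hypothetical illustration of the general technique, not Intel’s actual method; the test names, records, and agreement cutoff are invented.

```python
"""Minimal sketch of data-driven test reduction: flag tests whose historical
result agrees with another test often enough to be predicted instead of run.
All data and the agreement cutoff are hypothetical."""

# Historical results: one {test_name: passed} dict per chip (hypothetical).
history = [
    {"t1": True,  "t2": True,  "t3": True},
    {"t1": True,  "t2": True,  "t3": False},
    {"t1": False, "t2": False, "t3": True},
    {"t1": True,  "t2": True,  "t3": True},
]

def redundant_tests(records, agreement=0.99):
    """Return tests whose result agreed with some other (still-run) test on
    at least `agreement` of historical chips; those can be predicted."""
    tests = list(records[0])
    skip = set()
    for cand in tests:
        for predictor in tests:
            if predictor == cand or predictor in skip:
                continue
            agree = sum(r[cand] == r[predictor] for r in records) / len(records)
            if agree >= agreement:
                skip.add(cand)
                break
    return skip

# t1 and t2 always agree in this toy history, so one of the pair is flagged
# as skippable; t3 is kept because it agrees with neither.
print(redundant_tests(history))
```

At the scale of 19,000 tests per chip, pruning even a modest fraction of redundant tests translates directly into shorter test time and lower cost per unit.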
Presented here are only four challenges that can be investigated as improvement opportunities. With big data analytics no longer a “nice to have,” companies must identify the right opportunities to improve plant efficiency and generate insights. Big data analytics then provides the competitive edge companies need to succeed in an increasingly complex environment.