Bullwhip effect applied to a data supply chain


Take a look at how the bullwhip effect translates into the data supply chain built for your organization.

The bullwhip effect is a phenomenon observed in supply chains in which the orders sent to the manufacturer and the supplier show a larger variance than the sales to the end customer. Translated to data: the “data manufacturer” is the source system, the supplier is the data warehouse (DWH), and the sales are the DWH queries run by end users. Irregular demand in the lower part of the supply chain becomes more pronounced higher up the chain. This variance disrupts the smoothness of the data delivery process, as each link in the chain over- or underestimates the data demand, resulting in exaggerated fluctuations.
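To make the amplification concrete, here is a minimal Python sketch of the classic setup: each tier orders what it observes plus an adjustment for the shift in its moving-average forecast of lead-time demand. The tier names, lead time, forecast window, and demand numbers are illustrative assumptions, not data from a real project.

```python
# Minimal bullwhip sketch: order variance grows at each tier even though
# end-customer demand is nearly flat. All parameters are assumptions.
import random
import statistics

random.seed(1)

# End-customer demand: stable around 100 with small noise (think: DWH queries)
demand = [100 + random.randint(-5, 5) for _ in range(200)]

def tier_orders(incoming, lead_time=2, window=4):
    """Orders placed upstream: observed demand plus an adjustment for the
    shift in the moving-average forecast of lead-time demand."""
    orders = list(incoming[:window])  # warm-up: pass demand straight through
    for t in range(window, len(incoming)):
        adjustment = (lead_time / window) * (incoming[t] - incoming[t - window])
        orders.append(incoming[t] + adjustment)
    return orders

dwh = tier_orders(demand)      # orders the DWH places on the source systems
sources = tier_orders(dwh)     # orders the source systems pass to data entry

for name, series in (("end-customer demand", demand),
                     ("DWH orders", dwh),
                     ("source-system orders", sources)):
    print(f"{name:22s} variance = {statistics.variance(series):7.1f}")
```

Each tier amplifies the variance it receives, so by the time the signal reaches data entry in the source systems, the fluctuations are several times larger than what end users actually asked for.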

How does this translate into the data supply chain that you built for your organization?

Imagine that you are the head of an analytics delivery team in Europe, and one of your globally operating customers wants to see additional geolocation information about their suppliers in their analytics portal. The portal functionality and data are provided by your company as an additional service. The idea of adding geolocation sounds great and would add value, so you investigate the scope of this new demand.

Let’s assume the geolocation information is currently not available in any of the data warehouses (DWH) used in your IT landscape. As you dig deeper, you find that only some of the country-specific source systems in the SCM solution provide this information. This is the first scope increase: additional supply-chain data will need to be entered by different countries.

Next is the question of historical information. How far back in time does the data need to reach? Is current data enough, or is one, two, or ten years of history needed? You decide that, as this is a new feature, holding two years of information should suffice for now (second scope increase – historical data).

Word spreads, and other managers also want to build the geolocation information into their analytical reports in the portal, because it is useful both to all customers and internally. About 100 analytics would be affected by this change – a third, major scope increase.

While the new data entry requirement is implemented in the SCM solutions around the world, the architect or development lead working on importing the current geolocation data also reviews the existing data sets. In addition, they receive the request to implement the geolocation data in the 100 reports and start an analysis of all datasets to determine what is needed to link the location information into the existing reports.

They realize that while geolocation by itself is useful, so is additional geographical information, such as supply-chain routes. With the project budget growing anyway, it is decided that this is a good time to add that information, along with other, much smaller datasets that had been requested before but had never carried enough weight to be implemented – two further scope increases.

As your team reaches the testing stage of the first analytic, one of the testers finds an issue with incomplete post-code data from suppliers: some cleansing and intelligent data completion will be necessary on the existing data in the systems. This increases the scope with licenses for a data cleansing tool and a post-code address package to fill the data gaps.
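As an illustration of what such data completion could look like, here is a minimal pandas sketch that fills missing supplier post codes from a reference extract. The table columns and values are hypothetical, and a commercial cleansing tool would add fuzzy matching and validation on top of a simple lookup like this.

```python
# Minimal sketch (assumed data model): fill missing supplier post codes
# from a licensed address reference, keyed on country + city + street.
import pandas as pd

suppliers = pd.DataFrame({
    "supplier_id": [1, 2, 3],
    "country": ["DE", "DE", "FR"],
    "city": ["Berlin", "Munich", "Paris"],
    "street": ["Unter den Linden", "Marienplatz", "Rue de Rivoli"],
    "post_code": ["10117", None, None],   # gaps found during testing
})

# Hypothetical extract from the post-code address package
reference = pd.DataFrame({
    "country": ["DE", "FR"],
    "city": ["Munich", "Paris"],
    "street": ["Marienplatz", "Rue de Rivoli"],
    "post_code": ["80331", "75001"],
})

filled = suppliers.merge(
    reference, on=["country", "city", "street"],
    how="left", suffixes=("", "_ref"),
)
# Keep the original value where present, otherwise take the reference value
filled["post_code"] = filled["post_code"].fillna(filled["post_code_ref"])
print(filled.drop(columns="post_code_ref"))
```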

Finally, the location information (longitude and latitude coordinates) is available, but it is not deemed usable without a visualization, i.e. a map, so an additional map visualization from Google has to be added. You get the idea by now …
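As a sketch of that visualization step, the snippet below plots supplier coordinates on a map using the open-source folium library (a Leaflet wrapper) as a stand-in for the Google map named above; the supplier names and coordinates are made up.

```python
# Minimal sketch using folium (pip install folium) instead of Google Maps;
# the suppliers and their coordinates are illustrative assumptions.
import folium

suppliers = [
    ("Supplier A", 52.52, 13.40),   # Berlin
    ("Supplier B", 48.86, 2.35),    # Paris
    ("Supplier C", 51.51, -0.13),   # London
]

m = folium.Map(location=[50.0, 5.0], zoom_start=4)
for name, lat, lon in suppliers:
    folium.Marker(location=[lat, lon], popup=name).add_to(m)

m.save("suppliers_map.html")  # open in a browser to view the map
```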

So, like the bullwhip effect in a physical supply chain, adding more “products” or scope to the data supply chain creates a serious risk of getting demand wrong. We went from adding the geolocation of a supplier to one analytic to much more:

  • Global SCM data entry enhancements
  • Enhancing 100 other analytics with geolocation information
  • Many additional datasets (to link and enrich)
  • Auto-correction of address data
  • Visualization of geolocation information

Was there real demand verification for all of this? Should it have been communicated more clearly? End users or customers will certainly not say no to additional data and features if they don’t cost more.

As a feedback loop, the website logs need to be analyzed: which analytics did end users click on, and how much was the geolocation information actually used? This retrospectively answers the question: “Has geolocation data added to the weight and cost of the data delivery without adding value?”
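A minimal sketch of such a log analysis follows, assuming click events are logged as “timestamp,user,analytic_id”; the log format, analytic names, and the usage threshold are all assumptions made for illustration.

```python
# Minimal sketch: measure how often each analytic is opened and flag
# geolocation analytics whose usage falls below an assumed threshold.
from collections import Counter

log_lines = [
    "2018-06-01T09:12,alice,supplier_geolocation_map",
    "2018-06-01T09:40,bob,spend_overview",
    "2018-06-02T11:05,carol,spend_overview",
    "2018-06-03T14:22,alice,on_time_delivery",
    "2018-06-04T08:15,bob,spend_overview",
]

clicks = Counter(line.split(",")[2] for line in log_lines)
total = sum(clicks.values())

for analytic, count in clicks.most_common():
    share = count / total
    flag = "  <- cost without value?" if "geolocation" in analytic and share < 0.25 else ""
    print(f"{analytic:26s} {count:2d} clicks ({share:6.1%}){flag}")
```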

Capgemini can help you answer these and many more such questions.

 
