Why Is Traditional BI So Wasteful?

Are you involved in IT, developing BI systems?  It’s probable that more than half of your time is wasted building things that nobody will ever use.

In this first of two papers I wonder why this is.  In the next paper I’ll go on to adopt a Biomimicry perspective on how Agile methodology & Data Lake thinking can trim that waste.

By traditional BI I mean large-scale projects using a waterfall-type approach that aim to create a traditional enterprise data warehouse.  Perhaps an easy beast to slay?  But illuminating all the same… So where does the waste come from?

  1. Unused Functionality:  That piece of information, that interface, that UI link, etc.  The more customisation there is in an IT system, the higher the chance that something a business analyst or developer thought would be a good idea 18 months ago isn’t useful at all.  Standish puts the waste at 45%, although estimates vary.[i]

Have a look at your TV remote, for example – how many buttons do you actually use?

  2. Project failure:  A project sometimes never actually gets completed.  This could be for a number of reasons, but we do know that the longer and bigger a project is, the more likely it is to fail.  Stakeholders change, business changes, legislation changes, the competitive environment changes, hidden complexity is revealed…  A 2012 McKinsey study found that:

‘On average, large IT projects run 45 percent over budget and 7 percent over time, while delivering 56 percent less value than predicted.’[ii]  ‘Compared to their objectives, 63% of projects are challenged or fail.’[iii]

This should upset you; it’s demoralising.  Had I really spent this much of my waking energy, time, and passion on something useless? – Yes!

Typically we consider a project to be a success if it’s on time and to specification.  Because value is often hard to quantify, especially in BI, we don’t always measure success as cost vs. benefit.  Under the ‘on time, in budget’ paradigm we count all unused functionality in the success bucket because it was in scope.  We should find a way to stop doing this now.  Is there any other profession with as much waste?  If teachers taught the wrong thing for half the time it would be big news.
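
To make the contrast concrete, here is a minimal Python sketch with invented figures: the project cost, planned benefit, and share of functionality actually used are all hypothetical, and the point is only that a benefit-based measure and the ‘on time, in budget’ measure tell very different stories.

def roi(benefit, cost):
    """Benefit delivered per unit of cost."""
    return benefit / cost

project_cost = 1_000_000      # hypothetical build cost
planned_benefit = 1_500_000   # benefit assumed in the business case
used_fraction = 0.5           # hypothetical share of features actually used

# "On time, in budget" counts everything delivered in scope as success...
scope_based_roi = roi(planned_benefit, project_cost)
# ...while a benefit-based view only counts what is actually used.
realised_roi = roi(planned_benefit * used_fraction, project_cost)

print(f"Scope-based ROI:   {scope_based_roi:.2f}")   # 1.50
print(f"Benefit-based ROI: {realised_roi:.2f}")      # 0.75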


Root Causes?

So, what are the root causes of failure?  I suggest the following.

  1. Needs Change:  Something that would have been useful when the system was conceived isn’t useful now.  Business and technology have moved on.  The longer a project runs, the more likely this is to happen.
  2. Requirements are wrong:  Created by the wrong people, or by the right people who don’t really know what they want yet.
  3. Failing Slowly:  If a requirement or system function is not useful, it needs to be eliminated or corrected as early as possible.  We should fail fast and cut our losses, not ‘fail slowly’.
  4. Scale:  The bigger a project, the more stakeholders, the longer the duration, and the more complex it is, the more likely it is to fail.


Traditional BI ticks the boxes – waste is inevitable…

Traditional BI Reporting
IKIWISI.  “I’ll know it when I see it” is a common theme in BI, which is best thought of as an investigative journey.  End users typically don’t know what they’re looking for, and as soon as something useful is found it creates a new question, a new need.

Traditional Data Warehousing
Designing and implementing a traditional data warehouse is typically a large undertaking, and so liable to wasted effort and risk of project failure. 

Up Front Work:  We expect data in an EDW to be a single version of the truth, often requiring a canonical data model that all stakeholders can agree on.  We then source, clean, and integrate data from disparate sources.  This is a tremendous effort before we see any benefit.
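
As a rough sketch of that up-front conforming effort, assume (hypothetically) a CRM system and a billing system that disagree on column names and formats; both have to be mapped onto the canonical model before anyone sees a report.  The systems, columns, and values below are invented, using Python and pandas.

import pandas as pd

# Two hypothetical source systems that disagree on names and formats.
crm = pd.DataFrame({"cust_id": [1, 2], "cust_name": ["Acme", "Beta Ltd"]})
billing = pd.DataFrame({"CustomerNo": [1, 2], "Revenue": ["1,000", "2,500"]})

# Map each source onto the agreed canonical column names.
crm = crm.rename(columns={"cust_id": "customer_id", "cust_name": "customer_name"})
billing = billing.rename(columns={"CustomerNo": "customer_id", "Revenue": "revenue"})

# Clean: billing stores revenue as formatted text, so convert it to numbers.
billing["revenue"] = billing["revenue"].str.replace(",", "", regex=False).astype(float)

# Integrate into one conformed view, the single version of the truth.
customers = crm.merge(billing, on="customer_id")
print(customers)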

Pre-Knowledge:  The traditional approach also assumes that we know which data we need in the EDW and how it should be organised into facts and dimensions.  It requires us to know how the data will be used and what questions the end user is going to ask before we start building.
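
For readers less familiar with the terms, here is a toy Python illustration (all tables and values invented) of facts and dimensions: a fact table of sales events joined to a product dimension.  The catch described above is that the grain, keys, and measures must all be chosen before the first real question is asked.

import pandas as pd

# A toy product dimension: descriptive attributes users slice and filter by.
dim_product = pd.DataFrame({
    "product_key": [1, 2],
    "product_name": ["Widget", "Gadget"],
    "category": ["Hardware", "Accessories"],
})

# A toy fact table: one row per sales event, holding keys and measures.
fact_sales = pd.DataFrame({
    "product_key": [1, 1, 2],
    "quantity": [3, 5, 2],
    "amount": [30.0, 50.0, 40.0],
})

# This design only pays off if we anticipated the right questions up front,
# e.g. "sales amount by category".
report = fact_sales.merge(dim_product, on="product_key").groupby("category")["amount"].sum()
print(report)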

Too big to succeed (fail?):  Because of this size and up-front complexity, the EDW is also more likely to fail before it ever goes live.


Save Us Please!

Taking a Data Lake approach to our information architecture allows us to take smaller steps with more localised control.  An agile approach shortens the time to value and means we iterate to a better solution more quickly.  In my next blog post I’ll discuss these ideas in more detail.
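
As a small taster, here is a minimal schema-on-read sketch in Python (file name and fields invented): raw data lands in the lake untouched, and structure is applied only when a concrete question turns up, rather than being agreed up front.

import json
import pandas as pd

# Land raw data as-is: no up-front canonical model, no conforming step yet.
raw_events = [
    {"customer": "Acme", "event": "order", "value": 120},
    {"customer": "Beta Ltd", "event": "refund", "value": -30},
]
with open("raw_events.json", "w") as f:
    json.dump(raw_events, f)

# Later, when a real question appears, read and shape only what is needed.
events = pd.read_json("raw_events.json")
print(events.groupby("event")["value"].sum())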
