Capping IT Off

Opinions expressed on this blog reflect the writer’s views and not the position of the Capgemini Group

Information needs Mastering

Getting data is easy; most companies in fact have more data than they can possibly handle. This data is pushed into data warehouses, and complex, expensive analytical tools then run models to help the company decide its strategy. In most companies (8 out of 10, according to some surveys) these data warehouses would be more aptly called data landfills: a huge amount of data is poured in with very little concern for quality. What does this mean in practice? A couple of stories illustrate the problem.

A major manufacturing organisation had a clear procurement strategy: drive down costs with its top 10 suppliers globally. These top 10 accounted for around 40% of company spend, so any reduction was potentially massive. For three years the procurement department concentrated on these top 10 and managed to reduce that spend by about 4%, which meant roughly a 1.5% reduction in overall spend... not bad. The company then undertook a major refresh of its BI infrastructure, including a much higher quality MDM solution that modelled the structure and hierarchies of companies rather than a flat list of individual entities. The result was the discovery that 50% of its supplier costs were in fact with a single company which hadn't even been considered part of the top 10. To put that into perspective, the entire procurement department had spent three years negotiating with ten individual suppliers whose combined size was smaller than the biggest actual supplier. The company didn't know this because that supplier operated a very federated, almost franchise, model for sales in the regions, backed by a well co-ordinated manufacturing and distribution network. For three years the procurement strategy was right from a business perspective, concentrate on the big fish, but wrong from an execution perspective purely as a result of poor master data management.
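To make the hierarchy point concrete, here is a minimal sketch of why rolling spend up through a corporate hierarchy changes the picture. The supplier names, figures and the simple parent mapping are all invented for illustration; a real MDM solution would maintain far richer, multi-level hierarchies.

```python
# Hypothetical sketch: entity-level spend vs spend rolled up to the parent company.
# All names and amounts are illustrative, not from the article.

# Flat view: spend recorded against individual legal entities (in $m).
spend_by_entity = {
    "Acme GmbH": 12.0,
    "Acme France SA": 9.0,
    "Acme UK Ltd": 11.0,
    "BigCo Inc": 15.0,
    "OtherCo": 8.0,
}

# MDM hierarchy: each legal entity mapped to its ultimate parent.
parent_of = {
    "Acme GmbH": "Acme Group",
    "Acme France SA": "Acme Group",
    "Acme UK Ltd": "Acme Group",
    "BigCo Inc": "BigCo Inc",
    "OtherCo": "OtherCo",
}

def rollup(spend, parents):
    """Aggregate entity-level spend up to the parent company."""
    totals = {}
    for entity, amount in spend.items():
        parent = parents.get(entity, entity)
        totals[parent] = totals.get(parent, 0.0) + amount
    return totals

flat_top = max(spend_by_entity, key=spend_by_entity.get)
rolled = rollup(spend_by_entity, parent_of)
rolled_top = max(rolled, key=rolled.get)

print(flat_top)    # "BigCo Inc" looks like the biggest supplier on the flat view...
print(rolled_top)  # ...but "Acme Group" is, once entities are rolled up.
```

On the flat view the biggest single entity looks like the top supplier; after the roll-up, a group of "small" entities turns out to dwarf it, which is exactly the trap the procurement team fell into.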
At a large multi-brand financial services company there was a history of extremely successful marketing campaigns that brought in new clients on a pretty consistent basis. Unfortunately the company also suffered an above-average churn rate on several key products, meaning it had to run campaigns across its separate brands more often in order to keep up its growth. Along came the MDM solution, which identified that a significant proportion of the successful campaign conversions were actually customers shifting from one of the bank's brands to another, often moving from a high-margin, high-touch product into a lower-cost, lower-margin equivalent. Cannibalisation of the bank's own customers through its own marketing thus accounted for a large proportion of the additional churn. Again, a successful business strategy of aggressive marketing campaigns was undermined in its execution as a result of poor master data management.

The key point is that analytics, data warehouses and other elements are great, but they still suffer from the age-old problem of "Garbage In, Garbage Out". While MDM doesn't guarantee that all of the information is of high quality, it does ensure that the core entities are at least correctly identified. Turning the data landfill into a usable repository requires active business processes that identify and manage those core entities. MDM is the primary tool that turns data into information for an enterprise. The good news is that MDM is easy; the bad news is that most people do it badly... as to why? That will be down to my next post.
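As a footnote on the second story: the cannibalisation only became visible once customer records could be matched across brands. The sketch below shows the idea with an intentionally naive match key; the brands, records and matching rule are all invented, and real MDM matching uses far more sophisticated fuzzy and probabilistic techniques.

```python
# Hypothetical sketch: spotting cross-brand "new customer" conversions.
# Brand systems, records and the match key are invented for illustration.

brand_a_customers = [
    {"name": "jane doe", "dob": "1980-04-12"},
]
brand_b_signups = [
    {"name": "Jane Doe", "dob": "1980-04-12"},   # same person, different brand
    {"name": "Tom Patel", "dob": "1975-09-30"},  # genuinely new customer
]

def match_key(record):
    """Naive match key: normalised name plus date of birth."""
    return (record["name"].strip().lower(), record["dob"])

existing = {match_key(c) for c in brand_a_customers}

genuinely_new = [s for s in brand_b_signups if match_key(s) not in existing]
cannibalised = [s for s in brand_b_signups if match_key(s) in existing]

print(len(genuinely_new))  # 1 — only one signup is actually a new customer
print(len(cannibalised))   # 1 — the other already banks with a sister brand
```

Without the shared key, each brand counts both signups as campaign wins; with it, half the "growth" is revealed as customers moving between the bank's own brands.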
