Capping IT Off

Why SAP MDG is different - taking a central authoring approach to MDM

Most MDM solutions out there take a collaborative approach to MDM. By this I mean that they accept that other systems hold master data, and techniques such as matching and survivorship are used to identify and manage the information coming from these multiple sources. The MDM solution then determines the golden record and publishes it back to those systems.

This is the traditional view, and it's often the easiest way to implement MDM as it doesn't require significant change to existing systems or, in particular, to people processes. However, it does carry a significant overhead: you need a central data steward team that verifies all of the information, and the matching processes can become complex, particularly if multiple divisions use the same, or similar, suppliers.
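To make the collaborative pattern concrete, here is a minimal, purely illustrative sketch in Python of matching records from several source systems and applying a simple survivorship rule to derive a golden record. The field names, systems and the "most recently updated wins" rule are assumptions for the example, not how any particular MDM product works.

```python
# Illustrative sketch only: a toy "collaborative MDM" pass that matches supplier
# records from several source systems and applies a simple survivorship rule to
# build a golden record. Field names and rules are invented for the example.
from collections import defaultdict

source_records = [
    {"system": "ERP_EU", "name": "Acme GmbH", "tax_id": "DE123", "updated": "2014-03-01"},
    {"system": "ERP_US", "name": "ACME Inc.", "tax_id": "DE123", "updated": "2014-05-12"},
    {"system": "CRM",    "name": "Acme",      "tax_id": "DE123", "updated": "2013-11-20"},
]

def match_key(record):
    """Very naive matching: records sharing a tax id are treated as the same supplier."""
    return record["tax_id"]

def survivorship(candidates):
    """One possible rule: the most recently updated record's values survive."""
    newest = max(candidates, key=lambda r: r["updated"])
    return {"name": newest["name"], "tax_id": newest["tax_id"]}

# Group the records from all systems, then derive one golden record per group.
groups = defaultdict(list)
for rec in source_records:
    groups[match_key(rec)].append(rec)

golden_records = {key: survivorship(recs) for key, recs in groups.items()}
print(golden_records)  # the golden record would then be published back to each system
```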

SAP have looked at this problem from another perspective: the people processes. What they've decided is that in certain areas, particularly around products, materials, BOMs and so on, the lifecycle of the master information is linked to, but separate from, the standard transactional processes. For this reason it's better to move towards a centralised authoring model, for instance backed by a corporate shared service, where requests are centrally authored and then pushed out to the various other systems.

The heart of this concept is that you move the people processes out of the source systems and into a central authoring function. So with the SAP MDG approach you are first making a major business decision:

All master information for this domain (e.g. product) will be centrally authored and controlled; no information will be created elsewhere for this master domain.
That is a big switch for many organisations, but one that can come with significant cost savings and benefits. Firstly, you can move to a centralised shared service through which all information is passed, which means you drive consistency across your divisions via that central control. This new approach to MDM for slower-moving master elements might sound in some ways like a step back to old-school mega-hub approaches, but it's critically different: those hubs still aggregated information rather than taking a people-centric approach and focusing on the authoring and publishing of information.
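By contrast, here is an equally rough sketch of the central-authoring pattern: a single governed request is validated once and, only if approved, pushed to every subscribed system. The function names, checks and system list are invented for illustration; they are not SAP MDG APIs.

```python
# Illustrative sketch only: the central-authoring pattern in miniature. A request
# is raised, checked centrally, and only then replicated to every receiving
# system; no downstream system creates the record on its own.

SUBSCRIBED_SYSTEMS = ["ECC_EU", "ECC_US", "CRM"]  # assumed receiving systems

def validate(request):
    """Central governance checks before anything is published (simplified)."""
    return bool(request.get("material_number")) and bool(request.get("description"))

def publish(record, systems):
    """Push the approved record to every receiving system (e.g. via ALE for ECC)."""
    for system in systems:
        print(f"Replicating {record['material_number']} to {system}")

request = {"material_number": "MAT-1001", "description": "Hex bolt M8"}

if validate(request):
    # Only the central service ever creates the master record.
    publish(request, SUBSCRIBED_SYSTEMS)
else:
    print("Request returned to requester for correction")
```

The point of the pattern is that downstream systems only ever receive records; they never originate them.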

So is SAP's approach right for you? Well, it really comes down to whether you can make that switch. If you've genuinely got a single SAP instance that contains all of the information today, or you can make the switch to central authoring for your company, then the answer could well be 'yes'. The first decision to make when looking at SAP, however, is not a technical question; it's a people and business one:

Can we move to central authoring?

6 Comments
Hi Jitesh, You can use SAP BODS to cleanse the existing data. Once cleansed and validated, the data can be uploaded into MDG. Regards, Robbie
How well can MDG accommodate authoring of master data for multiple ECC instances, which may have variations in data structures and reference data?
Hi Darin, That's a good question. While implementing MDG, one critical question needs to be addressed: which deployment option to go for. Option 1: deploy MDG as a standalone master data hub. Option 2: deploy MDG on top of one operational ECC instance. In either case, create/update/block processes would need to be shut down in the receiving ECC instances, because create/update/block would happen in only one place, and that is MDG. DRF would take care of distributing data to the receiving systems. Regards, Pravin
Hi Darin, Once MDG has been delivered, all data should be authored in MDG. The Data Replication Framework (DRF) enables replication from MDG to the repository system(s) across the distribution channels, via established replication mechanisms such as ALE for SAP ECC. Regards, Robbie
I understand that MDG on a data instance would be the best place to author global master data. However, is that the best place to author local data (such as plant level materials) or would that be better done in the local ECC instance?
Centralization is what SAP MDG tries to accomplish. But we mustn't forget that to get to centralization (where the CoE takes charge of the data) there are a few critical prerequisites as well. What about the existing data volume, the dirt that has accumulated over the years? (I assume the amount of dirt is the reason for a change in the process now, eh?) That's where we need "Consolidation" first, an exercise any organisation should go through before imagining centralization. SAP MDG as a solution is completely incapable of handling this, and you would require ETL tools (like SAP BODS, INFA ETL etc.) to get the job done. I am sure SAP is smart enough to realize this and is coming up with an end-to-end solution in the ABAP stack to bridge this gap. SAP MDGEE, anybody heard anything?
