Capping IT Off

Lack of trustworthy information at knowledge workers’ fingertips?

As I speak with CXOs of large organizations, it becomes clear that most don’t trust their data, and in the absence of reliable facts at their fingertips they rely on gut instinct to make critical decisions. I believe a common cause is that maintaining data quality is perceived to be too great a challenge, and responsibility for that challenge is often unclear. This perception leads to a general lack of confidence in the organization’s information assets. This should not be news. Business Intelligence (BI) has been at the top of the CIO’s agenda for many years. Organizations large and small have invested heavily in BI technology, and yet we still hear that the business is not getting the results it expects. Poor data quality is often the cause of this gap between investment and results.

In a recent Capgemini survey of senior leaders at global companies across multiple industries (Capgemini Intelligent Enterprise Survey 2008), executives said they could improve business performance by 27% if only they were better able to exploit information. These leaders rightly perceive that access to better, more reliable information could enable knowledge workers to respond proactively to changing environments, make sharper decisions, and create sustainable value. Companies are thirsty for knowledge but are simply drowning in data.

So how should this challenge be addressed? I believe that better decisions start with better data quality for “master data” – the key business entities upon which transactions are based (e.g., Customers, Vendors, and Products). While it is natural for IT organizations to turn to master data management (MDM) solutions as a fix for the data quality problem, organizations should first consider a business-oriented data quality management (DQM) approach, built around solutions such as data quality scorecards and firewalls. I’m going to explain why.

Measure the quality of your existing data with a data quality scorecard

When managing master data, data stewards need to be able to quantify the organization’s level of data quality against a set of business rules and determine whether that level meets the organization’s needs. Data quality technology can capture key data quality metrics and present them to data stewards, allowing them to analyze and drill down into the data in a way that shows how data inadequacies translate into business impact. This information is then typically incorporated into data quality scorecards built using, for example, SAP BusinessObjects Data Quality Management and SAP BusinessObjects BI software.

The purpose of data quality scorecards is twofold. First, they are the starting point for collaboration between the business and IT groups as data quality issues are identified and a data remediation strategy is discussed. Second, because data quality is an iterative process, the scorecards provide a snapshot of how well the organization is currently identifying and resolving issues. The scorecards will continue to change as data improves and as additional data feeds are introduced into the environment (which can depress the scores for a period of time).
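
To make the scorecard idea concrete, here is a minimal sketch in Python of the underlying pattern: each business rule is evaluated against every master data record, pass rates are rolled up per rule, and the failing records are kept for drill-down. The rules, field names, and sample customers are hypothetical illustrations, not the functionality or API of SAP BusinessObjects Data Quality Management.

```python
# Sketch of a data quality scorecard: evaluate each business rule against every
# record, roll up the pass rate per rule, and keep failing records for drill-down.
# Rules and field names are illustrative assumptions, not a vendor API.
import re

RULES = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "email_well_formed": lambda r: re.fullmatch(
        r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email") or "") is not None,
    "country_is_iso2": lambda r: (r.get("country") or "") in {"US", "DE", "FR", "GB", "NL"},
}

def scorecard(records):
    """Return the pass rate per rule plus the failing records for drill-down."""
    results = {name: {"passed": 0, "failures": []} for name in RULES}
    for record in records:
        for name, rule in RULES.items():
            if rule(record):
                results[name]["passed"] += 1
            else:
                results[name]["failures"].append(record)
    total = len(records) or 1  # avoid division by zero on an empty feed
    return {name: {"pass_rate": res["passed"] / total, "failures": res["failures"]}
            for name, res in results.items()}

customers = [
    {"customer_id": "C001", "email": "anna@example.com", "country": "DE"},
    {"customer_id": "", "email": "bob@example", "country": "Germany"},
]
for rule, result in scorecard(customers).items():
    print(f"{rule}: {result['pass_rate']:.0%} pass rate, {len(result['failures'])} failing record(s)")
```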

Keep out the bad data with a data quality firewall

Once you can measure data quality with a scorecard, the next step is to manage data quality at the point of entry. A real-time data quality component (such as the one provided with SAP BusinessObjects Data Quality Management) can check each record as it is entered by the end user. This real-time integration acts as a data quality firewall, returning immediate feedback about the quality of the data entering an application. When duplicate records are found, or errors are identified in the format or content of the data, the solution flags the deficiency to the end user, providing an opportunity to correct the data before it is saved. The same data quality business rules deployed in real time can also be applied in a batch process, ensuring that legacy data is held to the same standard as the new (or changed) data arriving in real time from websites, call centers, CRM applications, and other channels.
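
As a rough illustration of the firewall pattern, the sketch below applies one shared set of checks both at the point of entry (blocking the save until the user corrects the record) and in a batch sweep over legacy data. The check_record function, the duplicate lookup by email, and the field names are assumptions made for the example, not the interface of any data quality product.

```python
# Sketch of a data quality "firewall": the same business rules run once per
# record at the point of entry (real time) and across existing data (batch),
# so legacy records and new entries are held to one standard.
# All names below are illustrative assumptions, not a vendor API.

def check_record(record, known_emails):
    """Apply the shared business rules; an empty list means the record is clean."""
    issues = []
    if not record.get("customer_id"):
        issues.append("customer_id is missing")
    email = (record.get("email") or "").lower()
    if email and email in known_emails:
        issues.append("possible duplicate: email already on file")
    if len(record.get("postal_code") or "") not in (4, 5):
        issues.append("postal_code has an unexpected length")
    return issues

def firewall_on_entry(record, known_emails):
    """Real-time path: block the save and surface issues for the user to fix."""
    issues = check_record(record, known_emails)
    if issues:
        return {"saved": False, "issues": issues}  # UI prompts the user to correct
    known_emails.add((record.get("email") or "").lower())
    return {"saved": True, "issues": []}

def batch_sweep(records):
    """Batch path: run the same rules over legacy data and report offenders."""
    seen, report = set(), []
    for record in records:
        report.append((record.get("customer_id") or "<no id>", check_record(record, seen)))
        email = (record.get("email") or "").lower()
        if email:
            seen.add(email)
    return report
```

Reusing the same check_record in both paths is the point of the design: the rules that protect new entries are exactly the rules that measure and clean the data already in the environment.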

The Holy Grail of Data Quality

In this post I have briefly touched upon the need for business and IT collaboration throughout the data quality process. In my next post I will discuss this in greater detail, because collaboration between business and IT is the “Holy Grail of Data Quality”: it provides the foundation for building a data governance or MDM initiative across the organization. Rely on quality performance information to make the right business decisions, because better business intelligence starts with better data quality.