MDM Data Quality Assurance & Control (P&G)
The Need
P&G’s Master Data Management (MDM) data and metadata were not in sync across the operational reporting systems used by global and regional users, who were managing and modifying data in specific regional instances. In addition, data leakage was a major concern, which limited how data could be propagated to other systems in the enterprise.
Because of the complexity of P&G’s data platforms (48 SAP instances in a 4-tier landscape with multiple downstream application servers), identifying and reconciling data problems took significant time and effort. As a result, many business units were using their own processes to fix problems, which drove the need for data governance backed by a data quality platform and toolset.
The Solution
P&G’s data governance team leveraged DataTrust (formerly RDt) data quality software to streamline quality assurance and control of its master data, spanning more than 32 unique SAP instances and billions of records. Before the implementation, analysts downloaded all of the data offline each week, combined multiple sources, and manually reconciled inconsistencies and variances. After an initial assessment of data quality assurance and control (DQA/DQC), a streamlined plan was developed to retire the existing third-party tool.
Impact
A comprehensive data quality platform enables trusted data: it empowers organizations to monitor, clean, and validate data to ensure accuracy, consistency, and timely availability. With effective, all-in-one data quality assurance tools, businesses can reduce errors and streamline workflows.
The true impact of this use case was that P&G could use DataTrust to set policies and procedures that allowed them to:
- Ensure consistency across the entire enterprise
- Unify data governance
- Limit data leakage
- Minimize business unit duplication and risk
With the added benefit of auditable data performance, the payoff for data stewards at P&G is high.
The RightData Edge
DataTrust delivers all of the powerful, automated data quality and observability capabilities you need at a massive scale. This code-free tool is easy to use while maintaining flexibility to perform your required reconciliation and data checks. DataTrust ensures your data products are clean, accurate, trustworthy, and ready for use.
DataTrust automatically detects anomalies and generates business rules to help you find issues faster and with greater accuracy. It maintains integrity while validating and reconciling data, whether for one-time migrations or ongoing data operations. This data quality assurance tool helps resolve inconsistencies and ensures trust in your data.
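How DataTrust detects anomalies internally is not detailed here, but the general technique can be illustrated simply: flag metric values that fall outside a rolling statistical band. The sketch below, in plain Python with an invented daily row-count series, is an assumption about the technique in general, not DataTrust's actual algorithm:

```python
import statistics

def detect_anomalies(history, window=7, threshold=3.0):
    """Flag values that deviate from the trailing window's mean by
    more than `threshold` standard deviations (a simple z-score check)."""
    anomalies = []
    for i in range(window, len(history)):
        trailing = history[i - window:i]
        mean = statistics.mean(trailing)
        stdev = statistics.stdev(trailing)
        if stdev and abs(history[i] - mean) / stdev > threshold:
            anomalies.append((i, history[i]))
    return anomalies

# Hypothetical daily row counts for one table; the sudden drop on the
# last day is the kind of deviation an automated check would surface.
daily_row_counts = [10020, 10034, 10041, 10029, 10050, 10038, 10044,
                    10047, 9100]
print(detect_anomalies(daily_row_counts))  # [(8, 9100)]
```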
DataTrust provides a comprehensive approach to the data quality assurance and control cycle, built on five elements:
- Define: Create a framework that suits your unique requirements and objectives.
- Build: Develop scenarios aligned with your framework.
- Operate: Schedule each element of your quality control plans (QCPs) to accommodate your needs.
- Monitor: Feed useful, timely data to dashboards for management teams.
- Evaluate: Analyze apparent trends and make adjustments that streamline workflows.
Data stewards can use these processes to unify the software with policies and governance across the enterprise. For example, after the Build phase, you can schedule the QCP elements that feed the management dashboards.
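DataTrust configures this cycle code-free, so the structure below is only a conceptual sketch of how Define, Build, Operate, and Monitor might map onto a QCP object. Every name, field, and the sample query is a hypothetical illustration, not DataTrust's configuration model:

```python
from dataclasses import dataclass, field

@dataclass
class Check:
    name: str   # Build: a scenario aligned with the framework
    query: str  # the measurement to run against the target system
    rule: str   # pass/fail condition evaluated on the query result

@dataclass
class QualityControlPlan:
    name: str       # Define: scope of this framework
    schedule: str   # Operate: cron-style cadence
    dashboard: str  # Monitor: where results are reported
    checks: list = field(default_factory=list)

# A hypothetical plan: nightly reconciliation of SAP material masters.
qcp = QualityControlPlan(
    name="SAP material master consistency",
    schedule="0 6 * * *",  # run daily at 06:00
    dashboard="mdm-health",
    checks=[
        Check(
            name="orphan material records",
            query="SELECT COUNT(*) FROM materials m "
                  "LEFT JOIN plants p ON m.plant_id = p.id "
                  "WHERE p.id IS NULL",
            rule="value == 0",  # any orphan record is an exception
        )
    ],
)
```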
Learn More About DataTrust
DataTrust is a comprehensive data quality and observability suite that enables you to measure, monitor, validate, and reconcile your data's health and integrity. It is the only tool you will need to ensure your data is reliable, consistent, and complete when making critical business decisions. When you need to run efficient operations where data can be trusted, choose DataTrust.
Schedule a demo to see how DataTrust can make your data more accurate, trustworthy, and accessible. You can also contact us online to discuss your questions with an expert.
DataTrust Data Quality: A no-code suite that improves the quality, reliability, consistency, and completeness of your data. Data quality is a complex journey, and metrics and reporting validate the work through powerful features such as:
Database Analyzer: Using Query Builder and Data Profiling, stakeholders analyze data before using the corresponding datasets in validation and reconciliation scenarios.
Data Reconciliation: Compares row counts between source and target dataset pairs and flags tables where the counts do not match (a conceptual sketch follows this list).
Data Validation: A rules-based engine provides an easy interface for creating validation scenarios that define rules against target datasets and capture exceptions.
Connectors for All Types of Data Sources: More than 150 connectors for databases, applications, events, flat-file data sources, cloud platforms, SAP sources, REST APIs, and social media platforms.
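Conceptually, the row-count reconciliation described above reduces to counting rows on both sides of each source/target pair and reporting mismatches. The sketch below uses Python's built-in sqlite3 module with invented table data purely to make the idea concrete; DataTrust performs the equivalent checks without code:

```python
import sqlite3

def reconcile_row_counts(source_conn, target_conn, table_pairs):
    """Compare row counts for each (source_table, target_table) pair
    and return the pairs whose counts do not match."""
    mismatches = []
    for src_table, tgt_table in table_pairs:
        src = source_conn.execute(
            f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
        tgt = target_conn.execute(
            f"SELECT COUNT(*) FROM {tgt_table}").fetchone()[0]
        if src != tgt:
            mismatches.append((src_table, tgt_table, src, tgt))
    return mismatches

# Illustrative in-memory databases standing in for source and target.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE materials (id INTEGER)")
source.executemany("INSERT INTO materials VALUES (?)",
                   [(i,) for i in range(100)])
target.executemany("INSERT INTO materials VALUES (?)",
                   [(i,) for i in range(97)])  # 3 rows missing

print(reconcile_row_counts(source, target, [("materials", "materials")]))
# [('materials', 'materials', 100, 97)]
```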
Data Quality: An ongoing discipline that requires a quality-oriented culture to improve the data and a commitment to continuous process improvement.
Database Profiling: Digging deep into a data source to understand its content and structure (a minimal sketch appears after this list).
Data Reconciliation: An automated reconciliation and validation process that checks the completeness and accuracy of your data.
Data Health Reporting: A process that measures the health and accuracy of your data against metrics and business rules, typically presented through dashboards and visualizations.
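To make the profiling idea concrete, a minimal column profile might capture row counts, null rates, and distinct values. This is a generic, hypothetical sketch rather than DataTrust's profiling output:

```python
def profile_column(values):
    """Compute basic profile statistics for a single column's values."""
    total = len(values)
    nulls = sum(1 for v in values if v is None)
    distinct = len({v for v in values if v is not None})
    return {
        "total_rows": total,
        "null_pct": round(100 * nulls / total, 1) if total else 0.0,
        "distinct_values": distinct,
    }

# Profiling a hypothetical country-code column.
print(profile_column(["DE", "US", None, "US", "FR"]))
# {'total_rows': 5, 'null_pct': 20.0, 'distinct_values': 3}
```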