MDM has two major functions that ensure data quality is maintained across the entire data landscape. This post gives a high-level picture of the first: enforcing data quality and completeness as data is entered into the system. Because master data is shared by different departments, all stakeholders for the object under scrutiny are involved at this stage. A mix of business users and data stewards plays the primary role in feeding the right data into the system.
This task is accomplished through a workflow-based system. The simple example below shows how different user groups own different pieces of data within a single master data object, in this case “Customer”.
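To make the ownership idea concrete, here is a minimal sketch in Python. The group names and fields are illustrative assumptions, not taken from any particular MDM product:

```python
# Hypothetical field ownership map for a "Customer" master data object.
# Each user group maintains only the fields it owns.
FIELD_OWNERSHIP = {
    "Sales": ["name", "contact_person", "phone"],
    "Finance": ["credit_limit", "payment_terms"],
    "Logistics": ["shipping_address", "delivery_route"],
}

def owner_of(field: str) -> str:
    """Return the user group responsible for maintaining a given field."""
    for group, fields in FIELD_OWNERSHIP.items():
        if field in fields:
            return group
    raise KeyError(f"No owning group defined for field: {field}")
```

A lookup like `owner_of("credit_limit")` would route a data issue to the Finance group rather than to whoever last touched the record.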
As the workflow progresses, business rules defined to establish standards ensure that all information relevant to the object is captured. Business needs define both the workflow itself and the data points to be captured; a workflow can be as simple or as complicated as the actual requirements demand.
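A workflow like this can be sketched as a sequence of named stages that only advances when the current stage's rules pass. The stage names below are assumptions for illustration; real MDM tools let business users model these:

```python
# Hypothetical linear workflow: a record moves forward only when the
# current stage's business rules have been satisfied.
STAGES = ["Sales entry", "Finance review", "Logistics review", "Publish"]

class Workflow:
    def __init__(self):
        self.stage_index = 0

    @property
    def current_stage(self) -> str:
        return STAGES[self.stage_index]

    def advance(self, rules_passed: bool) -> str:
        """Move to the next stage if rules pass; otherwise stay put."""
        if rules_passed and self.stage_index < len(STAGES) - 1:
            self.stage_index += 1
        return self.current_stage
```

A real system would support branching and parallel approvals as well, but the gating principle is the same: no stage transition without rule compliance.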
If the checks fail, the master data record is not published or made available to users until the errors are corrected. Typically the record is sent back to the step where the data was missed, and the responsible user is asked to resolve the problem. Once the data meets the standards set up earlier, the new object is inserted into or updated in the database.
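The publish gate described above can be sketched as follows. The rules, field names, and the in-memory "database" are all illustrative assumptions:

```python
# Hypothetical business rules: each field maps to a check function.
RULES = {
    "name": lambda v: bool(v and v.strip()),                      # non-empty
    "credit_limit": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record: dict) -> list:
    """Return the list of fields that fail their business rule."""
    return [f for f, check in RULES.items() if not check(record.get(f))]

def publish(record: dict, db: dict) -> bool:
    """Insert or update the record only if every rule passes."""
    failures = validate(record)
    if failures:
        # In a real MDM workflow this would route the record back to the
        # stage owning the failing fields and notify that user group.
        return False
    db[record["id"]] = record
    return True
```

The key point is that the database is only ever touched after every rule passes, so consumers of the master data never see a partially complete record.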
Similar to the add-customer scenario above, several scenarios are developed and maintained in the system. Each scenario is a workflow that is triggered and moves across stages according to the flow logic defined; at each stage, adherence to business rules enforces the data quality standards. The second major function of an MDM system is to propagate these changes to all other dependent systems.