Data Quality & Mastering

Data Quality Control

Data quality control is the process of controlling whether data of known, measured quality is used in an application or business process. It usually follows a data quality assurance (QA) process, which discovers and corrects data inconsistencies.

The data QA (Quality Assurance) process provides the following information to data QC (Quality Control):

  • Severity of inconsistency
  • Incompleteness
  • Accuracy
  • Precision
  • Missing / unknown values

The data QC process uses this information to decide whether the data may be used for analysis, in an application, or in a business process. For example, if QC finds that the data contains too many errors or inconsistencies, it rejects the data for further processing. Using incorrect data can have severe consequences: feeding invalid measurements from several sensors to an aircraft's autopilot could cause a crash.
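The QC decision described above can be sketched as a simple gate: QA computes metrics such as completeness and an error rate, and QC accepts or rejects a batch against thresholds. This is a minimal illustration with hypothetical function names and thresholds, not a standard implementation; the plausible-range check (`-50.0` to `150.0`) stands in for whatever validity rule a real sensor would use.

```python
def qa_metrics(readings):
    """QA step: compute simple metrics for a batch of sensor readings."""
    total = len(readings)
    missing = sum(1 for r in readings if r is None)
    # Readings outside a plausible physical range count as errors
    # (hypothetical range for illustration).
    errors = sum(1 for r in readings
                 if r is not None and not (-50.0 <= r <= 150.0))
    return {
        "completeness": (total - missing) / total if total else 0.0,
        "error_rate": errors / total if total else 1.0,
    }

def qc_gate(readings, min_completeness=0.95, max_error_rate=0.01):
    """QC step: reject the batch when QA metrics miss the thresholds."""
    m = qa_metrics(readings)
    accepted = (m["completeness"] >= min_completeness
                and m["error_rate"] <= max_error_rate)
    return accepted, m

# One missing reading and one implausible reading (999.0):
batch = [20.1, 20.3, None, 19.8, 999.0, 20.0, 20.2, 20.1, 19.9, 20.0]
ok, metrics = qc_gate(batch)
# The batch is rejected: completeness 0.9 < 0.95 and error rate 0.1 > 0.01.
```

In a real pipeline the thresholds would depend on how the data is consumed: a dashboard might tolerate a higher error rate than an autopilot ever could.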

Data Mastering

Master data is information that is key to the operation of a business. It is the primary focus of the Information Technology (IT) discipline of Master Data Management (MDM), and can include reference data. This key business information may include data about customers, products, employees, materials, suppliers, and the like. While it is often non-transactional in nature, it is not limited to non-transactional data, and often supports transactional processes and operations.

The term should not be confused with the use of "master data" for an original copy, such as an original recording. In MDM, master data is deduplicated: each real-world entity is represented by a single record, with no duplicate values.
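The deduplication behind master data can be sketched as follows: normalise a matching key for each record, keep one record per entity, and merge any extra fields from later duplicates. The field names and the matching rule here are illustrative assumptions; real MDM tools use far more sophisticated matching (fuzzy names, address standardisation, survivorship rules).

```python
def normalise_key(record):
    # Hypothetical matching rule: name plus email, ignoring case and spacing.
    return (record["name"].strip().lower(), record["email"].strip().lower())

def build_master(records):
    """Collapse duplicate records into one master record per entity."""
    master = {}
    for rec in records:
        key = normalise_key(rec)
        if key not in master:
            # First occurrence becomes the master record.
            master[key] = dict(rec)
        else:
            # Later duplicates only fill in fields the master lacks.
            for field, value in rec.items():
                master[key].setdefault(field, value)
    return list(master.values())

customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "ada lovelace ", "email": "ADA@example.com", "phone": "555-0100"},
    {"name": "Grace Hopper", "email": "grace@example.com"},
]
master = build_master(customers)
# Three input records collapse to two master records; the duplicate's
# phone number survives on the merged Ada Lovelace record.
```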

Material master data is a specific data set holding structured information about spare parts, raw materials, and products within Enterprise Resource Planning (ERP) software. The data is held centrally and shared across the organisation.

Vendor master data, similarly, is the centralised record of information pertinent to a vendor (supplier).