DQM - Herding a bag of fleas is far easier than keeping your data consistent.

Measure, control, and take responsibility for data quality - with data quality management and data governance you create transparency and rigor in your processes.

Data Governance

High-performance, efficient processes require high-quality master data. This is not new, but it has become even more important as a result of the digital transformation.

To achieve and maintain high data quality, responsibility for it must be clearly defined. This is where data governance comes in.

In the product data environment, data governance is multi-layered: different tasks must be handled across departments. This requires foundational structures that can absorb the constant change driven by external and internal influences.

What needs to be considered for optimal data governance? The question is best approached through the typical challenges.

Here is a small selection:

  • New data is added
  • Constant expansion of product data
  • New assortments are added
  • Additional suppliers are integrated
  • New products are developed
  • New users have to be integrated and trained

Data Quality Management (DQM)

The answer to these challenges is data quality management. The focus of DQM is on defining data standards, usually by means of a product classification.
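A data standard of this kind can be thought of as a mapping from product class to the attributes that class requires. The following is a minimal sketch; the class names and attribute lists are illustrative assumptions, not a real classification system:

```python
# Minimal sketch of a data standard defined per product class.
# Class names and attribute lists are illustrative assumptions.
DATA_STANDARD: dict[str, list[str]] = {
    "power_tools": ["name", "gtin", "voltage", "weight_kg"],
    "hand_tools": ["name", "gtin", "material"],
}

def required_attributes(product_class: str) -> list[str]:
    """Look up which attributes a product of this class must provide."""
    return DATA_STANDARD[product_class]

print(required_attributes("power_tools"))
# ['name', 'gtin', 'voltage', 'weight_kg']
```

In practice such a standard would live in the PIM/MDM system itself, often based on an established classification; the point is only that required attributes are defined per class, not globally.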

Based on the data standard, quality gates are set up in the data maintenance process. This keeps data quality transparent at all times and prevents insufficient data from blocking business processes.
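At its core, such a quality gate is a completeness check against the data standard: a record may only enter downstream processes once every required attribute is filled. A minimal sketch, with assumed attribute names:

```python
# Minimal sketch of a quality gate: a product record may only enter
# downstream business processes if all required attributes are filled.
# The attribute names are illustrative assumptions.
REQUIRED = ["name", "gtin", "price"]

def passes_quality_gate(product: dict) -> bool:
    """True only if every required attribute is present and non-empty."""
    return all(product.get(attr) not in (None, "") for attr in REQUIRED)

print(passes_quality_gate({"name": "Drill", "gtin": "4006381333931", "price": 119.0}))
# True
print(passes_quality_gate({"name": "Drill", "gtin": ""}))
# False
```

Real gates typically check more than completeness (formats, value ranges, referential integrity), but the pattern of a binary pass/fail decision per record stays the same.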

Data quality must be monitored continuously. Given its relevance for the company as a whole, dedicated experts should be responsible for it. Typical roles are data governors or data stewards; their tasks are to develop the data standards, perform data analyses, and set up regular reporting.
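The recurring reporting a data steward sets up often starts with a simple KPI such as per-attribute completeness across the catalog. A minimal sketch, with assumed attribute names:

```python
# Minimal sketch of a recurring data quality report: per-attribute
# completeness across the product catalog (share between 0.0 and 1.0).
# Attribute names and sample records are illustrative assumptions.
def completeness_report(products: list[dict], required: list[str]) -> dict[str, float]:
    """Share of products that fill each required attribute."""
    total = len(products)
    return {
        attr: sum(1 for p in products if p.get(attr) not in (None, "")) / total
        for attr in required
    }

catalog = [
    {"name": "Drill", "gtin": "4006381333931"},
    {"name": "Saw", "gtin": ""},
]
print(completeness_report(catalog, ["name", "gtin"]))
# {'name': 1.0, 'gtin': 0.5}
```

Tracked over time, such figures show whether data quality is trending up or down and where maintenance effort should be focused.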

Requirements management

New consumers of the data must be served on an ongoing basis. If data quality is high, many parties with a strong interest in the product data will emerge. This leads to a steady stream of additional requirements for data exchange and data structure, so proactive requirements management must be established and the corresponding roles created and staffed.

Data quality management, data governance, and requirements management mean that the project does not end with the go-live of the PIM or MDM system; it must be developed further in a continuously agile manner.

Enterprise Architecture Management (EAM)

Data flows must remain manageable. Particularly in large organizations with historically grown system landscapes, onboarding processes differ and product data is widely distributed. The resulting data redundancies create complexity, and this complexity becomes a risk when significant dependencies in the data flows are overlooked during changes. A data architect can take responsibility for the holistic, integrated view.

Clear system assignments ensure efficiency. In an extensive IT environment, the functional areas of individual applications may overlap, especially when standard software is used. Treat this as an opportunity for optimal tailoring, but make sure it is clear which system owns which data and that all applications involved are seamlessly integrated. An enterprise architect takes on this role.