Data Quality Management - How EZ Interoperability Solutions can help you
- Shanawaz Khan

- Aug 5, 2018
- 2 min read

Organizations struggle to process data and maintain its quality, especially when the data comes from multiple sources. I have faced this challenge myself: it took me a while to figure out a plan to tame the beast, but once I implemented that plan, things sailed along much more smoothly.
If I had to explain how to maintain good data quality, I would group it into five buckets; let's call them DQP - Data Quality Principles. Once you understand these principles, you can re-evaluate your strategy to take them into account.
The five principles can be remembered by the acronym ATICC. So what does ATICC stand for?
Accuracy
Accuracy should be measured against source documentation. This means that the value of every field should carry the exact same meaning as in the source data system, even after transformation.
A typical metric for accuracy is the ratio of errors to data, which tracks the number of known errors (such as missing, incomplete, or redundant entries) relative to the full data set.
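The ratio described above can be sketched in a few lines. This is a minimal illustration with hypothetical records; the error conditions (a missing field or a duplicate key) stand in for whatever error definitions your own source documentation specifies.

```python
# Hypothetical record set; None marks a missing value.
records = [
    {"id": 1, "name": "Alice", "dept": "Sales"},
    {"id": 2, "name": None,    "dept": "Sales"},    # incomplete entry
    {"id": 2, "name": "Bob",   "dept": "Finance"},  # redundant id
]

def error_ratio(records, key="id"):
    """Ratio of known errors (missing fields or duplicate keys) to total records."""
    seen, errors = set(), 0
    for rec in records:
        if any(v is None for v in rec.values()) or rec[key] in seen:
            errors += 1
        seen.add(rec[key])
    return errors / len(records)

print(error_ratio(records))  # 2 of 3 records have a known error
```

A lower ratio means higher accuracy; trend this number over time rather than judging a single snapshot.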
Consistency
Strictly speaking, consistency specifies that two data values pulled from separate data sets should not conflict with each other.
An example of a consistency rule is one that verifies that the sum of employees across all departments of a company does not exceed the total number of employees in that organization.
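The headcount rule above can be expressed as a simple cross-data-set check. The department figures here are made up for illustration; the function name `consistent_headcount` is my own, not from any library.

```python
def consistent_headcount(dept_counts, total_employees):
    """Consistency rule: department headcounts must not exceed the org-wide total."""
    return sum(dept_counts.values()) <= total_employees

# Hypothetical figures pulled from two separate data sets.
depts = {"Sales": 40, "Finance": 25, "IT": 30}   # sums to 95
print(consistent_headcount(depts, 100))  # the two data sets agree
print(consistent_headcount(depts, 90))   # conflict: departments exceed the total
```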
Completeness
Completeness can be measured by determining whether or not each data entry is a “full” data entry. All available data entry fields must be complete, and sets of data records should not be missing any pertinent information.
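Completeness can be scored as the share of records in which every required field is populated. This is a minimal sketch; the required-field set and the sample rows are assumptions for illustration.

```python
# Hypothetical required fields for an employee record.
REQUIRED = {"id", "name", "dept", "hire_date"}

def completeness(rows):
    """Share of rows in which every required field is present and non-empty."""
    def is_full(row):
        return all(row.get(field) not in (None, "") for field in REQUIRED)
    return sum(is_full(row) for row in rows) / len(rows)

rows = [
    {"id": 1, "name": "Alice", "dept": "Sales",   "hire_date": "2018-01-15"},
    {"id": 2, "name": "Bob",   "dept": "Finance", "hire_date": "2017-06-01"},
    {"id": 3, "name": "Carol", "dept": "IT",      "hire_date": ""},  # not full
]
print(completeness(rows))  # 2 of 3 entries are "full"
```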
Integrity
Also known as data validation, integrity refers to the structural testing of data to ensure that it complies with defined procedures. This means there are no unintended data errors and that each value corresponds to its appropriate designation (e.g., day, month, and year).
Here, it all comes down to the data transformation error rate. The metric to use tracks how many data transformation operations fail relative to the total - in other words, how often the process of taking data stored in one format and converting it to a different one fails.
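The transformation error rate described above can be measured by counting failed conversions. This sketch uses date parsing as the example transformation; the sample values and the target format are assumptions.

```python
from datetime import datetime

def transform_error_rate(raw_dates, fmt="%Y-%m-%d"):
    """Fraction of values that fail to convert to the target date format."""
    failures = 0
    for value in raw_dates:
        try:
            datetime.strptime(value, fmt)
        except (ValueError, TypeError):
            failures += 1
    return failures / len(raw_dates)

# One valid date, one wrong format, one impossible date (month 13).
rate = transform_error_rate(["2018-08-05", "05/08/2018", "2018-13-40"])
print(rate)  # 2 of 3 transformations fail
```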
Timeliness
Timeliness corresponds to the expectation for availability and accessibility of information. In other words, it measures the time between when data is expected and the moment when it is readily available for use.
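That gap between expectation and availability can be captured as a simple lag measurement. The timestamps below are hypothetical; in practice you would take the expected time from an SLA and the available time from your pipeline's load logs.

```python
from datetime import datetime, timedelta

def timeliness_lag(expected_at, available_at):
    """Lag between when the data was expected and when it became available.
    Early arrivals count as zero lag."""
    return max(available_at - expected_at, timedelta(0))

expected = datetime(2018, 8, 5, 9, 0)    # data due at 09:00
available = datetime(2018, 8, 5, 9, 45)  # data actually landed at 09:45
print(timeliness_lag(expected, available))  # 45-minute lag
```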
Keep in mind that Data Quality Management is not a one time effort and has to be evaluated at regular intervals.
There also needs to be a plan for Data Reconciliation and remediation when data errors are identified during the regular evaluations.
Lastly, keep in mind that the Accuracy and Integrity principles are relative to the business logic: data rules are defined with that logic in mind, and they need to be revisited regularly as the business evolves.
Organizations trying to address data quality problems can outsource the work to EZ Interoperability Solutions or hire us as consultants to help tame the beast.