Data Quality Models
Data quality representations
Data quality models extend traditional database models to represent data quality dimensions and to associate those dimensions with data. Data quality models therefore allow: (i) analyzing a set of data quality requirements and representing them in terms of conceptual schemas; and (ii) accessing and querying data quality dimensions by means of logical schemas. Data quality models also include process models tailored to the analysis and design of quality improvement actions. These models permit tracking data from their sources, through the various manipulations the data undergo, to their final usage. In this way, they support detecting the causes of poor data quality and designing improvement actions.
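As an illustration of point (ii), one can sketch a "quality-extended" logical schema in which each attribute value is annotated with quality-dimension metadata and queries can filter on those dimensions. This is a minimal sketch, not a model from the literature; the class name `QualityValue` and the particular accuracy and completeness scores are assumptions chosen for the example.

```python
# Illustrative sketch (not from the source): annotating attribute values
# with quality dimensions and querying on them. All names and scores here
# are assumptions for the example.
from dataclasses import dataclass
from typing import Any

@dataclass
class QualityValue:
    """An attribute value annotated with quality-dimension metadata."""
    value: Any
    accuracy: float      # estimated accuracy in [0, 1]
    completeness: float  # 1.0 if the value is present and fully specified

# A quality-extended relation: each tuple carries per-value annotations.
customers = [
    {"name": QualityValue("Alice", accuracy=0.98, completeness=1.0),
     "email": QualityValue(None, accuracy=0.0, completeness=0.0)},
    {"name": QualityValue("Bob", accuracy=0.90, completeness=1.0),
     "email": QualityValue("bob@example.org", accuracy=0.95, completeness=1.0)},
]

def query_by_quality(relation, attribute, min_accuracy):
    """Select tuples whose given attribute meets an accuracy threshold."""
    return [t for t in relation if t[attribute].accuracy >= min_accuracy]

reliable = query_by_quality(customers, "email", min_accuracy=0.9)
```

In this sketch the quality annotations live at the value level; real proposals differ on whether dimensions attach to values, tuples, or whole relations.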
Among the first data quality models, the polygen model was proposed in 1990 [6] for explicitly tracing the origins of data and the intermediate sources used to arrive at that...
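The source-tagging idea behind the polygen model can be sketched as relational operators that propagate, for each value in a result, the set of databases it originated from. This is a hedged sketch only: the `tagged`/`join_tagged` data structures and the sample databases are assumptions for the example, not the polygen model's actual formalism.

```python
# Hedged sketch of polygen-style source tagging: each value carries the
# set of originating sources, and a join propagates and merges these tags.
# The representation below is an assumption made for illustration.

def tagged(value, *sources):
    """Wrap a value with the set of databases it came from."""
    return {"value": value, "sources": set(sources)}

def join_tagged(left, right, key):
    """Join two tagged relations on `key`; result values union the tags
    of the value itself and of both join-key operands."""
    out = []
    for l in left:
        for r in right:
            if l[key]["value"] == r[key]["value"]:
                row = {}
                for k, v in {**l, **r}.items():
                    srcs = set(v["sources"])
                    # values in the result also depend on the sources
                    # consulted to evaluate the join condition
                    srcs |= l[key]["sources"] | r[key]["sources"]
                    row[k] = {"value": v["value"], "sources": srcs}
                out.append(row)
    return out

db_a = [{"id": tagged(1, "DB_A"), "name": tagged("Alice", "DB_A")}]
db_b = [{"id": tagged(1, "DB_B"), "city": tagged("Rome", "DB_B")}]

result = join_tagged(db_a, db_b, "id")
# every value in the joined tuple now records both contributing sources
```

The point of the sketch is only that lineage survives manipulation: after the join, inspecting any value's `sources` reveals every database that contributed to producing it.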
- 2. Scannapieco M, Pernici B, Pierce EM. IP-UML: towards a methodology for quality improvement based on the IP-MAP framework. In: Proceedings of the 7th International Conference on Information Quality; 2002.
- 4. Shankaranarayan G, Wang RY, Ziad M. Modeling the manufacture of an information product with IP-MAP. In: Proceedings of the 5th International Conference on Information Quality; 2000. p. 1–16.
- 5. Storey VC, Wang RY. An analysis of quality requirements in database design. In: Proceedings of the 4th International Conference on Information Quality; 1998. p. 64–87.
- 6. Wang RY, Madnick SE. A polygen model for heterogeneous database systems: the source tagging perspective. In: Proceedings of the 16th International Conference on Very Large Data Bases; 1990. p. 519–38.
- 9. W3C Working Group. An overview of the PROV family of documents. Available at: http://www.w3.org/TR/prov-overview/