Data are of high quality "if they are fit for their intended uses in operations, decision making and planning" (J. M. Juran). Alternatively, data are deemed of high quality if they correctly represent the real-world construct to which they refer. Beyond these definitions, as data volume increases, internal consistency within the data becomes paramount, regardless of fitness for use for any external purpose: for example, a person's age and birth date may conflict within different parts of a database.
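The age/birth-date conflict mentioned above is a classic internal-consistency check. As a minimal sketch (the function names and record fields here are hypothetical, not from any particular tool), such a check can be expressed in Python:

```python
from datetime import date

def age_from_birth_date(birth_date: date, today: date) -> int:
    """Compute age in whole years from a birth date."""
    years = today.year - birth_date.year
    # Subtract one year if the birthday has not yet occurred this year.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def is_age_consistent(stored_age: int, birth_date: date, today: date) -> bool:
    """Flag records whose stored age disagrees with their birth date."""
    return stored_age == age_from_birth_date(birth_date, today)

# A record claiming age 30 for someone born 1985-06-15 is
# internally inconsistent as of 2024-01-01 (actual age: 38).
print(is_age_consistent(30, date(1985, 6, 15), date(2024, 1, 1)))  # False
```

Note that this rule says nothing about whether the data are fit for any external purpose; it only tests whether two fields within the same database agree with each other.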
These views can often be in disagreement, even about the same set of data used for the same purpose. This book discusses the concept as it relates to business data processing, although other kinds of data have quality issues of their own.
This book is your ultimate resource for Data Quality. Here you will find the most up-to-date information, analysis, background and everything you need to know.
In easy-to-read chapters, with extensive references and links that let you learn everything there is to know about Data Quality right away, it covers: Data quality, Bit rot, Cleansing and Conforming Data, Data auditing, Data cleansing, Data corruption, Data integrity, Data profiling, Data quality assessment, Data quality assurance, Data Quality Firewall, Data truncation, Data validation, Data verification, Database integrity, Database preservation, DataCleaner, Declarative Referential Integrity, Digital continuity, Digital preservation, Dirty data, Entity integrity, Information quality, Link rot, One-for-one checking, Referential integrity, Soft error, Two pass verification, Validation rule, Abstraction (computer science), ADO.NET, ADO.NET data provider, WCF Data Services, Age-Based Content Rating System, Aggregate (Data Warehouse), Data archaeology, Archive site, Association rule learning, Atomicity (database systems), Australian National Data Service, Automated Tiered Storage, Automatic data processing, Automatic data processing equipment, BBC Archives, Bitmap index, British Oceanographic Data Centre, Business intelligence, Business Intelligence Project Planning, Change data capture, Chunked transfer encoding, Client-side persistent data, Clone (database), Cognos Reportnet, Commit (data management), Commitment ordering, The History of Commitment Ordering, Comparison of ADO and ADO.NET, Comparison of OLAP Servers, Comparison of structured storage software, Computer-aided software engineering, Concurrency control, Conference on Innovative Data Systems Research, Consumer Relationship System, Content Engineering, Content format, Content inventory, Content management, Content Migration, Content re-appropriation, Content repository, Control break, Control flow diagram, Copyright, Core Data, Core data integration, Customer data management, DAMA, Dashboard (business), Data, Data access, Data aggregator, Data architect, Data architecture, Data bank, Data binding, Data center, 
Data classification (data management), Data conditioning, Data custodian, Data deduplication, Data dictionary, Data Domain (corporation), Data exchange, Data extraction, Data field, Data flow diagram, Data governance, Data independence, Data integration, Data library, Data maintenance, Data management, Data management plan, Data mapping, Data migration, Data processing system, Data proliferation, Data recovery, Data Reference Model, Data retention software, Data room, Data security, Data set (IBM mainframe), Data steward, Data storage device, Data Stream Management System, Data Transformation Services, Data Validation and Reconciliation, Data virtualization, Data visualization, Data warehouse, Database administration and automation...and much more.
This book explains in depth the real drivers and workings of Data Quality. It reduces the risk of your technology, time, and resource investment decisions by enabling you to compare your understanding of Data Quality with the objectivity of experienced professionals.