Executing Data Quality Projects
Academic Press Inc (Publisher)
978-0-12-818015-0 (ISBN)
Studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. Help is here! This book describes a proven Ten Step approach that combines a conceptual framework for understanding information quality with techniques, tools, and instructions for practically putting the approach to work – with the end result of high-quality trusted data and information, so critical to today’s data-dependent organizations.
The Ten Steps approach applies to all types of data and all types of organizations – for-profit in any industry, non-profit, government, education, healthcare, science, research, and medicine. This book includes numerous templates, detailed examples, and practical advice for executing every step. At the same time, readers are advised on how to select relevant steps and apply them in different ways to best address the many situations they will face. The layout allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, best practices, and warnings. The experience of actual clients and users of the Ten Steps provides real examples of outputs for the steps, plus highlighted sidebar case studies called Ten Steps in Action.
This book uses projects as the vehicle for data quality work and defines the word broadly to include: 1) focused data quality improvement projects, such as improving data used in supply chain management, 2) data quality activities in other projects, such as building new applications, migrating data from legacy systems, integrating data because of mergers and acquisitions, or untangling data due to organizational breakups, and 3) ad hoc use of data quality steps, techniques, or activities in the course of daily work. The Ten Steps approach can also be used to enrich an organization's standard SDLC (whether sequential or Agile), and it complements general improvement methodologies such as Six Sigma or Lean. No two data quality projects are the same, but the flexible nature of the Ten Steps means the methodology can be applied to all.
The new Second Edition highlights topics such as artificial intelligence and machine learning, Internet of Things, security and privacy, analytics, legal and regulatory requirements, data science, big data, data lakes, and cloud computing, among others, to show their dependence on data and information and why data quality is more relevant and critical now than ever before.
Danette McGilvray has devoted more than 25 years to helping people around the world enhance the value of the information assets on which their organizations depend. Focusing on bottom-line results, she helps them manage the quality of their most important data, so the resulting information can be trusted and used with confidence, a necessity in today's data-dependent world. Her company, Granite Falls Consulting, excels in bridging the gap between an organization's strategies, goals, issues, and opportunities and the practical steps necessary to ensure the "right-level" quality of the data and information needed to provide products and services to their customers. They specialize in data quality management to support key business processes, such as analytics, supply chain management, and operational excellence. Communication, change management, and human factors are also emphasized because they affect the trust in and use of data and information. Granite Falls' "teach-a-person-how-to-fish" approach helps organizations meet their business objectives while enhancing skills and knowledge that can be used to benefit the organization for years to come. Client needs are met through a combination of consulting, training, one-on-one mentoring, and executive workshops, tailored to fit any situation where data is a component.

Danette first shared her extensive experience in her 2008 book, Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information™ (Morgan Kaufmann), which has become a classic in the data quality field. Her Ten Steps™ methodology is a structured yet flexible approach to creating, assessing, improving, and sustaining data quality. It can be applied to any type of organization (for-profit, government, education, healthcare, non-profit, etc.), regardless of country, culture, or language. Her book is used as a textbook in university graduate programs, and the Chinese translation was the first data quality book available in that language. The 2021 second edition (Elsevier/Academic Press) updates how-to details, examples, and templates, while keeping the basic Ten Steps, which have stood the test of time.

With her holistic view of data and information quality, she truly believes that data quality can save the world. She hopes that this edition can help a new generation of data professionals, in addition to inspiring those who already care about or have been responsible for data and information over the years. You can reach Danette at danette@gfalls.com, connect with her on LinkedIn, and follow her on Twitter at Danette_McG. To see how Granite Falls can help on your journey to quality data and trusted information, and for free downloads of key ideas and templates from the book, see www.gfalls.com.
1. Data Quality and the Data-Dependent World
2. Data Quality in Action
3. Key Concepts
4. The Ten Steps Process
5. Structuring Your Project
6. Other Techniques and Tools
7. A Few Final Words
Appendix: Quick References
Publication date | June 8, 2021
Place of publication | San Diego
Language | English
Dimensions | 216 x 276 mm
Weight | 1040 g
Subject area | Mathematics / Computer Science ► Computer Science
ISBN-10 | 0-12-818015-3 / 0128180153
ISBN-13 | 978-0-12-818015-0 / 9780128180150
Condition | New