
User-Centered Assessment Design

An Integrated Methodology for Diverse Populations
Book | Hardcover
504 pages
2025
Guilford Press (publisher)
978-1-4625-5548-2 (ISBN)
93.50 incl. VAT
How can assessment instruments be designed or selected to best serve the needs of intended users, taking into account their interests, capacities, and limitations? Informed by a socioecological perspective, this timely, state-of-the-art reference and text presents an integrated, user-centered process model for developing assessments guided by user contexts. Madhabi Chatterji provides foundational principles and procedures for designing multi-item tests; behavior-based, product-based, and portfolio-based assessments; and self-report instruments. She demonstrates how to integrate qualitative and quantitative methods to devise tools that meet the quality criteria of usefulness and usability alongside validity and reliability. The book features case study discussions; worked-through examples with diverse, global populations; and sample instruments from a variety of disciplines (education, psychology, health care, and others). Chapter overviews and objectives are tied to within-chapter Recaps and Reflection Breaks to further understanding and class discussion.

Madhabi Chatterji, PhD, is Professor Emerita of Measurement, Evaluation, and Education at Teachers College, Columbia University, where she founded and directs the Assessment and Evaluation Research Initiative (AERI). AERI is dedicated to promoting meaningful use of assessment and evaluation information to improve equity and the quality of practices and policies in education, psychology, and the health professions. An award-winning, internationally recognized methodologist and educationist, Dr. Chatterji has taught and mentored numerous doctoral students and postdoctoral researchers over her 30-plus-year career. She is author or editor of more than 100 publications and is a Fellow of the National Education Policy Center. A public intellectual, Dr. Chatterji has spoken out frequently on the limitations of large-scale tests and the adverse social consequences of misused high-stakes educational assessments. Her longstanding scholarly interests lie in instrument design, validation, validity, and test use issues; improving program and policy evaluation designs to support evidence-based practices; and closing learning gaps with proximal diagnostic assessments.

I. Foundations
1. Foundational Concepts in Assessment Design
1.1 Chapter Overview
1.2 Assessments: Old and Emerging Traditions, a Starting Definition, and Some Distinctions
1.3 Viewpoints on Assessment, Measurement, Testing, and Evaluation
1.4 Role of Assessment in Scientific, Professional, and Practical Endeavors
1.5 Evaluating the Quality of Assessments and Construct Measures: Validity, Reliability, and Utility
1.6 Integrating Assessment Design, Validation, and Use: A User-Centered Process
1.7 Summary
2. Why Assess?: Measure-Based Inferences, Uses, Users, and Consequences
2.1 Chapter Overview
2.2 Back to the Future: Early Drivers, Milestones, and Consequences of Assessment Use
2.3 Modern Drivers and Consequences of Assessment Uses in Education
2.4 Modern Drivers and Consequences of Assessment Uses in Psychology, Health, Business, and Other Fields
2.5 Applying User-Centered Principles to Improve Practices
2.6 Summary
3. Whom to Assess? and How?: Specifying the Population and the Assessment Operations
3.1 Chapter Overview
3.2 Why Population Characteristics and the Socioecological Contexts of Assessments Matter
3.3 What Is Measurement Bias?: Case Studies and Hypothetical Illustrations
3.4 Selecting Assessment Operations for Diverse Populations and Multidisciplinary Constructs
3.5 Steps and Actions: Specifying Whom to Assess? and How? with the Process Model
3.6 Summary
II. Assessment Design
4. What to Assess?: Specifying the Domains for Constructs
4.1 Chapter Overview
4.2 Domain Sampling and Domain Specification: Functional Theory and Applied Illustrations
4.3 Construct Types, Domain Conceptualizations, and Structures
4.4 Domain Specification as a Part of the Process Model: Steps, Techniques, Guidelines, and Conventions
4.5 Content-Validating Specified Domains
4.6 Summary
5. Designing Assessments with Structured and Constructed-Response Items
5.1 Chapter Overview
5.2 Why the Mechanics of Item Construction Matter
5.3 Cognitive Constructs Measured Best with Structured- or Constructed-Response Items
5.4 Writing Structured-Response Items: Principles, Guidelines, and Applied Examples
5.5 Guidelines for Designing Constructed-Response and Essay Tasks
5.6 Instrument Assembly
5.7 An Application with the Process Model: A Case Study of Cognitively Based Item and Assessment Design to Foster Learning in Long Division
5.8 Summary
6. Designing Behavior-Based, Product-Based, and Portfolio-Based Assessments
6.1 Chapter Overview
6.2 Behavior-, Product-, and Portfolio-Based Assessments: Definitions, Examples, and Origins
6.3 Advantages of the Performance Assessment Format
6.4 Disadvantages of Performance Assessments: Human Vulnerabilities, Errors, and Biases
6.5 Three Case Studies: Applying the Process Model to Design and Validate Performance Assessments
6.6 Summary
7. Designing Survey-Based and Interview-Based Assessment Tools
7.1 Chapter Overview
7.2 Self-Report Instruments: Their Defining Properties and Common Applications
7.3 Historical Origins of Questionnaires and Attitude Surveys
7.4 Measurement Issues with the Self-Report Modality
7.5 General Design Guidelines for Self-Report Instruments
7.6 Ten More Guidelines for Writing Closed-Ended Survey Items
7.7 A Case Study: Applying the Process Model to Design Two Complementary Self-Report Tools
7.8 Summary
III. Validation and Use of Assessments
8. Analyzing Data from Assessments: A Statistics Refresher
8.1 Chapter Overview
8.2 Preparing for Data Analysis
8.3 Organizing the Data
8.4 Measures of Central Tendency
8.5 Measures of Variability
8.6 Graphical Displays of Data
8.7 The Standard Normal Distribution and Its Applications
8.8 Correlation Coefficients and Their Applications
8.9 Related Statistical Techniques
8.10 Summary
9. Improving the Inferential Utility of Assessment Results: Methods and Limitations
9.1 Chapter Overview
9.2 Frames of Reference and Derived Scores
9.3 Using Norms as the Frame of Reference
9.4 Using Criterion Scores or Standards as the Frame of Reference
9.5 Using Self as the Frame of Reference
9.6 Composite Scores
9.7 Grouped Scores, Equated Scales, and Linked Tests
9.8 Summary
10. A Unified Approach to Construct Validity and Validation: Theory to Evidence
10.1 Chapter Overview
10.2 Construct Validity: An Evolving Concept
10.3 Theoretical Foundations of the Unitarian View of Validation
10.4 Main Clusters and Types of Validity Evidence
10.5 Random Errors of Measurement and Types of Reliability Evidence
10.6 Utility of Measures, Assessments, and Assessment Systems
10.7 Unified Validation Plans
10.8 Chapter Summary
11. Empirical Methods of Validation
11.1 Chapter Overview
11.2 Planning Empirical Validation Studies
11.3 Evaluating Item Performance
11.4 Examining Fairness and Measurement Bias
11.5 Gathering Evidence of Content-Based Validity
11.6 Validating Response Processes: The Cognitive Interview
11.7 Gathering Correlational Evidence of Validity
11.8 Empirical Estimation of Reliability
11.9 Methods to Examine Utility
11.10 Evaluating the Evidence: The PSQI Case Revisited
11.11 Summary
12. User-Centered Assessment Design: Revisiting the Principles, Comparisons, and Conclusions
12.1 Chapter Overview
12.2 Applying the Principles Undergirding the Process Model: A Summary by Section
12.3 A User-Centered Design Process: Comparing the Old with the New
12.4 Extended Applications of the Process Model
12.5 The Process Model Compared to Existing Models of Assessment Design
12.6 Connecting the Process Model with the 2014 Standards
12.7 Summary
Glossary
References
Author Index
Subject Index
About the Author

Publication date (per publisher): February 28, 2025
Place of publication: New York
Language: English
Subject area: Humanities / Psychology / General Psychology
ISBN-10 1-4625-5548-9 / 1462555489
ISBN-13 978-1-4625-5548-2 / 9781462555482
Condition: New
Discover more from this subject area
Der Grundkurs

by E. Bruce Goldstein; Laura Cacciamani; Karl R. Gegenfurtner

Book | Hardcover (2023)
Springer (publisher)
59.99