|aThe role of theories of learning and cognition in assessment design and development / Paul D. Nichols, Jennifer L. Kobrin, Emily Lai and James Koepfler -- Principled approaches to assessment design, development, and implementation / Steve Ferrara, Emily Lai, Amy Reilly and Paul Nichols -- Developing and validating cognitive models in assessment / Madeleine Keehner, Joanna S. Gorin, Gary Feng and Irvin R. Katz -- An integrative framework for construct validity / Susan Embretson -- The role of cognitive models in automatic item generation / Mark J. Gierl and Hollis Lai -- Social models of learning and assessment / William R. Penuel and Lorrie A. Shepard -- Socio-emotional and self-management variables in learning and assessment / Patrick C. Kyllonen -- Understanding and improving accessibility for special populations / Leanne R. Ketterlin-Geller -- Automated scoring with validity in mind / Isaac I. Bejar, Robert J. Mislevy, Mo Zhang and André A. Rupp -- Explanatory item response models / Paul De Boeck, Sun-Joo Cho and Mark Wilson -- Longitudinal models for repeated measures data / Jeffrey R. Harring and Ari Houser -- Diagnostic classification models / Laine Bradshaw -- Bayesian networks / José P. González-Brenes, John T. Behrens, Robert J. Mislevy, Roy Levy and Kristen E. DiCerbo -- The rule space and attribute hierarchy methods / Ying Cui, Mark J. Gierl, and Qi Guo -- Educational data mining and learning analytics / Ryan S. Baker, T. Martin and L.M. Rossi -- Large-scale standards-based assessments of educational achievement / Kristen Huff, Zachary Warner, and Jason Schweid -- Large-scale educational survey assessments / Andreas Oranje, Madeleine Keehner, Hilary Persky, Gabrielle Cayton-Hodges and Gary Feng -- Professional certification and licensure examinations / Richard M. Luecht -- The in-task assessment framework for behavioral data / Deirdre Kerr, Jessica J. Andrews and Robert J. 
Mislevy -- Digital assessment environments for scientific inquiry practices / Janice D. Gobert and Michael A. Sao Pedro -- Assessing and supporting hard-to-measure constructs in video games / Valerie Shute and Lubin Wang -- Conversation-based assessment / G. Tanner Jackson and Diego Zapata-Rivera.
|aCognitive learning|vHandbooks, manuals, etc.
|aEducational evaluation|vHandbooks, manuals, etc.
The Handbook of Cognition and Assessment is a state-of-the-art resource that brings together the most innovative scholars and thinkers in the respective fields to capture the changing landscape of cognitively grounded educational assessments. Under the lead editorship of a research director at the Educational Testing Service and an esteemed professor of educational psychology at the University of Alberta, and supported by an expert advisory board, it is written by an international team of contributors at the cutting edge of cognitive psychology and educational measurement. It covers all elements of modern educational assessment, including conceptual frameworks, methodologies, applied topics, and emerging issues for the field. It offers a methodologically rigorous review of cognitive and learning sciences models for assessment purposes, as well as the latest statistical and technological know-how for designing, scoring, and interpreting results. The content is written in a style and at a level of technical detail that will appeal to a wide range of readers from both applied and scientific backgrounds. A much-anticipated resource for this fast-moving discipline, this Handbook will provide readers with an in-depth understanding of the diverse approaches, contextual uses, and methodological principles at play within educational assessment today.

André A. Rupp is Research Director at Educational Testing Service (ETS) in Princeton, New Jersey, where he works with teams that conduct comprehensive evaluation work for mature and emerging automated scoring systems. His research has focused on applications of principled assessment design frameworks in innovative assessment contexts, as well as on translating the statistical complexities of diagnostic measurement models into practical guidelines for applied specialists. 
Through dissemination and professional development efforts he is deeply dedicated to helping interdisciplinary teams navigate the complicated trade-offs between the scientific, educational, political, and financial drivers of decision making, in order to help shape best methodological practices for evidentiary reasoning across complex assessment design and deployment lifecycles.

Jacqueline P. Leighton is Professor and Chair of Educational Psychology at the University of Alberta, Canada. She is past Director of the University of Alberta's Centre for Research in Applied Measurement and Evaluation (CRAME). As a registered psychologist with the College of Alberta Psychologists, her research is focused on measuring the cognitive and socio-emotional processes underlying learning and assessment outcomes, including cognitive diagnostic assessment and feedback delivery and uptake. She has published in a variety of educational-measurement journals and is past editor of Educational Measurement: Issues and Practice.