Nathan Dadey is interested in the design, scaling, and use of educational assessments, particularly assessments used for accountability purposes. He aims to produce methodological and applied work that contributes to improved understanding and use of assessment results in policy contexts.
In terms of methodological work with applied implications, Nathan has focused on issues in which typical educational measurement approaches fall short. One such area is the measurement of the Next Generation Science Standards. Nathan has supported multiple state departments of education (Delaware, Wisconsin, and Nebraska) in conceptualizing their statewide systems of assessment; led content specialists in the creation of three-dimensional tasks (including the entire set of Alabama state content leads); assisted multiple working groups within the Council of Chief State School Officers (CCSSO), leading workshops for the Science State Collaborative on Assessment and Student Standards (SCASS) and authoring a white paper for the Technical Issues in Large-Scale Assessment (TILSA) SCASS; and reviewed task quality and review tools (with Achieve). A second area concerns the numerous challenges inherent in designing and implementing comprehensive systems of assessment. In tackling these challenges, Nathan has explored ways in which a set of “mini-interim” assessments can be scaled (with Curriculum Associates), written a policy brief addressing ESSA’s interim assessment provision, and examined ways in which Bayesian networks can be used to summarize interim and summative assessment results.
In terms of applied work with methodological implications, Nathan’s work focuses on issues that threaten the validity of operational programs of assessment and accountability. These issues include the dimensionality of alternate assessments based on alternate achievement standards (on behalf of NCSC), the impact of interruptions on online assessment results (on behalf of the Smarter Balanced Assessment Consortium) as well as recommendations to address such impacts (on behalf of CCSSO), the representation of English language proficiency within state accountability systems (on behalf of the Latino Policy Forum), and the comparability of assessment scores across multiple digital devices (on behalf of the TILSA SCASS).
In addition, Nathan has served as a peer reviewer for the United States Department of Education, as well as a reviewer for multiple journals (including Educational Measurement: Issues and Practice) and national conferences.
Nathan received a Ph.D. from the University of Colorado Boulder with a concentration in research and evaluation methodology.
Nathan Dadey focuses on using psychometric and statistical methods to address practical problems, including issues related to combining interim assessment data, the dimensionality of alternate assessments, subscores, and vertical scales.
Recent and Relevant Publications
Dadey, N., Lyons, S., & DePascale, C. (2018). The comparability of scores from different digital devices: A literature review and synthesis with recommendations for practice. Applied Measurement in Education, 31(1), 30-50.
Lyons, S., & Dadey, N. (2017). Principal holistic judgments and high-stakes evaluations of teachers. Educational Assessment, Evaluation and Accountability, 29(2), 155-178.
Dadey, N. & Gong, B. (2017, April). Using interim assessments in place of summative assessments? Consideration of an ESSA option. Washington, DC: Council of Chief State School Officers (CCSSO).
Lyons, S., & Dadey, N. (2017, March). Considering English language proficiency within systems of educational accountability under the Every Student Succeeds Act. Dover, NH: The National Center for the Improvement of Educational Assessment, Inc., & The Latino Policy Forum.
Briggs, D. C., & Dadey, N. (2015). Making sense of common test items that do not get easier over time: Implications for vertical scale designs. Educational Assessment, 20(1), 1-22.
Dadey, N., & Briggs, D. C. (2012). A meta-analysis of growth trends from vertically scaled assessments. Practical Assessment, Research & Evaluation, 17(14).