2021 Summer Internships: The Future Is Now

May 27, 2021

Center Staff and Interns Look Toward the Future of Educational Assessment and Accountability

As it appears the country is finally turning the corner on the pandemic, we are excited at the prospect of doffing our masks, rolling up our sleeves, and focusing on the future of educational assessment and accountability through our 2021 summer internship program. This summer, the Center welcomes five advanced doctoral students who will work with the Center’s professionals on four projects with direct implications for state and national educational policy related to recovery from the pandemic, as well as for longer-term policies on the use of educational assessment and accountability to support student learning.

Each intern will work with a Center mentor on one major project throughout the summer. At the end of the project, each intern will produce a written report, suitable for conference presentation and/or publication.

The interns and their Center mentors will engage in projects addressing four issues that reflect the changing face of educational assessment and accountability:

  • Addressing Practical and Technical Issues in Understanding the Academic Impact of the Pandemic
  • Using Opportunity-to-Learn Information to Enhance the Utility of State Test Scores
  • Collecting Evidence of the Impacts of Classroom Performance Assessments on Instruction
  • Modeling Student Performance on a Game-Based Assessment

Addressing Practical and Technical Issues in Understanding the Academic Impact of the Pandemic

Allie Cooperman from the University of Minnesota will work with Damian Betebenner on analyses to support states in using their assessment data to understand and recover from the effects of the COVID-19 pandemic on student learning. With the resumption of state summative assessment this spring, and traditional accountability suspended for the year, states are grappling with how best to use the assessment data they have collected. Underlying the use of 2021 assessment data to understand the academic impact of the pandemic are thorny technical issues, particularly the comparability of 2021 results with 2019 in the face of missing data.

Allie will help develop, test, and document R-based analytics in support of state efforts to make use of their data to understand the impact of the COVID-19 pandemic on student learning. 

Using Opportunity-to-Learn Information to Enhance the Utility of State Test Scores

Thao Vo from Washington State University and Daniel Silver from the University of Southern California will work with Scott Marion to create a research-based framework for collecting, analyzing, and using opportunity-to-learn (OTL) information to enrich states’ indicators and reporting systems and to improve the interpretation and usefulness of aggregate test scores. While OTL analyses are especially important this year, the framework is future-oriented, helping states plan for the ongoing collection and use of OTL indicators.

Additionally, Daniel and Thao will support the analyses of OTL and test data for the 2020-2021 school year in at least one state each. Beyond supporting these states, Thao and Daniel will use what they learn from these analyses to refine the OTL framework.

Collecting Evidence of the Impacts of Classroom Performance Assessments on Instruction

Sarah Wellberg from the University of Colorado Boulder will work with Carla Evans to create a research-based framework that states or districts could use to collect evidence about the impacts of classroom performance assessments on instruction. The goal of the project is to review and evaluate methods intended to capture teachers’ perceptions of how implementing classroom summative performance assessments influences how and what they teach, as well as what students are asked to learn.

The methods and associated research-based instruments and tools will be cataloged and evaluated against specific criteria (e.g., cost, time, information quality, and actionable feedback).

Modeling Student Performance on a Game-Based Assessment

Ella Anghel from Boston College will work with Nathan Dadey and Will Lorié on modeling student performance on a game-based assessment. Ella’s project will focus on examining a game-based assessment in early literacy and mathematics that is administered over the course of a school year. By applying longitudinal psychometric modeling methods, the project aims to provide a comprehensive description of student performance on several targeted skills.

Ella’s work on this project will inform the ongoing development and use of within-year extended game-based assessments.

Learn more about our internship program, offered each summer to advanced doctoral students in educational measurement and/or assessment and accountability policy. 
