
Federal Innovative Assessment Pilot: Now Open to All States

U.S. ED Encourages States to Apply; $22 Million Potentially Available

In a bid to get more states to develop “innovative” assessments, the U.S. Education Department has lifted the cap on its innovative assessment pilot so any state may apply.

The announcement came at the department’s State Assessment Conference Sept. 26-27 in Arlington, Va. Until now, the Innovative Assessment Demonstration Authority (IADA) has been open to a maximum of seven states (or consortia of states), but only a handful have participated, and none have applied since 2020. To jump-start interest, USED officials removed that cap entirely.

As you probably remember, the IADA allows states to use competency-based and other models in the assessments they use for federal accountability, offering two types of flexibility in their testing programs: States can run their new and previous assessment systems at the same time without requiring students to take both tests, and they can try out new models with a subset of districts before deciding whether to scale them statewide by the end of the program.

The last window for submitting IADA applications closed in March 2021. But at the conference, USED announced two new annual rolling application windows, starting in May 2024:

  • The first Friday in May, with potential approvals by August so work can start in the fall of that school year
  • The first Friday in December, with potential approvals by March for implementation the following school year

USED also funds a Competitive Grants for State Assessment (CGSA) program on a semi-regular basis. The department has the authority to set funding priorities, such as creating assessments for English learners; two years ago, the priorities included IADA planning or implementation.

While details about the newest CGSA guidelines have not yet been released, there are plans afoot to award $21.9 million before September 30, 2024. States may be allowed to use some of that appropriation to support IADA planning or implementation. 

Center Executive Director Scott Marion and Senior Associate Carla Evans led the IADA strand of the USED conference, so we can bring you these newsy tidbits firsthand.

Longstanding Challenges of Federal Testing Requirements

More than 225 people from 40 states registered for the conference. It featured discussion in three focus areas: IADA, participation rates in alternate assessment, and how the federal peer-review process might accommodate new models of accountability testing, such as through-year assessments, performance-based assessments, or matrix sampling of content (Center Senior Associate Nathan Dadey helped lead the peer-review strand). You can see all the materials from the conference here.

Tensions have been baked into IADA from the start, as Scott and Carla said at the conference. (See their overview slide presentation here, and the summary notes from the overview session here.) Flexibility, innovation, and the desire to get “instructionally useful” information pulled some IADA states in one direction, while the need to standardize and scale the assessments and to produce valid, reliable, and comparable results that can be used in school accountability systems pulled them in another. 

More than a few barriers have hobbled state participation in IADA, Scott and Carla noted, from the requirements of the program itself (designing and scaling the new tests statewide within five years, for instance) to turnover in state leadership and the anticipated difficulty of getting the new tests through federal peer review so they can be used for accountability. (There was a whole separate session on peer-review issues for IADA assessments that used the approved states as examples.)

Federal Rules Complicate the Innovative Assessment Pilot

In a report earlier this year, the Institute of Education Sciences studied states’ IADA progress during the 2020-2021 school year. COVID’s effects on schools that year undoubtedly made the assessment work tougher, but the study noted features of the program itself that challenged states’ progress.

We’ve written a lot about the dilemmas inherent in assessment innovation within the bounds of federal rules. Georgia’s assessment chief, Allison Timberlake, wrote recently about the difficulties that led her state to pull out of the IADA. Scott criticized the IADA’s comparability requirements as a key barrier to innovation. 

But we also worry about the other side of the dilemma: What happens if we take new models of testing—like performance assessment, which could yield meaningful information for teachers—and use them for federal accountability?

Assessment innovation comes with plenty of difficulties; the presentations from USED’s assessment conference—all provided on the conference’s website—supply abundant food for thought.
