New Student Assessment Models More Valid?

Experts from the Rennie Center for Education Research and Policy and Policy Analysis for California Education (PACE) have released a report that makes the case for bold new student testing models, which it argues are fairer and more valid than their predecessors.

The report, “The Road Ahead for State Assessments,” arrives at a timely moment: many states are in the process of adopting the new Common Core State Standards in math and English language arts and are considering how to gauge students’ progress toward those standards.

"We must get beyond the incomplete snippets of information that current assessments provide for complex subjects like science inquiry," says Pendred Noyce, Rennie Center chair and trustee of the Noyce Foundation. "The Road Ahead for State Assessments,” commissioned by the Rennie Center and PACE, offers essential improvements for testing that will allow us to better evaluate students and prepare them for college and for STEM and other careers."

State education systems rely heavily on large-scale assessments to evaluate and improve student performance. Flaws in current assessment systems, however, blur the true picture of achievement for many students, the report finds. For example, there has been considerable debate about how best to measure the progress of students with special needs or limited English proficiency toward uniform academic standards, and whether such students should be provided accommodations for taking the tests or excluded altogether.

The three-part report urges the two U.S. Department of Education-funded consortia charged with developing new state assessments — the Partnership for the Assessment of Readiness for College and Careers (PARCC) and the SMARTER Balanced Assessment Consortium (SBAC) — to focus on designing assessments that take full advantage of new technologies to produce measures of student performance that are fair and accurate.

The three papers that make up the report each focus on one area. Robert Linquanti of WestEd reviews key problems in the assessment of English learners and identifies the essential features of an assessment system equipped to provide true measures of their academic performance.

Mark Reckase of Michigan State University discusses computer adaptive assessment and its potential to better evaluate where students are on a learning continuum. Computer adaptivity can, for example, support better assessment of English learners.

In science, paper-and-pencil, multiple-choice tests provide only weak and superficial information about students’ knowledge and skills. Chris Dede and Jody Clarke-Midura of Harvard University illustrate the potential for richer, more authentic measures of students’ scientific understanding with a case study of a virtual performance assessment now under development at Harvard.

The report offers informed recommendations for assessment policy, including the following:

  • Build a system that ensures continued development and increased reliance on computer adaptive testing, which is an essential foundation for a system that can produce fair and accurate measurement of students’ knowledge and skills. 
  • Ensure that new state assessments of academic learning are integrated and aligned with assessments that gauge English language proficiency (ELP). New state assessments will need to specify the academic language competencies that English learners need in order to gain mastery of the Common Core State Standards.
  • Include virtual performance assessments as part of comprehensive state assessment systems, as they have considerable promise for measuring students’ inquiry and problem-solving skills in science and other complex subjects.


Education World®    
Copyright © 2011 Education World