March 7th, 2011

Researchers receive national award for new psychological measurement

Writer: Michael Childs, 706/542-5889, mdchilds@uga.edu
Contact: Allan Cohen, 706/542-9722, acohen@uga.edu

Published in Awards / Honors, EPIT, Press Releases

A team of University of Georgia education researchers has received a national award for the development of a new statistical method for measuring the growth of students’ problem-solving skills in mathematics.

This new method enables researchers to simultaneously measure both the different ways that students can reason about individual questions on a mathematics test and their overall growth in mathematics ability.

The researchers were named recipients of the 2011 Award for an Outstanding Example of an Application of Educational Measurement Technology to a Specific Problem from the National Council on Measurement in Education for their work.

The team included lead author Sun-Joo Cho, a UGA doctoral graduate and assistant professor of educational psychology at Vanderbilt University; UGA College of Education professors Allan Cohen, director of the Georgia Center for Assessment, and Seock-Ho Kim; and Brian A. Bottge, the William T. Bryan Endowed Chair in Special Education and Rehabilitation Counseling at the University of Kentucky.

Current methods for detecting growth in students’ math skills rely on score-level analysis and may fail to reflect subtle changes that are evident at the item level, said Cohen. The new method combines an analysis of individual performance on each test question with a deeper analysis of the differences in reasoning that students use to answer each question.

The team published two articles in 2010 describing how item-level changes could be measured using data from a multiwave experiment with a teaching method called Enhanced Anchored Instruction (EAI). EAI is a method specially designed for developing the math skills of low-achieving adolescents, including students with learning disabilities in math.

One article, “Latent Transition Analysis with a Mixture Item Response Theory Measurement Model,” appeared in the journal Applied Psychological Measurement. A second, “Detecting Cognitive Change in the Math Skills of Low-Achieving Adolescents,” was published in the Journal of Special Education.

In 2008, the National Mathematics Advisory Panel recommended expanding the ways that skills are measured because assessment drives instructional decision making. According to the panel, more collaboration is needed between psychometricians, who have specialized knowledge in data analysis models, and teaching professionals, who are most familiar with the constructs that the tests are intended to measure. The result, it is hoped, will be new kinds of assessments that are of high quality, more sensitive to new forms of instruction, and more helpful to teachers in customizing their instruction for improved student performance.

Despite calls for research on assessments that can improve learning, there are few, if any, strategies available for helping low-achieving adolescents develop the cognitive representations needed to make sense of the types of mathematics items typically included in high-stakes assessments.

Typical test accommodations for students with disabilities include extended time and oral reading of test directions or items, but neither strategy significantly helps students with math disabilities understand the concepts addressed in test questions.

In this study, the researchers showed how the Latent Transition Analysis method and the mixture Rasch model can be combined (LTA-MRM) and used to examine the effects of instruction beyond the score level and beyond the effects of manifest variables such as demographic group membership, gender and age. The model provides a way of identifying groups of students that are latent, meaning they are not immediately visible until the method identifies them. Once students are assigned to a latent group, further study can determine the differences in reasoning about specific mathematics problems that characterize each group.
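The latent-group idea above can be illustrated with a small sketch. This is not the authors’ implementation; it is a minimal, hypothetical example assuming a two-class mixture Rasch model with known item difficulties and class proportions, showing how an examinee’s response pattern yields posterior probabilities of belonging to each latent class:

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct) under the Rasch model: logistic(theta - b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Two hypothetical latent classes with different item-difficulty
# profiles on five items, e.g. reflecting different solution strategies.
# These numbers are illustrative, not from the study.
b_class = np.array([
    [-1.0, -0.5, 0.0,  0.5,  1.0],   # class 0: items get harder in order
    [ 1.0,  0.5, 0.0, -0.5, -1.0],   # class 1: reversed difficulty profile
])
pi = np.array([0.5, 0.5])  # assumed prior class proportions

def class_posterior(responses, theta):
    """Posterior probability of each latent class, given a 0/1 response
    vector and a fixed ability value theta (Bayes' rule over classes)."""
    like = np.empty(len(pi))
    for g in range(len(pi)):
        p = rasch_prob(theta, b_class[g])
        like[g] = np.prod(np.where(responses == 1, p, 1.0 - p))
    post = pi * like
    return post / post.sum()

# An examinee who misses the items class 1 finds hard and answers
# the rest correctly: the posterior favors class 1.
resp = np.array([0, 0, 1, 1, 1])
post = class_posterior(resp, theta=0.0)
print(post)
```

In a full LTA-MRM analysis, the class-specific difficulties and proportions would be estimated from the data rather than fixed, and the latent transition component would track how class memberships change across the waves of the experiment.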

Analysis methods such as the LTA-MRM are important for building new teaching and measurement models. First, the new model has the potential to guide development of appropriate instructional tools and to evaluate how they are working.  This is especially important for teaching low-achieving populations such as students with learning disabilities. This would benefit teachers and administrators who are looking for ways to assess students’ response to intervention (RTI). Second, the model can incorporate new ways of helping teachers consider both the potential heterogeneity in RTI and the effects of different instructional interventions over time. This would make test results more meaningful to teachers and address some of the concerns described in the Mathematics Advisory Panel report.

Finally, the model can give more credence to researcher-developed tests and provide a more sensitive alternative to standardized tests, thereby helping to eliminate false negatives in developmental research.

The award will be presented at the NCME’s annual conference April 7-11 in New Orleans.
