Getting More From Our Measures by Taking Advantage of Technology-Enhanced Items

Thursday, June 20, 2013: 10:15 AM-11:45 AM
Maryland 3-4 (Gaylord National Resort and Convention Center)
Presentations
  • OMAP Item Info NCSA Presentation 6-20-2013 v2.pdf (971.6 kB)
  • Online Mathematics Assessment Project -- Auto Scored CR and TE vs MC Item Information 6-20-2013.pdf (971.6 kB)
Content Strands:
  1. Transitioning assessment systems
  2. Improving data analysis, use, and reporting

ABSTRACT:
As states transition to new assessment systems delivered online, the expectation is that technology-enhanced items (TEIs) will allow the measurement of different, important aspects of content standards that paper-and-pencil technology cannot address in an on-demand assessment environment. Two recent studies investigated the knowledge and skills measured by TEIs. Results from field tests in multiple content areas across five states, covering middle school and high school grades, will be used to address questions of meaning: What are these items measuring? How can TEIs be scored to capitalize on the increased depth and breadth of content?

Four states (North Carolina, Kentucky, New Mexico, and South Carolina) participated in a study comparing the functioning of online multiple-choice (MC), constructed-response (CR), and technology-enhanced (TE) items in grade 7 mathematics and Algebra I. The tests were constructed so that items addressing the same targeted aspect of a mathematics content standard were administered either as MC items or as CR/TE items, allowing results to be analyzed by assessment target. Item information functions and other assessment-target-level data were collected to better understand how the items worked. In addition, student responses were analyzed by content experts to determine the types of information each item yielded. A member of the research team will present the item information and sample student responses, discussing the different kinds of information yielded by the two item types (MC vs. CR/TE). The results have implications for better understanding the construct validity of the information provided by different item types and for using those results to identify what students need to do next in learning the assessed content.
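For readers less familiar with the item information functions referenced above, the sketch below illustrates one conventional way such functions are computed, assuming a standard three-parameter logistic (3PL) IRT model; the item parameters shown are hypothetical illustrations, not values from the study.

    import numpy as np

    def item_information(theta, a, b, c=0.0):
        """Fisher information for a dichotomous item under the 3PL model.

        a: discrimination, b: difficulty, c: pseudo-guessing (c = 0 gives the 2PL case).
        """
        p = c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))
        return (a ** 2) * ((1.0 - p) / p) * ((p - c) / (1.0 - c)) ** 2

    theta = np.linspace(-3.0, 3.0, 121)
    mc_info = item_information(theta, a=1.0, b=0.0, c=0.20)  # hypothetical MC item
    te_info = item_information(theta, a=1.4, b=0.3, c=0.00)  # hypothetical TE item
    print(theta[mc_info.argmax()], theta[te_info.argmax()])

Comparing such curves item by item within an assessment target is one way to see where in the ability range a CR/TE version of an item provides more (or less) information than its MC counterpart.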

In anticipation of the work of the assessment consortia, Missouri began a pilot of TEIs delivered using an online system that spanned the content areas of Algebra, English Language Arts I, Biology, and U.S. Government. Two separate forms were constructed for each subject area and administered to approximately 500 students. In addition, scores from current assessments and self-reported grades were collected. The analyses of these data indicate convergent and divergent validity evidence for the new item types and show that the items perform reasonably well under current measurement models. However, further investigation suggests that these items can provide additional information about student understanding and learning of the content. This portion of the presentation will provide an overview of the analyses, highlight some of the additional information gained from these TEIs, and offer some concepts and initial thoughts about scoring these new item types.
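As background for the scoring discussion above, the sketch below shows one common way partially correct responses to TE or CR items can be placed on the same scale as other items within an IRT framework, assuming a generalized partial credit model; the parameters are hypothetical and are not drawn from the Missouri pilot.

    import numpy as np

    def gpcm_category_probs(theta, a, steps):
        """Category probabilities for a polytomous item under the generalized
        partial credit model; score categories run 0..len(steps)."""
        terms = np.concatenate(([0.0], a * (theta - np.asarray(steps))))
        numer = np.exp(np.cumsum(terms))
        return numer / numer.sum()

    # Hypothetical two-step TE item scored 0/1/2 (partial credit).
    print(gpcm_category_probs(theta=0.5, a=1.2, steps=[-0.4, 0.8]))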

Assessment staff members from Missouri and North Carolina will discuss the implications of these studies for state assessment and for the assessment consortia. They will address plans to implement TEIs in their state assessments, in part to assist the transition to the consortia assessments and to better measure the Common Core State Standards.
