New Questions to Ask, New Questions to Answer - Researching New Item Types

Saturday, June 22, 2013: 8:00 AM-9:30 AM
National Harbor 6 (Gaylord National Resort and Convention Center)
Presentations
  • NCSA cPass Overview.pdf (217.8 kB)
  • Kramer Relevant Item Context.pdf (377.5 kB)
  • Cameron Clyne CCSSO Presentation Research.pdf (549.0 kB)
  • McBride - What's in a Name.pdf (413.4 kB)
Content Strands:
  1. Transitioning assessment systems

ABSTRACT:
    The Career Pathways Assessment System (cPass) Collaborative, composed of multiple states and a university, is developing innovative assessments to measure student achievement and workforce readiness based on career, vocational, and technical education. The Collaborative is investigating non-traditional item types and their relevance to students while assessing competence in the career pathways. The cPass project compared technology-enhanced items to their multiple-choice counterparts, investigated the pervasiveness of gender-based stereotypes in the workplace using situational judgment tasks (SJTs), and examined contextual effects of “story problems” used to measure a mathematical skill in a career context. This session presents the results of these research projects.

    The cPass project is developing valid, authentic, and reliable items to measure students’ expertise and mastery of their chosen career pathway, going beyond traditional multiple-choice item formats. Multiple-choice items traditionally have one correct answer, and item developers avoid asking about preferences, opinions, or beliefs. By comparison, SJTs ask examinees to select one of several possible correct outcomes, given a certain circumstance. The selected answer in this case provides insight not only into declarative and procedural knowledge but also into examinees’ ability to think through several possible outcomes and their level of professional development. Other item types used in cPass include technology-enhanced items such as editing, ordering steps in a process, comparing and contrasting characteristics, and problem-solving, all in contexts that would be relevant for a student in any career pathway. Research is needed on these item types as states transition to new ways of assessing students.

    Study 1. Some of the employability skills to be measured in cPass include leadership, maintaining a work-life balance, and conflict resolution. During reviews of SJT items developed to measure these skills, reviewers had different expectations for the “most correct action,” depending on the gender (as revealed by the name) of the character in the item. For example, in conflict resolution, reviewers expected female characters to be more persuasive and male characters to be more directive. To examine this finding further, male, female, and gender-neutral versions of these items were created. This study compares the responses of students and expert panelists to the SJT scenarios according to the gender of the character.

    Study 2. Technology-enhanced items leverage the computer-based testing environment to present authentic tasks, although in some cases, an item could be rendered in a multiple-choice context without loss of fidelity to the standard. However, one advantage of a technology-enhanced item is that it can ask the equivalent of several multiple-choice items at once. Additionally, a technology-enhanced item can evaluate a student’s procedural skills rather than just declarative knowledge or ability to select procedural or conceptual information from a list of options. Several of the technology-enhanced editing items were also rendered in multiple-choice format. This study assesses the information gained from technology-enhanced editing items compared to the multiple-choice versions.

    Study 3. To make items relevant and authentic, some questions test mathematical skills in the context of a “story problem.” While story problems are generally viewed as more realistic, they place additional cognitive demands on students. To examine this effect, a series of questions was developed, all using the same dataset but with different contexts. For example, for the pathway Manufacturing/Production, the data were production efficiency values for a factory; for Plant Systems, the data were plant heights as recorded by an agronomist. The student does not need specific pathway knowledge to answer the question; the context merely provides a real-world setting for the collection of the hypothetical dataset. This study examines differences in student performance on the items based on the presence and familiarity of the item context.