PARCC Text Complexity and Cognitive Complexity Measures: Their Role in Assessment Development and in Supporting Claims about Student Proficiency and Readiness

Saturday, June 22, 2013: 8:00 AM-9:30 AM
Maryland 3-4 (Gaylord National Resort and Convention Center)
Content Strands:
  1. Transitioning assessment systems
  2. Improving data analysis, use, and reporting
ABSTRACT:
The PARCC consortium is developing CCSS-aligned assessments with several priorities in mind. In this session we describe (a) multidimensional frameworks for addressing text and item cognitive complexity that account for multiple sources of complexity, and (b) the role of target distributions of text and item cognitive complexity in engineering an assessment that meets PARCC priorities. We describe conceptual and empirical support for the frameworks, how they will be used in the item and test development process, and plans for refining and validating the frameworks. We also contrast the PARCC cognitive complexity framework with other frameworks.

This session is relevant to the conference strands "Transitioning assessment systems" and "Improving data analysis, use, and reporting." Presenters will address these strands as they describe PARCC’s assessment aims, claims, and evidence about student achievement; text complexity and cognitive complexity frameworks for ELA/literacy and mathematics; and the intended roles of the frameworks in item and test development and score reporting.

In presentation 1, a PARCC state representative will describe the aims of the PARCC assessments (i.e., assessing college and career readiness and assessing achievement growth), with attention to design and development innovations such as evidence-centered design (ECD), the inclusion of performance-based and diagnostic assessment components, and the focus on cognitive targets as well as content standards targets.

A second presenter will describe quantitative and qualitative text complexity measures for literary and informational reading passages and multimedia stimuli, and the process for applying these measures and arriving at a final determination of text complexity. The quantitative measures of text complexity are those produced by the SourceRater, Reading Maturity Metric, and Lexile text analysis programs. Content area experts will apply rubrics to create the qualitative measures of text complexity: for literary and informational passages, Meaning (literary passages only), Purpose (informational passages only), Text Structure, Language Features, and Knowledge Demands; and for multimedia stimulus material, Use of Graphics, Audio Stimuli, and Visual/Video Stimuli. The experts will then assign judgmental ratings of Very Complex, Moderately Complex, or Readily Accessible to text and multimedia stimuli. This presenter will (a) describe the role of these measures in selecting reading passages and multimedia stimuli for reading comprehension and composition items, (b) summarize the conceptual and empirical support for the quantitative and qualitative complexity measures and their use in determining text complexity, and (c) summarize text complexity analysis results.
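For readers who want a concrete picture of the final-determination step, the sketch below shows one way the three quantitative metrics could be reconciled with an expert rubric rating. It is purely illustrative: the cut points, field names, and reconciliation rule are hypothetical and are not drawn from PARCC documentation.

    from dataclasses import dataclass

    # Hypothetical rubric levels used by content-area experts, least to most complex.
    QUALITATIVE_LEVELS = ["Readily Accessible", "Moderately Complex", "Very Complex"]

    @dataclass
    class PassageMeasures:
        # Quantitative scores, assumed already mapped to a common grade-band scale.
        source_rater: float
        reading_maturity: float
        lexile: float
        # Expert rubric rating (one of QUALITATIVE_LEVELS).
        qualitative: str

    def quantitative_band(scores: list[float], low: float, high: float) -> str:
        """Map the average quantitative score to a complexity band.

        The cut points `low` and `high` are placeholders, not PARCC values.
        """
        mean = sum(scores) / len(scores)
        if mean < low:
            return "Readily Accessible"
        if mean < high:
            return "Moderately Complex"
        return "Very Complex"

    def final_determination(p: PassageMeasures, low: float = 3.0, high: float = 7.0) -> str:
        """Reconcile the quantitative and qualitative bands.

        Illustrative rule: when the two sources disagree, defer to the more
        complex rating so borderline passages are flagged for expert review.
        """
        quant = quantitative_band([p.source_rater, p.reading_maturity, p.lexile], low, high)
        return max(quant, p.qualitative, key=QUALITATIVE_LEVELS.index)

    # Example: quantitative measures suggest moderate complexity; experts rate it Very Complex.
    passage = PassageMeasures(source_rater=5.2, reading_maturity=6.1, lexile=4.8,
                              qualitative="Very Complex")
    print(final_determination(passage))  # -> "Very Complex"

Deferring to the more complex band is only one plausible reconciliation rule; the session itself will describe the process PARCC actually uses to arrive at a final determination.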

The third presenter will describe the cognitive complexity frameworks for ELA/literacy and mathematics items, conceptual and empirical support for the sources of complexity, and the role of the cognitive complexity measures in item and performance task writing and review and in test form assembly. The multidimensional cognitive complexity framework for ELA/literacy includes four sources: Text Complexity (described above), Command of Textual Evidence, Response Mode, and Processing Demands. The framework for mathematics includes Mathematical Content, Mathematical Processes, Stimulus Material, Response Mode, and Processing Demands. The presenter will (a) propose ideas for PARCC to consider for validation studies and refinement of the cognitive complexity frameworks, (b) describe a proposed multistep tree regression process to validate item writers' judgments of cognitive complexity, and (c) summarize cognitive complexity analysis results.
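As a hedged illustration of what a tree regression validation step might look like (the proposed multistep process is not specified in this abstract), one could regress empirical item difficulty on the rated sources of complexity and examine which sources the fitted tree relies on. All data, coefficients, and variable names below are invented for the example.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    # Hypothetical ratings: each row is an item, columns are the rated sources of
    # cognitive complexity (e.g., Text Complexity, Command of Textual Evidence,
    # Response Mode, Processing Demands), coded 1-3 by item writers.
    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 4, size=(200, 4)).astype(float)

    # Hypothetical empirical difficulty (e.g., field-test difficulty estimates),
    # simulated here as a noisy function of the ratings.
    difficulty = ratings @ np.array([0.5, 0.3, 0.1, 0.2]) + rng.normal(0, 0.4, size=200)

    # A shallow regression tree keeps the model interpretable: its splits show
    # which rated sources best predict empirical difficulty.
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    scores = cross_val_score(tree, ratings, difficulty, cv=5, scoring="r2")
    print("Cross-validated R^2:", scores.mean().round(2))

    tree.fit(ratings, difficulty)
    print("Relative importance of each complexity source:", tree.feature_importances_.round(2))

In a sketch like this, large gaps between predicted and observed difficulty, or sources that carry little predictive weight, could flag items or framework dimensions for further review and refinement.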

In presentation 4, a PARCC state representative will discuss the role of text and cognitive complexity in developing evidence from test performances to support claims about student proficiency and readiness. The presenter will also discuss the role of these frameworks in plans for reporting student test performance, achievement growth, and college and career readiness and for providing feedback to students, their teachers, and their families.

A discussant will examine the role of text and cognitive complexity in engineering assessment design and development processes in order to support claims and develop evidence about student achievement, growth, and readiness.
