
NAEP Technical Documentation: Processing Schedule and Overall Counts for the 2011 Assessment

2011 Student Participation and Session Information

In the spring of 2011, the National Assessment of Educational Progress (NAEP) assessed both nationally representative and state samples of students in mathematics and reading at grades 4 and 8, and in science at grade 8. A nationally representative sample of students at grades 8 and 12 was assessed in writing via computer. Special studies included Knowledge and Skills Appropriate (KaSA) mathematics at grades 4 and 8 and a Trends in International Mathematics and Science Study (TIMSS)/NAEP linking study at grade 8. A pilot test in economics was administered at grade 12 using traditional paper-and-pencil booklets.

Materials staff was responsible for the following tasks:

  • printing test booklets and questionnaires,
  • packaging and distributing materials,
  • receipt control,
  • data capture through image and optical mark recognition scanning,
  • data editing and validation,
  • creating training sets for new items,
  • training and performance scoring of constructed response items,
  • data file creation, and
  • inventory control and materials storage.

For the assessment, NAEP staff designed 553 student booklet types, 22 questionnaires, 10 worksheets, and 10 tracking forms. Almost 2.8 million forms were printed, and approximately 1 million student documents and more than 500,000 questionnaires and worksheets were scanned. Most questionnaires were offered either in traditional paper form or online at www.naepq.com; the site was available from January 3 through March 18, 2011.

Scoring of the 2011 NAEP Assessment occurred at four sites:

  • Mesa, Arizona (mathematics and economics)
  • Virginia Beach, Virginia (reading)
  • Lansing, Michigan (science)
  • Columbus, Ohio (writing)

As in past cycles, project and content specialists were on site to monitor scoring. The scoring sites communicated daily with project staff in the District of Columbia; Iowa City, Iowa; and Princeton, New Jersey. Reports included the Trend Reliability and Mean Comparison (TRMC) Report and the Completion Report; a macro applied to the TRMC Report highlighted items that needed further discussion. More than 8.7 million responses were scored from March through August.
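As an illustration only, the flagging step might resemble the following minimal sketch. The source does not describe the actual macro, its input format, or its criteria; the column names, item identifiers, and cutoff values below are hypothetical.

```python
# Illustrative sketch of TRMC-style flagging. Column names and cutoffs are
# hypothetical assumptions, not NAEP's actual criteria.
import csv

RELIABILITY_CUTOFF = 0.85   # hypothetical minimum acceptable trend reliability
MEAN_DIFF_CUTOFF = 0.05     # hypothetical maximum acceptable mean-score difference


def flag_items(trmc_csv_path):
    """Return IDs of items whose reliability or mean comparison warrants discussion."""
    flagged = []
    with open(trmc_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            reliability = float(row["trend_reliability"])
            mean_diff = abs(float(row["current_mean"]) - float(row["trend_mean"]))
            if reliability < RELIABILITY_CUTOFF or mean_diff > MEAN_DIFF_CUTOFF:
                flagged.append(row["item_id"])
    return flagged


if __name__ == "__main__":
    for item in flag_items("trmc_report.csv"):
        print(f"Review item: {item}")
```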


Last updated 17 March 2016 (GF)