


NAEP Technical Documentation: Processing Schedules and Overall Counts for 2006

       

2006 Processing Schedule

2006 Student Participation and Session Information

 


In the spring of 2006, the National Assessment of Educational Progress (NAEP) assessed a national sample of students in civics and U.S. history at grades 4, 8, and 12 and in economics at grade 12. In addition, NAEP field tested mathematics items at grades 4 and 8 and reading items at grades 4, 8, and 12 for pre-calibration purposes, and pilot tested writing items at grades 8 and 12.

Materials staff were responsible for the following tasks:

  • printing of test booklets and questionnaires,
  • materials packaging and distribution,
  • receipt control,
  • data capture through image and optical mark recognition scanning,
  • data editing and validation (illustrated in the sketch following this list),
  • preparation of training materials for scoring training,
  • data file creation, and
  • inventory control and materials storage.
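
The NAEP documentation does not specify these processing steps in code. As a rough illustration only, the sketch below shows the kind of check a data editing and validation pass might apply to scanned booklet records; the field names (booklet_id, grade, responses) and the rules themselves are hypothetical and are not taken from the actual NAEP processing system.

```python
import re

# Hypothetical record layout for one scanned student booklet; the real NAEP
# processing system uses its own formats and business rules.
VALID_GRADES = {4, 8, 12}
BOOKLET_ID_PATTERN = re.compile(r"^\d{8}$")   # assumed 8-digit booklet ID
VALID_MARKS = set("ABCDE") | {"", "*"}        # "" = omitted, "*" = multiple marks

def validate_record(record: dict) -> list[str]:
    """Return a list of edit flags for one scanned booklet record."""
    flags = []
    if not BOOKLET_ID_PATTERN.match(record.get("booklet_id", "")):
        flags.append("bad_booklet_id")
    if record.get("grade") not in VALID_GRADES:
        flags.append("grade_out_of_range")
    for position, mark in enumerate(record.get("responses", []), start=1):
        if mark not in VALID_MARKS:
            flags.append(f"unreadable_mark_item_{position}")
    return flags

if __name__ == "__main__":
    # A record with one unreadable mark would be flagged and routed to a
    # human editor for resolution before the data file is created.
    sample = {"booklet_id": "20060421", "grade": 8, "responses": ["A", "", "C", "?"]}
    print(validate_record(sample))  # ['unreadable_mark_item_4']
```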

NAEP staff received and processed more than 100,000 documents, of which approximately 94,000 were assessed student booklets. In addition to the student documents, processed materials included

  • fourth-grade teacher questionnaires;
  • eighth-grade mathematics teacher questionnaires;
  • eighth-grade reading teacher questionnaires;
  • eighth-grade civics/U.S. history teacher questionnaires;
  • twelfth-grade economics teacher questionnaires;
  • twelfth-grade economics department chair questionnaires;
  • fourth-, eighth-, and twelfth-grade school questionnaires;
  • students with disabilities (SD) questionnaires;
  • English language learners (ELL) questionnaires;
  • administration schedules; and
  • seven types of rosters.

Most questionnaires were offered both in traditional paper form and online; the SD, ELL, and Spanish questionnaires were not available online.

In addition, several special studies were conducted:

  • Sensitivity to Instruction Study (eighth-grade mathematics only): This study examined whether NAEP is sensitive enough to measure differences in cognitive achievement between the beginning of the school year (within 3 weeks of the first day of school) and the end of the school year (within 3 weeks of the last day of school). The study was jointly funded by the National Center for Education Statistics (NCES) and the NAEP Validity Panel.

  • Word Locator Study (fourth- and eighth-grade reading only): This study examined the effects of word locator cues (line numbering, bolding) on students' performance on items that measure vocabulary knowledge in the context of a passage; these items are called "meaning vocabulary items." The first block of the student booklet contained a released NAEP reading comprehension passage and 10–12 items. The next two half-blocks contained meaning vocabulary items. The booklets were designated A, B, or C; the only difference among the three booklets was the method of cueing (line numbering, bolding, or no cue).

  • Participation and Engagement Study (twelfth-grade mathematics and reading only): This study was cancelled.

Scoring of the 2006 NAEP assessment occurred at four sites:

  • Mesa, Arizona (civics, mathematics, and U.S. history);
  • Virginia Beach, Virginia (reading and writing);
  • Iowa City, Iowa (reading and writing); and
  • Tucson, Arizona (economics).

Representatives of the Quality Control contractor visited all four sites to audit processing—receipt control, data preparation, scanning, and editing. As the Quality Control Monitor for NAEP, the contractor was responsible for the quality checks that are built into the process.


Last updated 07 January 2010 (GF)
