2017 Student Participation and Session Information
The National Center for Education Statistics (NCES) conducted the 2017 National Assessment of Educational Progress (NAEP) with staff from several organizations under the umbrella of the NAEP Alliance. Alliance staff were responsible for item development, sample design, materials processing, data analysis, and program management.
The NAEP 2017 assessment comprised student assessments in mathematics, reading, writing, civics, US history, and geography, administered in both paper-based and digitally based formats.
Materials processing and scoring staff carried out the NAEP 2017 tasks described below, from booklet printing and questionnaire processing to the scoring of student responses.
For NAEP 2017, a total of 348 types of scannable student booklets were designed and printed, along with the non-scannable large-print and braille student booklets used in the paper-based assessment.
School and teacher questionnaires were offered online through a secure file transfer protocol (FTP) site, which was available from early December 2016 through mid-March 2017. Scannable questionnaires were provided to respondents upon request.
Approximately 115,000 questionnaires were entered online and 9,000 scannable questionnaires were completed for the operational assessment, the digitally based pilot, and the Puerto Rico proof-of-concept assessments.
In NAEP 2017, approximately 176,000 students were assessed in the paper-based mathematics and reading operational assessments; 585,000 in the digitally based mathematics and reading operational assessments; 115,000 in the integrated mathematics, mathematics KaSA, reading, civics, US history, and geography pilots; 47,000 in the writing operational assessment; 6,000 in mathematics KaSA in Puerto Rico; and 3,000 in the writing comparability study.
Scoring of NAEP 2017 occurred in Iowa City, IA, for writing, civics, US history, and geography; in Norfolk, VA, for the reading operational assessment and integrated pilot; and in Mesa, AZ, for the mathematics operational assessment and integrated pilot. As in past cycles, project and content specialists were on site to monitor scoring, and there was daily communication between staff at the scoring sites and project staff in the District of Columbia; Iowa City, IA; and Princeton, NJ.

Reports were available to a variety of staff from NCES, the NAEP program, and organizations within the Alliance. These included the Trend Reliability and Mean Comparison (TRMC) Report, which highlighted items that needed further discussion, and the Completion Report, which showed detailed scoring information for each item as well as cumulative current and historical monitoring statistics for both trend and non-trend items. More than 7 million responses from approximately 1,200 items were scored. All NAEP 2017 responses were scored from April 3 through June 23, 2017.