NAEP Technical Documentation: An Overview of Interactive Computer Tasks (ICT)

Science interactive computer tasks (ICT) were administered as a probe during the 2009 NAEP operational assessment. Schools and students from grades 4, 8, and 12 were sampled for participation.

To avoid the variability in school computer lab equipment, configuration, and bandwidth encountered during the 2007 pilot and 2008 field test, the science ICTs were delivered using the Materials, Distribution, Processing, and Scoring (MDPS) staff’s web-based assessment delivery platform in "disconnected mode." That is, students navigated their ICT sessions using NAEP-supplied laptops and individual flash drives, eliminating the need for pre-assessment work to certify school computer labs and web connectivity.

The assessment delivery platform in disconnected mode was designed as a contingency for those few schools with

  • district/school internet connectivity, security, permissions and/or bandwidth issues;
  • computer labs with equipment that did not meet minimum hardware/software requirements for use of the testing platform online.

All of the 2009 science ICTs were administered in disconnected mode.

Data Delivery and System Setup

Student and session information was received via secure file transfer protocol (FTP). These data provided the foundation for deployment of the assessment delivery platform. The territory/region/school hierarchy was loaded, providing the framework for adding sessions and student barcode IDs and for subsequently creating the test sessions.
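
The internal data model of the assessment delivery platform is not described here, so the following is only a rough Python sketch of how a territory/region/school hierarchy, its sessions, and the associated student barcode IDs might be assembled from a delimited extract. The file name, column names, and layout are assumptions for illustration, not the actual MDPS schema.

    # Illustrative sketch only; column names and file layout are assumed.
    import csv
    from collections import defaultdict

    def load_sessions(path):
        """Group student barcode IDs under (territory, region, school, session) keys."""
        sessions = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f, delimiter="\t"):
                key = (row["territory"], row["region"], row["school"], row["session"])
                sessions[key].append(row["student_barcode"])
        return sessions

    if __name__ == "__main__":
        # Hypothetical extract file received via secure FTP.
        sessions = load_sessions("student_session_extract.tsv")
        for key, barcodes in sorted(sessions.items()):
            print(*key, f"{len(barcodes)} students")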

Unlike previous ICT assessments, user IDs and passwords were not created for the field operations staff to use during training, pre-assessment visits, and assessment administration. The use of NAEP-supplied laptops and flash drives precluded the need for field staff access to the administrative functions of the assessment delivery platform. Instead, the administrative work of session and student data management was performed by the MDPS staff.

Nearly 10,000 flash drives were purchased, formatted, labeled with NAEP barcode IDs, and bundled into session materials for shipment. The process for loading the assessment content onto each flash drive included a verification step to ensure that the test would run as expected; despite this quality-control check, some issues arose during administration of the science ICTs. Extra student barcode IDs were added to each testing session so that back-up flash drives would be available to the field staff. Student data were protected by the same methods used for securing student booklets. Flash drives were batched by session and securely shipped by bonded carrier for data retrieval and scoring.
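
The documentation notes only that a verification step was performed when loading content onto each drive; the actual check is not described. As a minimal sketch, the code below assumes a manifest of expected SHA-256 hashes and compares the files copied to a drive against it. The manifest format, drive path, and file names are hypothetical.

    # Hypothetical verification sketch: compares files on a flash drive
    # against a manifest of expected SHA-256 hashes (manifest format assumed).
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_drive(drive_root: str, manifest: str) -> list[str]:
        """Return a list of files that are missing or whose hash does not match."""
        problems = []
        root = Path(drive_root)
        for line in Path(manifest).read_text().splitlines():
            if not line.strip():
                continue
            expected_hash, rel_path = line.split(maxsplit=1)
            target = root / rel_path
            if not target.exists():
                problems.append(f"missing: {rel_path}")
            elif sha256_of(target) != expected_hash:
                problems.append(f"hash mismatch: {rel_path}")
        return problems

    if __name__ == "__main__":
        # Hypothetical drive mount point and manifest file.
        issues = verify_drive("E:/", "manifest.sha256")
        print("drive OK" if not issues else "\n".join(issues))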

Support of ICT Field Activities

As with the previous science ICT assessments, the MDPS staff provided training materials and support during the supervisor training. Each supervisor received four reusable flash drives representing the different test forms to be seen by the students. As a result, supervisors became familiar with both the laptop delivery method and the test content.

Unlike previous science ICT assessments, the MDPS staff did not provide help desk support during the assessment activities. Instead, the field operations staff served as the point of contact for technical issues encountered in the field, while the MDPS staff handled requests for replacement flash drives and read-aloud accommodation tests.

Before each ICT assessment session, the field operations staff was asked to verify that the student flash drives were working as expected. While effective in helping to identify malfunctioning flash drives, this extra step placed an additional burden on the field staff. Replacement units were produced and sent to the field as requested; following this verification step, requests were made for the following numbers of replacement flash drives:

  • 88 drives at grade 4,    
  • 5 drives at grade 8, and   
  • 3 drives at grade 12.

In addition, read-aloud accommodation flash drives were requested when a student required the accommodation. Barcode IDs were then added to the test session, and the flash drive was produced and shipped to the field. The total number of read-aloud accommodations requested at each grade was as follows:

  • 106 accommodations at grade 4,
  • 62 accommodations at grade 8, and
  • 63 accommodations at grade 12.

Content Support

The science ICTs were developed by the item development staff and then sent to the MDPS staff to be integrated and tested for use with the assessment delivery platform. Short- and extended-response cognitive tasks were assembled, along with the student general (core) background questions, into specific test forms for presentation to the students. In September 2008, it was decided that the science-specific background questions would not be administered via computer because of issues surrounding the use of “matrix,” or multi-part, questions. Following the 2008 field test, numerous and substantial changes were made to both the short- and extended-response cognitive tasks. Despite these changes, the MDPS staff was able to ship the supplies, including the flash drives, to the field as scheduled.

Data Delivery – Student Response Extracts

The ICT flash drives were shipped back to the MDPS staff and checked in with the other session materials. The student response files were then uploaded to the assessment delivery platform server, a manual process that required MDPS staff to log on to each student flash drive. The response files were then extracted and delivered to the data analysis staff for scoring of the multiple-choice items and the scorable student behaviors in the extended tasks; the open-ended student responses were scored via the MDPS staff’s scoring platform.

Two different data extracts were delivered to the data analysis staff. Student behavioral data from the extended-response tasks included, but were not limited to

  • on-screen object manipulation by the student,
  • the number of times embedded reference materials were accessed, and
  • the number of times the student repeated a simulation.

For the short-response tasks, only the multiple-choice and open-ended responses were collected for analysis and scoring. Student answers to the core background questions were also collected during the test sessions and delivered each week to the data analysis staff.
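
Assuming the behavioral data arrived in the XML extract described in the next paragraph, the sketch below tallies events of the kinds listed above per student. The element and attribute names (student, event, barcode, type) and the file name are assumptions for illustration, not the actual NAEP extract schema.

    # Illustrative only: element/attribute names are assumed, not the real extract schema.
    import xml.etree.ElementTree as ET
    from collections import Counter

    def tally_behaviors(xml_path: str) -> dict[str, Counter]:
        """Count behavioral events (e.g., object moves, reference lookups,
        simulation reruns) per student barcode ID from a hypothetical XML extract."""
        counts: dict[str, Counter] = {}
        for student in ET.parse(xml_path).getroot().iter("student"):
            barcode = student.get("barcode", "unknown")
            tally = counts.setdefault(barcode, Counter())
            for event in student.iter("event"):
                tally[event.get("type", "other")] += 1
        return counts

    if __name__ == "__main__":
        # Hypothetical extract file name.
        for barcode, tally in tally_behaviors("ict_responses.xml").items():
            print(barcode, dict(tally))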

The XML data, representing each student’s activity, were extracted from the delivery platform and delivered to the data analysis staff via secure FTP. After performance scoring, another data file containing the student demographic information and performance scores was provided to the data analysis staff for merging into the student record. This second file is the traditional score file that is provided after any NAEP performance scoring project.
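
As a rough illustration of the merge step described above, the sketch below joins a hypothetical performance-score file to student records on the barcode ID. The column names and CSV layouts are assumptions, not the actual NAEP file specifications.

    # Hypothetical merge sketch: joins performance scores to student records by barcode ID.
    # Column names and CSV layouts are assumed for illustration.
    import csv

    def merge_scores(student_file: str, score_file: str, out_file: str) -> None:
        with open(score_file, newline="") as f:
            scores = {row["student_barcode"]: row for row in csv.DictReader(f)}
        with open(student_file, newline="") as f_in, open(out_file, "w", newline="") as f_out:
            reader = csv.DictReader(f_in)
            fields = list(reader.fieldnames) + ["performance_score"]
            writer = csv.DictWriter(f_out, fieldnames=fields)
            writer.writeheader()
            for row in reader:
                score_row = scores.get(row["student_barcode"], {})
                row["performance_score"] = score_row.get("performance_score", "")
                writer.writerow(row)

    if __name__ == "__main__":
        # Hypothetical file names.
        merge_scores("student_records.csv", "score_file.csv", "merged_records.csv")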


Last updated 13 June 2016 (GF)