National Profile on Alternate Assessments Based on Alternate Achievement Standards

NCSER 2009-3014
August 2009

Appendix B: Data Tables

The following data tables present individual state responses to each item for the 2006–07 school year, covering the 50 states and the District of Columbia. For simplicity, the District of Columbia is counted as one of the 51 "states."

Forty-nine states reported using a single alternate assessment based on alternate achievement standards in 2006–07. Michigan reported using two such alternate assessments; data for both are presented on a single row in the data tables, separated by a slash mark (/). Florida's alternate assessment was based on standards other than alternate achievement standards; some of the state's data are nonetheless included and should be interpreted with caution.

Unless noted otherwise, percentages describing the status of the alternate assessments were calculated using a base of 51. If either of Michigan's two alternate assessments met an item's criteria, the state was included in the percentage of states meeting those criteria.

The dagger symbol (†) is used in the data tables to indicate that data were not available, either because the item did not apply to that state's assessment for a specified reason or because the state was not asked to respond to the item; the reason is specified at the end of each table. In tables with mutually exclusive response options, when data for some states were not available, the listed percentages sum to less than 100 because those states remained in the base for the calculation.
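As a concrete illustration of these counting conventions, the short Python sketch below (with hypothetical state entries and field names; nothing here is taken from the report's actual data) shows how a table percentage would be computed: the denominator is always 51, Michigan counts once if either of its two assessments meets a criterion, and states marked † remain in the base, which is why mutually exclusive response options can sum to less than 100.

```python
# Illustrative sketch only; the entries below are hypothetical,
# not taken from the report's tables.

BASE = 51  # 50 states plus the District of Columbia

# One list entry per alternate assessment (Michigan has two);
# None marks data that were not available (†).
responses = {
    "Michigan": ["portfolio", "rating scale"],  # two assessments, one row
    "State A": ["portfolio"],
    "State B": [None],  # † — item did not apply or was not asked
    # ... remaining states would follow ...
}

def percent_meeting(responses, criterion):
    """Percentage of the 51 'states' with at least one alternate
    assessment meeting the criterion. States with unavailable data
    stay in the base, so mutually exclusive response options can
    sum to less than 100."""
    count = sum(
        1
        for assessments in responses.values()
        if any(r == criterion for r in assessments if r is not None)
    )
    return 100.0 * count / BASE

print(f"{percent_meeting(responses, 'portfolio'):.0f}% of states")
```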

Table 1

Summary of standard-setting methodologies

A. Overview
Table A1

Alternate assessment title

Table A2

Purposes of alternate assessment

Table A3

Alternate assessment approaches (structures/types of items used)

Table A4

What content areas were included in the alternate assessment?

Table A5

Grades assessed

Table A6

What was the time frame within which the alternate assessment occurred?

Table A7

How many state content standards were there for reading/language arts? On how many content standards in reading/language arts were students with significant cognitive disabilities assessed using the alternate assessment?

Table A8

How many state content standards were there for mathematics? On how many content standards in mathematics were students with significant cognitive disabilities assessed using the alternate assessment?

Table A9

Alternate assessment developer

Table A10

Who administered/assembled the alternate assessment?

Table A11

Who scored the alternate assessment?


B. Alternate Achievement Standards
Table B1

Who was involved in creating the alternate achievement standards for students with significant cognitive disabilities for reading/language arts and mathematics?

Table B2

Standard-setting methodologies used to develop alternate achievement standards

Table B3

What were the names for the advanced, proficient, and basic achievement levels for students being assessed based on alternate achievement standards for reading/language arts and mathematics?

Table B4

What descriptors applied to each achievement level for students being assessed based on alternate achievement standards for reading/language arts and mathematics?

Table B5

What cut scores were developed for reading/language arts and mathematics?


C. Technical Quality
Table C1

Who was involved in reviewing the technical characteristics of validity for the alternate assessment?

Table C2

Who was involved in reviewing the technical characteristics of reliability for the alternate assessment?

Table C3

Who was involved in reviewing the alignment of the alternate assessment with the state content standards and alternate achievement standards?

Table C4

Who was involved in reviewing fairness in the development of the alternate assessment?

Table C5

Did the state document the validity of the alternate assessment in terms of scoring and reporting structures consistent with the subdomain structures of its content standards?

Table C6

Did the state document the validity of the alternate assessment in terms of test and item scores related to internal or external variables as intended?

Table C7

What evidence supported the validity argument in terms of test and item scores related to internal or external variables as intended?

Table C8

Did the state document the validity of the alternate assessment in terms of purposes of the assessment, delineating the types of uses and decisions most appropriate and the assessment results consistent with the purposes?

Table C9

What evidence supported the validity of the alternate assessment in terms of purposes of the assessment, delineating the types of uses and decisions most appropriate and the assessment results consistent with the purposes?

Table C10

Did the state document the validity of the alternate assessment in terms of the assessment system's producing intended and/or unintended consequences?

Table C11

What evidence supported the validity argument in terms of the assessment system's producing intended and/or unintended consequences?

Table C12

Did the state document the validity of the alternate assessment in terms of measurement of construct relevance?

Table C13

What evidence supported the validity argument in terms of measurement of construct relevance?

Table C14

Did the state document the validity of the alternate assessment in terms of grade-level equating?

Table C15

Had the state content standards been extended or adapted to provide access for students with significant cognitive disabilities?

Table C16

How did the extended content standards map to the state content standards?

Table C17

Did the state document the reliability of the alternate assessment in terms of variability across groups?

Table C18

What evidence supported the reliability argument in terms of variability across groups?

Table C19

Did the state document the reliability of the alternate assessment in terms of internal consistency of item responses?

Table C20

Did the state document the reliability of the alternate assessment in terms of interrater consistency in scoring?

Table C21

What evidence supported the reliability argument in terms of interrater consistency in scoring?

Table C22

Had conditional standard errors of measurement (CSEMs) been reported for the alternate assessment?

Table C23

What was the initial process of aligning alternate achievement standards with the state content standards, and how was it validated?

Table C24

What ongoing procedures were used to maintain and improve alignment between the alternate assessment based on alternate achievement standards and state content standards over time?

Table C25

Was there a process to ensure fairness in the development of the alternate assessment?

Table C26

What evidence supported the process to ensure fairness in the development of the alternate assessment?

Table C27

Did the state document the validity of the alternate assessment in terms of implementation processes?

Table C28

What evidence supported the validity argument in terms of implementation processes?


D. Eligibility and Administration
Table D1

What were the guidelines for IEP teams to apply in determining when a child's significant cognitive disability justified alternate assessment?

Table D2

What procedures were in place for informing parents when their child would be assessed using an alternate assessment?

Table D3

How was assessment content selected?

Table D4

How was the administration process monitored and verified?

Table D5

What procedures were followed in gathering performance evidence?

Table D6

Describe the role of student work (videos, photographs, worksheets/products) in the alternate assessment.

Table D7

Did the assessment of student work (tasks or products) take place as part of the day-to-day instructional activities, or were students asked to perform tasks "on demand"?

Table D8

Describe the role of teacher judgment in the alternate assessment.


E. Scoring and Reporting
Table E1

How many scorers scored the alternate assessment?

Table E2

How were scoring conflicts resolved?

Table E3

What elements of student performance were used in scoring?

Table E4

What environmental elements were used in scoring?

Table E5

What types of training were provided for assessment administrators?

Table E6

What types of training were provided for assessment scorers?

Table E7

Who received individual student reports?

Table E8

How were individual student results on the alternate assessment expressed?

Table E9

For whom was interpretive guidance on the alternate assessment developed?

Table E10

Information included in reports given to parents
