The National Association of State Directors of Special Education (NASDSE) has released results of conversations with its members on the early implementation of the new Results Driven Accountability (RDA) system. RDA is the new process by which the Office of Special Education Programs (OSEP) at the U.S. Department of Education evaluates states’ performance in meeting the requirements of the Individuals with Disabilities Education Act (IDEA). In addition to RDA, states must also develop a State Systemic Improvement Plan (SSIP) as part of their State Performance Plans, which requires states to identify one or more state-specific indicators on which to focus their improvement efforts.
RDA is designed to balance two interests. Under the new evaluation system, OSEP will examine, on the one hand, states’ compliance with the individual requirements of the IDEA – such as how Individualized Education Programs are constructed and whether timelines for evaluation of students are met – and, on the other hand, whether schools are achieving better educational outcomes for students. Once a state is evaluated, OSEP issues a determination letter rating the state in one of four categories – “Meets Requirements,” “Needs Assistance,” “Needs Intervention,” or “Needs Substantial Intervention.” Fifty percent of the determination is based on compliance and the other 50 percent on student results.
In June 2014, the first round of determinations using the RDA system resulted in a rating of “Meets Requirements” for 18 states, outlying areas, and freely associated states, with 36 deemed “Needs Assistance” and 6 rated “Needs Intervention.” The 2014 determinations for Part C (infants and toddlers) programs were not made based on the new system.
NASDSE asked for feedback from the state directors of special education on what OSEP got right in the new RDA process, where OSEP missed the mark, and recommendations for improving the system. After analyzing the responses collected through an interactive webinar and an online survey of its members, NASDSE developed the following recommendations for changes to the Results Driven Accountability process:
- Under RDA, one of the elements on which states are judged is participation in and scores from the National Assessment of Educational Progress (NAEP). NASDSE opposes the use of NAEP as an element for making current or future state determinations and notes that assessments must be used only for their intended purpose, not over-interpreted to rank order states. NAEP scores used for this year’s calculations should be treated as informational only and should not serve as the base year for future determinations.
- For state directors to be more knowledgeable about and more supportive of RDA, NASDSE recommends giving directors the opportunity, before the next round of determinations is made, to be actively engaged in examining, among other elements, the metrics used and the rank ordering of states. Directors acknowledge the importance of looking at both student results and compliance in assessing states’ performance. However, they believe the system as currently designed is flawed and should be discontinued, giving OSEP the opportunity to work closely with state directors and other stakeholders on a more appropriate process.
- Several years of implementation are normally required before new assessments and other changes yield valid and reliable data. States should be given that time to fully understand and put into place the new SSIP and RDA processes before determinations leading to sanctions are made. NASDSE notes this is consistent with the principles of implementation science frequently cited by the U.S. Department of Education, including OSEP, as an underlying theory of its work.
- NASDSE states “all means all,” i.e., that all students with disabilities should be included in calculations for participation and performance, including those students taking alternate assessments. Excluding those students is inconsistent with the IDEA because such a policy would result in unequal treatment.
- NASDSE supports making state determinations based on an individual state’s growth, both on compliance and on outcomes. States should not be ranked according to a single indicator or a combination of indicators. State directors want their states to be evaluated based on progress toward meeting the state-specific indicators they are required to develop under the SSIP. A state’s progress toward achieving those goals should not be compared to the progress, or lack of progress, made by other states, because state goals are designed to be individualized based on multiple factors within a given state.
NASDSE recommends OSEP use this year’s process as a “test year,” allowing time for adjustments based on important feedback from state directors and other stakeholders. They support the conceptual framework of looking at both compliance and student results in determining state performance; however, they also believe the system was implemented too quickly with insufficient input from the field and a lack of time for states to prepare for implementation.
An Opportunity for LDA Members
LDA is very fortunate to host Dr. Melody Musgrove, Director of OSEP, at the 2015 national conference in Chicago. Our members will have the opportunity to engage in a conversation with Dr. Musgrove about the State Systemic Improvement Plans and Results Driven Accountability, and to share LDA members’ perspectives on what’s working and what needs improvement. We hope you will avail yourselves of this chance to have direct input into these very important systems discussions.
Myrna Mandlawitz, M.Ed., J.D., is the Director of Public Policy for LDA of America. A native of Virginia, she has worked for over 20 years as a consultant/lobbyist on special and general education. Ms. Mandlawitz spent fourteen years as a classroom teacher and assisted in the development of Virginia’s program for infants and toddlers with disabilities. She is the immediate past president of the Committee for Education Funding, a coalition of 114 national organizations supporting increased federal investment in education.