AQuESTT EBA and Classification Updates
Sue Anderson, [email protected]
Matt Hastings, [email protected]

AQuESTT Updates
• Evidence-based Analysis (EBA)
  • Key Findings
  • Operations and Proposed Next Steps
• Classification of Schools and Districts
  • Proposed Timelines and Next Steps
Evidence-based Analysis (EBA) 2015 Findings
2015 AQuESTT Evidence-based Analysis (EBA) Activities
[Process diagram: the 2015 AQuESTT Classification of Schools begins with a raw classification built from the indicators Status (NeSA-RMSW), Adjustment for Improvement, Adjustment for Growth, Adjustment for Non-Proficient, Limits for Participation, and Limits for Graduation; the Evidence-based Analysis (EBA) informs supports and possible adjustments, leading to the Final Classification Report.]
2015 EBA Activities – Key Findings
• Personal Student Learning Plans
• Students manage and monitor their own learning
• Successful transitions for all students
• Processes for on-time grade completion
• Support for students at risk of dropping out
• Online learning opportunities supplement face-to-face instruction
• Evaluation of new program effectiveness
• Curriculum systematically reviewed/modified
• Students (elementary) receive career awareness instruction
• Assessment information/results shared in a timely manner with teachers, administrators, parents, and students
• Technology infrastructure meets teaching needs of faculty and staff
2015 EBA Requests for Support
• Personal Student Learning Plans
• Measuring student engagement
• Strategies for family attendance/participation
• Safe, secure learning environment expectations
• Process for addressing the needs of highly mobile students
• Process to identify and support students at risk of dropping out
• Process to support on-time grade completion
• Supplemental online learning opportunities
• Program evaluation
• Before/after school programs
• Curriculum aligned to Career Readiness Standards
• Career awareness, exploration, and preparation instruction
• Utilize formative, classroom-based assessments
• Sharing assessment information in a timely manner
• Measuring/addressing teacher engagement
• Technology to support teaching and learning
• Formal staff evaluation process aligned to NTPPF
Evidence-based Analysis (EBA) Operations – What’s Next
EBA Operations
• Empanel the EBA Advisory Committee
• Enhancement decisions related to psychometric findings
• Timeline for the next EBA fielding
  • Spring 2017
  • Access through the NDE Portal
• Rule 10 Assurances Statement will be separate
  • Fall 2017
  • Access through the NDE Portal
Psychometric Review
• Good empirical evidence for the utility of the Evidence-Based Analysis (EBA) questionnaires for schools and districts:
  • Support for measurement of six distinct tenets
  • All item responses showed significant prediction by the tenets they were supposed to measure
Psychometric Recommendations
1. Model-Based
• Simultaneous estimation provides more reliable tenet estimates than separately computed mean or sum-scores (see the sketch after this list)
• Some items contribute more than others to measuring the tenets
• Makes possible the use of customized, item-specific response formats to obtain more precise information than a standard response scale
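The slides do not name the estimation model behind this recommendation, so the following is only a rough sketch of the idea that simultaneous, model-based estimation weights items by how well they measure a tenet, whereas a sum-score weights them all equally. The data, item counts, and loadings are made up for illustration, and scikit-learn's FactorAnalysis stands in for whatever latent-variable model the review actually used.

```python
# Rough sketch only: simulated data and FactorAnalysis stand in for the
# (unspecified) model the psychometric review used. Nothing here is AQuESTT data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical EBA-style responses: 200 schools x 12 items on a 4-point scale,
# all items measuring one simulated tenet but with unequal strength (loadings).
n_schools, n_items = 200, 12
true_tenet = rng.normal(size=(n_schools, 1))
loadings = rng.uniform(0.3, 1.0, size=(1, n_items))        # items contribute unequally
noisy = true_tenet @ loadings + rng.normal(scale=0.7, size=(n_schools, n_items))
responses = np.digitize(noisy, bins=[-1.0, 0.0, 1.0]) + 1  # map to 1..4

# Sum-score: every item counts equally, however weakly it measures the tenet.
sum_scores = responses.sum(axis=1)

# Model-based score: items are estimated simultaneously, so better-discriminating
# items carry more weight in the tenet estimate.
fa = FactorAnalysis(n_components=1, random_state=0)
model_scores = fa.fit_transform(responses.astype(float)).ravel()

# Compare recovery of the simulated tenet (abs() because a factor's sign is arbitrary).
r_sum = abs(np.corrcoef(sum_scores, true_tenet.ravel())[0, 1])
r_model = abs(np.corrcoef(model_scores, true_tenet.ravel())[0, 1])
print(f"sum-score recovery:   r = {r_sum:.3f}")
print(f"model-based recovery: r = {r_model:.3f}")
```

The exact numbers depend entirely on the simulated parameters; the point is only to show the two scoring approaches side by side.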
Psychometric Recommendations
2. Revise the Current 4-Point Response Scale
• The “Never” scale point was hardly used
  • e.g., consider “Rarely” or “Seldom”
• The “Usually” scale point does not go high enough
  • e.g., consider “Almost Always”
  • 1 = Seldom, 2 = Sometimes, 3 = Often, 4 = Almost Always
• And/or consider creating item-specific response scales
  • A standard scale is only necessary when means or sum-scores are used to measure a tenet; model-estimated tenets remove this constraint
  • Instead of “Seldom”, further define what is meant by seldom for a given item (e.g., once a month/semester/decade)
  • Prototype responses (rubric-like) that illustrate more conceptually what is meant by alternative scale points
Psychometric Recommendations
3. Item Content and Coverage
• Provide clarity for nouns without clear referents (e.g., strategies, processes)
• Prioritize enhancements to EBA items with relatively low item discrimination compared to others (see the sketch after this list)
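The review does not say how item discrimination was estimated; as a loose illustration, the classical corrected item-total correlation below is one simple proxy for flagging items that discriminate poorly relative to the rest. The example data and the flagging threshold are arbitrary, and the review's model-based discrimination estimates may differ.

```python
# Rough sketch only: corrected item-total correlation as a simple, classical
# proxy for item discrimination; the review's model-based estimates may differ.
import numpy as np

def corrected_item_total(responses: np.ndarray) -> np.ndarray:
    """Correlation of each item with the total of the remaining items.

    responses: (n_respondents, n_items) matrix of scale responses, e.g. 1-4.
    """
    responses = responses.astype(float)
    totals = responses.sum(axis=1)
    discs = np.empty(responses.shape[1])
    for j in range(responses.shape[1]):
        rest_total = totals - responses[:, j]   # exclude the item itself
        discs[j] = np.corrcoef(responses[:, j], rest_total)[0, 1]
    return discs

if __name__ == "__main__":
    # Hypothetical 4-point responses from 150 respondents to 10 items.
    rng = np.random.default_rng(1)
    demo = rng.integers(1, 5, size=(150, 10))
    disc = corrected_item_total(demo)
    # Arbitrary rule of thumb: flag items well below the average discrimination.
    flagged = np.where(disc < disc.mean() - disc.std())[0]
    print("item discriminations:", disc.round(2))
    print("candidates for revision (item indices):", flagged)
```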
Psychometric Recommendations
4. Added Information to Further Validity
• Model fit and precision of measurement indicate that responses are consistent
• Provide an opportunity to add context for responses and better “tell our story”
• Could also be considered for support items
Classification of Schools and Districts
Classification Considerations
• Upgrades to the EBA
• EBA fielding – Spring 2017
• Identifying additional AQuESTT/School Quality Indicators
• Transition to newly aligned state assessments
• Incorporating anticipated ESSA requirements into the AQuESTT system
AQuESTT Classification Cycle For 2015-2016 (Transition Year)
• Retain 2015 Classification
• Performance Progress Report
  • NeSA Data
    • Status
    • Improvement
    • Growth
    • Non-Proficient Students
    • Participation
  • Graduation Rate
  • Other AQuESTT Indicators of School Quality – data elements currently collected (e.g., absenteeism/attendance, early childhood programs, educator equity/effectiveness)
AQuESTT Classification Cycle 2016-2017 (Transition Year)
• Retain 2015 Classification
• Performance Progress Report
  • State Assessment Data/Raw Classification
  • Other AQuESTT Indicators of School Quality – data elements currently collected (e.g., absenteeism/attendance, early childhood programs, educator equity/effectiveness)
  • Upgraded EBA Results
AQuESTT Classification Cycle For 2017-2018
• New Classification of Performance
  • AQuESTT Academic Indicators
    • Raw Classification
    • ESSA Requirements
  • AQuESTT School Quality Indicators
    • AQuESTT EBA
    • Other AQuESTT Indicators
Thank you. Questions? Contact: www.aquestt.com