Understanding and Promoting Data Literacy in Teacher Preparation Programs
Webinar for Teacher Educators in Nevada
October 17, 2014
Ellen B. Mandinach, WestEd

Today’s Presenter
Ellen Mandinach, Senior Research Scientist, Evaluation Research Program, REL West at WestEd

Topics for Today
• The landscape of education — Why do educators need to be data literate?
• The construct — What is data literacy for teachers?
• What do we know about how schools of education are helping to build teachers’ capacity to use data?
• The complicated system — How do we improve the human capacity to use data?
• Why is it so challenging, and what are some next steps?
• Your programs’ roles

Why is Data Literacy Important?
• Emphasis by policymakers
• Philosophical shift to continuous improvement
• Evidence, not gut feelings
• No longer a passing fad
• Helping teachers to help all children learn

Why Now?
• Emerging technological solutions, from complex data systems to data dashboards
• Proliferation of diverse data sources
• The building of human capacity has not kept up with the development of the technological infrastructure
• Even if educators know they should become data-informed, there are still many challenges
• Accountability and evaluation of educators and schools of education

Data Use is NOT New

Let’s Take the 30,000-Foot View

What is Data Literacy for Teaching?
The ability to transform information into actionable instructional knowledge and practices by collecting, analyzing, and interpreting all types of data (assessment, school climate, behavioral, snapshot, longitudinal, moment-to-moment, etc.) to help determine instructional steps. It combines an understanding of data with standards, disciplinary knowledge and practices, curricular knowledge, pedagogical content knowledge, and an understanding of how children learn.
Gummer & Mandinach, in press; Mandinach, Friedman, & Gummer, in press

A Modified Definition from the Data Quality Campaign (2014)
Data-literate educators continuously, effectively, and ethically access, interpret, act on, and communicate multiple types of data from state, local, classroom, and other sources to improve outcomes for students in a manner appropriate to educators’ professional roles and responsibilities.

Conceptual Framework

The Domain of Data Use for Teaching

Take Home Messages from Prior Work
• Lack of clarity in the terminology — data literacy means different things to different people
• The developmental continuum for educators’ acquisition of data literacy skills and knowledge is unknown
• The process of elevating the issue’s importance so that schools of education help build human capacity is complex
  • How best to integrate data literacy into higher education — stand-alone or cross-program?
  • Courses or integrated suites of courses?
• Professional development is not enough
• Recognition of the systemic nature of the issue

And Another Issue: An Important Distinction
• Data literacy is not the same thing as assessment literacy
• They are two different constructs
• For example, teaching is so much more than simply assessment; non-assessment data are also essential
• Experts see assessment literacy as part of data literacy, because data literacy refers to the use of many sources of data, not just assessment data

The Dell Project: Objective and Methods
Objective: To understand how many and what kinds of courses and experiences are being offered in schools of education that help prepare educators to use data
Methods:
• Survey
• Syllabus Review
• Licensure Review

The Survey
Objective: Examine what schools of education are doing to enhance teachers’ data literacy (figures in brackets reflect the Nevada respondents)
• Response rate: 24.9% (208 out of 836) [NV: 4 of 6]
• Respondents were from 47 states, DC, and the Virgin Islands
• Respondents enroll between 51,840 and 96,543 pre-service teacher candidates
• 67.3% [75%] are public colleges or universities (this reflects the second sample)
• 83.7% [100%] offer teaching candidates bachelor’s degrees; 76.4% [75%] offer master’s degrees

Survey Results
• 91.1% [75%] claim that a focus on use of data is a sustained component of their teacher prep program in all or multiple courses
• 45.7% [0%] plan on developing and implementing at least one new course focused on use of data
• 24.1% [0%] claim to have one stand-alone use-of-data course; 38.2% claim to have multiple stand-alone courses
• 95.6% [100%] claim to have use of data integrated within existing courses
Note: “Don’t know” responses were not calculated into percentages for any survey results slides

Results from the Syllabus Review
Of the syllabi with sufficient information for analysis:
• 76% focused on design, implementation, and analysis of assessments that would be used at the individual student or classroom level
• Secondary focus — formative assessments, state assessments, or assessment policy issues

Survey and Syllabus Interpretations and Caveats
• Results are not generalizable, but still informative
• Many schools did not respond
  • Possible that some schools did not participate because they do not have courses on data use
  • Confusion with NCTQ’s grading of teacher prep programs
  • Concern that the survey was intended as a “gotcha”

Survey and Syllabus Interpretations and Caveats (continued)
• A limited number of syllabi were submitted, and even fewer were examined
• Clear that most schools believe they are teaching data use, particularly integrated into other courses. Is this really the case?
• Combining the survey with the syllabus review shows that what schools report they do may not be what they actually do

Results from the Licensure Review: General Characteristics
• Amount of data-related skills (range across states)
• Does it address data (12 states — no)
• Does it address assessment (2 states without)
• Does it list specific skills (7 states without)
• How specific are the statements (range across states)
• InTASC (6 states)
• Developmental continuum (7 states)
• Specific data standard (8 states)
• Danielson (1 state)
• Data literacy (22 states) vs. assessment literacy (37 states)

Results from the Licensure Review for Nevada: General Characteristics
• Amount of data-related skills — rated highly
• Does it address data — yes
• Does it address assessment — yes
• Does it list specific skills — yes
• How specific are the statements — rated highly
• InTASC — Nevada is one of the states
• Developmental continuum — yes
• Data literacy (yes) vs. assessment literacy (yes)
• Noted as one of the Data Quality Campaign’s six leading states

Results from the Licensure Review: Skills (59)
• Most frequent skills — assess, collaborate, plan, evaluate, monitor, communicate, use multiple sources, involve stakeholders, make decisions, document/review, provide feedback, self-assess, adjust, analyze, use data, collect/gather, interpret
• Moderate skills — identify, adapt, use technology, inquiry, reflect, question, differentiate, access, implement, design, ethics, use research, disaggregate
• Least frequent skills — individualize, use statistics, act, summarize, predict/hypothesize, synthesize, solve problems, develop assessments, integrate, review, process, infer

Results from the Licensure Review—Nevada Skills
• Assess, modify, collaborate, involve stakeholders, make decisions, identify/select, adapt, plan, evaluate, use technology, monitor, displays/representations, inquiry, reflect, question, challenge assumptions, communicate
• Manage, verify, document/review, examine, feedback, differentiate, adjust, access/retrieve/find/work, analyze, interpret, apply, multiple sources, implement, use, generate, infer, process
• Review, develop assessments, solve problems, design/guide, synthesize, ethics, research, statistics, collect/gather, draw conclusions, individualize, implications/impact, goal setting, data quality, differentiate, disaggregate/group differences

Skills Not Included by Nevada but Noted by Other States
• Self-assessment, organize, process, integrate, diagnose, summarize, predict/prioritize/hypothesize, act/enact, manipulate, patterns/trends

An Example from Another State: Arizona Department of Education
• Also an InTASC state
• Defined:
  • A data-literate educator must possess the knowledge and skills to access, interpret, act on and communicate findings that support student success
• Used the following key terms: Continuously, Effectively, Ethically, Access, Interpret, Act, and Communicate

An Example from Another State: The Arizona Department of Education (continued)
• Identified key standards to draft a reference chart/guide for data literacy and instructional impact
• Created a rubric with No evidence, Approaches, Meets, and Exceeds as categories
• The rubric outlines the data components, what the teacher may ask, the evidence of performance, and the rating

Desired Outcomes from Data Use
The standards cite: performance, instruction, assessment, student learning, growth, student needs, guidance, and readiness.

Licensure Review Interpretations and Caveats
Just because it is in the documentation does not mean it is happening
• Policy versus practice for an InTASC state
• Difficult to find documents
  • Could not locate documents for one state
  • Documents were quite old in a few states
  • SEA staff were not always aware of such documentation

Note that Nevada has an upcoming task force to review and revise standards for teacher preparation programs.

Looming Questions and Issues: What We Still Don’t Know
• Lack of clarity, common understanding, and consistent use of terminology
• The developmental continuum for educators’ acquisition of data literacy skills and knowledge is unknown
• The process of elevating the issue’s importance so that schools of education help build human capacity is complex
• Recognition of the systemic nature of the issue
• Courses or integrated suites of courses?

How Do We Do This? Recommendations
• Use consistent terminology, the research-based definitions, and the identified skills and knowledge
• Start to introduce data skills to teacher candidates early and throughout their courses and practices
• Facilitate the change before the accountability hammer comes down on the institution and its graduates
• Use models of good practice from which to learn
• Courses or integrated suites of courses? Both, but definitely integrated if you can’t establish a stand-alone course

The Systemic Nature of the Issue: Who Are the Key Players?
• State education agencies
• State licensure agencies
• Professional organizations
• Schools of education
• Testing organizations
• Local education agencies
• Others

What’s the difference between elephants mating and establishing the importance of data literacy?

Contact Information
Ellen Mandinach, WestEd
[email protected]
202-674-9300