Is Your School Improving Outcomes for Students with Disabilities: Guesswork or Science?


Is Your School Improving Outcomes for Students with Disabilities:

Guesswork or Science?

Presented by

The Elementary & Middle Schools Technical Assistance Center

The American Institutes for Research

EMSTAC: Elementary & Middle Schools Technical Assistance Center

www.emstac.org

EMSTAC Model: Insider – Outsider Approach

• TA Liaison (outside the district) provides TA support to the school district

• Linking Agent (inside the district)

Presentation Goals

• Provide an overview of the importance of evaluation efforts for school-based interventions involving students with disabilities

• Identify key steps in planning for and conducting an evaluation

• Show EMSTAC district “success stories,” demonstrating effective interventions

The Problem

• Most schools have multiple programs or interventions occurring; many are targeted exclusively or partly at students with disabilities.

• In many cases, however, the importance of a sound evaluation effort is overlooked or underappreciated, and a sound evaluation itself can be difficult to achieve.

Common Evaluation Challenges

• No “true” baseline data

• Not enough data have been collected

• Comparable data have not been collected

• Not enough time to collect desired data

• Longitudinal gaps (missing years)

• Separating the effects of one intervention from another

What Happens When You Don’t Evaluate?

• Inability to demonstrate that interventions are working

• Difficulty in obtaining sustained funding

• Problems with “scaling up” the intervention to other sites

• Hindered dissemination of best practices to other districts & states

Evaluation is More Important Than Ever

• The “era of accountability”: increased pressure from governing bodies to demonstrate results

• Too much focus on “just making things better”

• Budget difficulties at federal, state, & local levels

• IDEA ‘97: Students with disabilities must be included in accountability efforts

First Step: Getting a View of the Big Picture

• Program planning and evaluation go hand in hand

• Tailor to program purpose & goals

• Identify the purpose of the evaluation

• Consider program reporting requirements

• Think ahead about available resources

• Be sensitive to the local context for the evaluation


The Big Picture

[Diagram: the evaluation is shaped by the program purpose & goals, program reporting requirements, the evaluation purpose, the local context, and available time & resources.]

Second Step: Identifying the Evaluation Questions

• Questions related to program implementation

– Program context

– Program delivery

– Access to the program

• Questions related to program impact

– Impact on student performance

– Impact on teacher capacity

– Impact on moving research to practice

Third Step: Selecting a Design that Will Provide the Data You Need

• Experimental designs

• Quasi-experimental designs

• Simple before & after studies (a comparison with a quasi-experimental design is sketched after this list)

• Time series designs

• Ethnographic research
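As a rough illustration of two of these designs, the sketch below (Python, with entirely invented scores) computes the mean gain a simple before & after study would report for a program group, and then the difference-in-differences a quasi-experimental design would report against a comparison group. It is a minimal sketch, not a recommended analysis.

```python
from statistics import mean

# Hypothetical pre/post test scores (invented numbers, for illustration only)
program_pre = [41, 38, 45, 50, 36, 44]
program_post = [52, 47, 55, 58, 41, 50]
comparison_pre = [42, 40, 46, 49, 37, 43]
comparison_post = [45, 42, 48, 51, 38, 46]

# Simple before & after study: look only at the program group's own change
program_gain = mean(program_post) - mean(program_pre)
print(f"Program group mean gain: {program_gain:.1f}")

# Quasi-experimental design: contrast that change with a comparison group's change
comparison_gain = mean(comparison_post) - mean(comparison_pre)
print(f"Comparison group mean gain: {comparison_gain:.1f}")
print(f"Gain beyond the comparison group: {program_gain - comparison_gain:.1f}")
```

The second number is usually the more defensible one, since it nets out changes that would have happened without the intervention.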

Fourth Step: Identify & Develop Tools for Collecting Data

• Direct observation

• Records & documents

• Physical artifacts

• Information from school administrators, teachers, students, & parents

Special Considerations

• Time

• Reliability & validity (a quick reliability check is sketched after this list)

• Training data collectors

• Permission to collect data & informed consent
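Reliability is often checked during data-collector training by having two observers code the same lesson and computing simple percent agreement. The sketch below is a hypothetical illustration (the behavior codes are invented); more formal indices such as Cohen's kappa may be preferable.

```python
# Hypothetical behavior codes from two trained observers watching the same lesson
observer_a = ["on-task", "on-task", "off-task", "on-task", "off-task", "on-task"]
observer_b = ["on-task", "off-task", "off-task", "on-task", "off-task", "on-task"]

# Percent agreement: a quick (if crude) check of inter-rater reliability
agreements = sum(a == b for a, b in zip(observer_a, observer_b))
percent_agreement = agreements / len(observer_a) * 100
print(f"Inter-rater agreement: {percent_agreement:.0f}%")  # 83% for these invented codes
```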

Fifth Step: Data Analysis

• Storing data

• Quantitative analyses: Descriptive and univariate statistics (a brief sketch of these analyses follows this list)

• Quantitative analyses: Inferential statistics

• Qualitative analysis
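A minimal sketch of the quantitative pieces, using invented pre/post scores for the same students: descriptive statistics first, then a paired t-test as one common inferential choice (the scipy call is an assumption about tooling, not a requirement).

```python
from statistics import mean, median, stdev

from scipy import stats  # third-party package, used here only for the paired t-test

# Hypothetical pre/post scores for the same eight students (invented numbers)
pre = [41, 38, 45, 50, 36, 44, 48, 39]
post = [52, 47, 55, 58, 41, 50, 53, 45]

# Descriptive / univariate statistics
print(f"Pre:  mean={mean(pre):.1f}, median={median(pre)}, sd={stdev(pre):.1f}")
print(f"Post: mean={mean(post):.1f}, median={median(post)}, sd={stdev(post):.1f}")

# Inferential statistics: paired t-test, since each student contributes both scores
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Paired t = {t_stat:.2f}, p = {p_value:.4f}")
```

Qualitative data (e.g., open-ended survey responses) would instead be coded and summarized thematically.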

Sixth Step: Reporting the Findings

• Formal and informal reports

• Targeting the audience

• Keep it simple and straightforward

• Brief reports

• Website

• Public reports

• Support program improvement

EMSTAC “Success Stories”

• Implementation looks different in each district

– various resources, time, staff, & experience with evaluation

• Examples of EMSTAC-supported sites with positive student outcome results

– Allegany County, MD

– Detroit, MI

– East Grand, CO

– Los Angeles, CA

Allegany County, Maryland

• Planning

– Evaluation planned after program implementation

• Evaluation questions

– Will the program and linked professional development help general education teachers make instruction meaningful for all students?

– Are we going to see improvement in test scores?

• Design

– Simple before-and-after design (norm-referenced)

Allegany County, Maryland

• Data collection

– Test results (e.g., CTBS, curriculum-based measures)

– Teacher surveys

• Data analysis

– Analyze final scores

– Challenges with student attrition (a simple way to handle this is sketched below)

• Reporting

– Published data in department newsletter

– Plan scale up efforts
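The attrition challenge typically means some students have a pre-test score but no post-test score (or the reverse). One simple, common approach is to restrict the analysis to matched pairs; the sketch below is hypothetical (student IDs and scores are invented).

```python
# Hypothetical CTBS-style scores keyed by student ID (invented data)
pre_scores = {"s01": 43, "s02": 47, "s03": 41, "s04": 45, "s05": 40}
post_scores = {"s01": 58, "s02": 60, "s04": 52, "s06": 55}  # s03 and s05 left the district

# Keep only students with both a pre and a post score (matched pairs)
matched_ids = sorted(pre_scores.keys() & post_scores.keys())
gains = [post_scores[s] - pre_scores[s] for s in matched_ids]

print(f"Matched students: {len(matched_ids)} of {len(pre_scores)} pre-tested")
print(f"Mean gain (matched pairs only): {sum(gains) / len(gains):.1f}")
```

Reporting how many students were dropped, and whether they differed from those retained, keeps the analysis honest.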

Allegany County, Maryland

Early Literacy Program

• EMSTAC assisted with needs assessment in 1998

• Implemented portions of the Early Literacy Program (MSU)

• Training occurred in summer 1999

• Data collected are based on CTBS pre/post scores (1998/2000):

– Language: 43 to 59.5 (28%)

– Lang. Mechanics: 43 to 54.0 (20%)

– Lang. Composition: 47 to 57.0 (18%)

[Bar chart: CTBS Language, Lang. Mechanics, and Lang. Composition scores, 1998 vs. 2000]

Detroit, Michigan: Project ACHIEVE

• Internal needs assessment, program selection, & training process took a year

• Implementation of the program began in one middle school in 1999

• By 2000-2001, the school saw a decrease in code-of-conduct violations:

• Class 1 referrals: 1,914 to 931 (51% decrease)

• Class 2 referrals: 394 to 227 (42% decrease)

• Class 3 referrals: 18 to 8 (56% decrease)
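The percentages above are straightforward percent reductions from the baseline year; a quick check of the arithmetic using the referral counts reported on the slide:

```python
# Referral counts reported on the slide (baseline vs. 2000-2001)
referrals = {"Class 1": (1914, 931), "Class 2": (394, 227), "Class 3": (18, 8)}

for label, (before, after) in referrals.items():
    reduction = (before - after) / before * 100
    print(f"{label}: {before} -> {after} ({reduction:.0f}% decrease)")
# Prints roughly 51%, 42%, and 56%, matching the slide
```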

[Bar chart: Class 1, Class 2, and Class 3 referrals, 1999 vs. 2001]

East Grand, Colorado: Literacy Across the Curriculum

• District conducted needs assessment

• EMSTAC assisted with program selection via the web

• Implementation began in 1999 in one school

• Results are based on pre/post test results (1999, 2001) from the Colorado State Assessment Program:

– Reading: 70% to 75% (5-point gain)

– Math: 50% to 66% (16-point gain)

– Writing: unchanged
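The Reading and Math gains above appear to be percentage-point changes (70% to 75% and 50% to 66%), presumably in the share of students meeting the state standard, which is a different quantity from the percent decreases reported for Detroit. A quick check of that arithmetic:

```python
# Rates reported on the slide (pre, post), in percent
results = {"Reading": (70, 75), "Math": (50, 66)}

for subject, (pre, post) in results.items():
    point_gain = post - pre                   # percentage-point change
    relative_gain = (post - pre) / pre * 100  # relative (percent) change, shown for contrast
    print(f"{subject}: {pre}% -> {post}%  (+{point_gain} points, {relative_gain:.0f}% relative gain)")
```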

[Bar chart: Reading and Math results, pre vs. post]

Los Angeles, California

Peer-Assisted Learning Strategies (PALS)

• District conducted needs assessment

• PALS implementation began in 2000

• Measures included CBM tools and probes (fall and spring)

• Probes indicated that during 2000, all grade levels using the program saw increases in WCPM (words read correctly per minute)

• Thus far, second-grade classrooms have had the highest gains, between 40% and 97%
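WCPM is conventionally computed from a timed oral reading probe as words read correctly divided by minutes of reading. The helper below illustrates that convention with invented numbers; the function name and figures are hypothetical, not part of the PALS materials.

```python
def words_correct_per_minute(words_read: int, errors: int, seconds: float) -> float:
    """Words read correctly per minute from a timed oral reading probe."""
    return (words_read - errors) / (seconds / 60)

# Hypothetical fall and spring probes for one student (invented numbers)
fall = words_correct_per_minute(words_read=48, errors=6, seconds=60)    # 42 WCPM
spring = words_correct_per_minute(words_read=80, errors=4, seconds=60)  # 76 WCPM
print(f"Fall: {fall:.0f} WCPM, Spring: {spring:.0f} WCPM, gain: {(spring - fall) / fall:.0%}")
```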

Conclusion

• Evaluation is increasingly important.

• A set of key principles guides sound evaluation efforts.

• You don’t have to be an expert to organize & conduct a sound evaluation.

• There are many useful resources for practitioners undertaking evaluation efforts (e.g., the EMSTAC Evaluation Guide).

Presenters

• Jim Hamilton, EMSTAC Director

• Don Dailey

• Bradley Carl

• Suzanne Ritter

• Contact us via www.emstac.org