Math and Science Partnership Program

Approaches to State Longitudinal Evaluation

March 21, 2011

San Francisco MSP Regional Meeting

Patty O’Driscoll

Public Works, Inc.

The instructional practices and assessments discussed or shown do not constitute an endorsement by the U.S. Department of Education.

What is a longitudinal evaluation?

• An analysis of data collected:
  – on the same set of participants (e.g., teachers or students)…
  – using a common measure (e.g., survey data, test scores or percentiles)…
  – collected regularly (e.g., annually) to assess the extent to which outcomes are changing.
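
To make the definition concrete, here is a minimal sketch (Python with pandas; the IDs and scores are hypothetical, not from the slides) of what longitudinal data look like in practice: the same students, one common measure, collected once per year, with per-student change as the quantity of interest.

import pandas as pd

# Hypothetical panel: the same students, a common measure (a scaled test score),
# collected once per year.
data = pd.DataFrame({
    "student_id":  [101, 101, 101, 102, 102, 102],
    "year":        [2009, 2010, 2011, 2009, 2010, 2011],
    "scale_score": [310, 325, 341, 298, 300, 318],
})

# Change per student across waves shows the extent to which outcomes are moving.
growth = (data.sort_values("year")
              .groupby("student_id")["scale_score"]
              .agg(first="first", last="last"))
growth["change"] = growth["last"] - growth["first"]
print(growth)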

Examples of Longitudinal Data

• Student achievement data (nationally normed or state test)
  – Collected in spring each year for students of participating teachers
• Teacher survey data (state or locally developed instrument)
  – Collected each fall to evaluate summer professional development
• Classroom observations of teachers
  – Collected in participating/non-participating classrooms to assess implementation of instructional practices during the school year

Key Elements of Longitudinal Studies

• Participants
  – Follow the same students or teachers over time
  – Need consistent definitions of who to include in the study (e.g., ability to track dosage of PD)
• Measures/Instrument Development
  – Use the same measures/instruments at each wave of data collection
• Data Collection Methods
  – Timing of data collection must be the same each year (e.g., achievement data collected each fall, not in the fall one year and the spring the next)
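
These elements lend themselves to simple consistency checks. A hedged sketch (Python; the file and column names are assumptions, not part of the original slides) of verifying that the same instrument and the same participants appear at each wave:

import pandas as pd

# Hypothetical survey files, one per wave, collected with the same instrument each fall.
wave_2010 = pd.read_csv("teacher_survey_fall2010.csv")
wave_2011 = pd.read_csv("teacher_survey_fall2011.csv")

# Same measures at each wave: the item columns should match exactly.
assert list(wave_2010.columns) == list(wave_2011.columns), "instrument changed between waves"

# Same participants over time: how many teachers can be followed across both waves?
followed = set(wave_2010["teacher_id"]) & set(wave_2011["teacher_id"])
print(f"{len(followed)} teachers appear in both waves")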

Multiple Evaluation Lenses

1) Statewide Perspective (across MSP grants)

2) Local Partnership Impact (within MSP grantees)

3) Federal Reporting

Statewide Perspective

• Longitudinal data to track sets of grantees over time (across MSP grants). Useful to track:
  – Different approaches/models of PD
  – Regional differences/variation of impact by partnership
  – Statewide impact of grant/PD on teachers and students
  – Contributions to federal-level reporting

Local Partnership Impact

• Longitudinal data to track single MSP grants (within MSP). Useful to track:
  – Local research questions
  – PD model and impact given local context for implementation
  – Partner effectiveness and improvement in implementation
  – Not for comparison across MSPs

Why Consider Longitudinal Data Collection and Evaluation

• Program process monitoring
  – “Systematic and continual documentation of key aspects of program performance that assess whether the program is operating as intended.” (Rossi, Lipsey, and Freeman)
• Program outcome monitoring
  – “The continual measurement of intended outcomes of the program.”

Sample Questions

• Program process monitoring
  – Is the duration and intensity of PD consistent over time?
  – Who is participating, and are targets being met?
  – Has the PD content and emphasis changed? Is it relevant to student and teacher needs?

• Program outcome monitoring
  – What are the long-term achievement trends of students taught by MSP teachers? Are gains sustained?
  – Are MSP practices becoming embedded in the classroom? Are they supporting achievement?

Who’s Your Audience?

• MSP Program Directors
  – Fidelity to program goals and requirements
  – Duration and intensity of PD across MSPs
  – Teacher/student trends
• Policymakers at the local, state, and federal levels
  – What’s working? Replication, expansion? Funding decisions?
  – Payoff?
• Larger community of practitioners
  – Add to the evidence base of what is working and why

Setting up a Longitudinal Evaluation

• Lots of Work!
  – Takes resources and expertise. Funds are needed to collect and analyze the data.
  – Takes time and commitment. Planning, monitoring data collection and sharing of data, and conducting the analysis are time-consuming.
  – Takes a long-term vision and a common set of goals and outcomes. Requires a multi-year commitment, up-front planning, and tweaks and adaptations over time.
• Embed reporting and data collection in the RFA; require MSPs to reserve grant funds for evaluation

• Establish common goals across grantees

Setting up a Longitudinal Evaluation

• Start planning now…
  – The next MSP State grant competition is an opportunity to set up a data collection and evaluation system
    • Very difficult (if not impossible) to build retroactively
    • Building mid-stream is also difficult
    • Build prospectively: definitions, measures, and reporting requirements established before the program begins
• Ideas…
  – Require/encourage grantees to collect a common set of data elements (a minimal reporting-template sketch follows this list)
    • Attendance, length/duration of PD
    • Teacher knowledge
    • Teacher practice
    • Student achievement
  – Pick the most important to measure across MSPs
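
One way to operationalize a common set of data elements is a shared reporting template that every grantee submits against. The sketch below (Python; the field names are illustrative assumptions, not prescribed by the MSP program) validates a grantee's annual file against such a template.

import pandas as pd

# Hypothetical common reporting template covering the elements listed above.
REQUIRED_COLUMNS = [
    "teacher_id", "district",
    "pd_hours_attended",                                 # attendance, length/duration of PD
    "content_knowledge_pre", "content_knowledge_post",   # teacher knowledge
    "observation_score",                                  # teacher practice
    "student_mean_scale_score",                           # student achievement
]

def validate_submission(path):
    """Check that a grantee's annual data file contains every common element."""
    df = pd.read_csv(path)
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"{path} is missing required elements: {missing}")
    return df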

Setting up a Longitudinal Evaluation

• More ideas for the MSP Application Notice
  – State direction and focus areas of the MSP competition
  – List common measures and consider whether they are self-reported or collected through observation or performance (e.g., classroom observations, student assessments)
  – Consider what you can require vs. ideas for encouraging participation
    • Additional points for participation in the evaluation
    • Larger awards for participation in the evaluation
    • Sheltered competition (set-aside funds for testing of PD models and innovation)

Setting up a Longitudinal Evaluation

• Analyzing and Reporting the Data
  – Establish research questions up front; they need to be part of the data collection strategy, including:
    • Data elements
    • Target population
    • Data collection timing and process
  – Diversity of MSP grantees must be considered: PD model, subject areas, grade levels covered
  – Data collection and analysis must align to the grant competition emphasis
  – Think about trend data across cohorts of MSP grantees to provide a long-term view of PD practices and changes over time, including subgroup analysis (a rough sketch follows this list)
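
As a rough illustration of the cohort trend and subgroup analysis described in the last bullet (Python; the combined file and its columns are assumptions made for the sketch):

import pandas as pd

# Hypothetical combined file: one row per teacher per year, tagged by grant cohort.
df = pd.read_csv("msp_teacher_outcomes.csv")  # assumed: cohort, year, subject_area, observation_score

# Mean outcome by cohort and year gives the long-term trend across cohorts of grantees.
trend = df.pivot_table(index="year", columns="cohort",
                       values="observation_score", aggfunc="mean")

# Subgroup analysis: the same trend broken out by subject area within each cohort.
subgroups = (df.groupby(["cohort", "subject_area", "year"])["observation_score"]
               .mean()
               .unstack("year"))
print(trend)
print(subgroups)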

An Example

Central versus local

• CDE/Public Works, Inc.

• CDE/Federal

• Partnership/Local evaluators

Statewide Evaluation Research Questions

1. How have the Partnerships ensured that all students have access to, are prepared for, and are encouraged to participate and succeed in challenging and advanced mathematics and science courses?

2. How have the Partnerships enhanced the quality of the mathematics and science teacher workforce?

3. What evidence-based outcomes from the Partnerships contribute to our understanding of how students effectively learn mathematics and science?

Data Collection

• Qualitative research includes site visits and phone interviews (Spring/Summer)
• Teacher Database that incorporates data on all teachers in participating districts and attendance in professional development for all participating teachers (ongoing)
• Partner Survey (Winter)
• Participating Teacher Survey (Spring)
• Student rosters of comparison and treatment teachers for statewide study/CST data (collected in Fall)
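
A simplified sketch of how these pieces might be linked for the statewide study (Python; every file and column name here is hypothetical): PD attendance from the teacher database defines treatment teachers, rosters tie students to teachers, and CST scores attach to students.

import pandas as pd

# Hypothetical extracts from the teacher database, student rosters, and CST file.
attendance = pd.read_csv("pd_attendance.csv")    # teacher_id, pd_hours
rosters    = pd.read_csv("student_rosters.csv")  # teacher_id, student_id
cst        = pd.read_csv("cst_scores.csv")       # student_id, year, math_scale_score

# Teachers with any PD hours form the treatment group; the rest are comparison.
attendance["group"] = (attendance["pd_hours"] > 0).map({True: "treatment", False: "comparison"})

# Link students to their teacher's group, then attach their test scores.
linked = (rosters.merge(attendance[["teacher_id", "group"]], on="teacher_id")
                 .merge(cst, on="student_id"))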

Teacher outcomes examined in state evaluation

Teachers: number and characteristics of teachers who participate in professional development; satisfaction with training

• Demographics/Years teaching
• Hours of Training
• Qualifications/Assignments
• Treatment vs. Comparison Student Performance on CSTs
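
Continuing the hedged linkage sketch from the Data Collection slide, a first treatment-vs.-comparison view of CST performance reduces to a grouped mean (the linked file and its columns are assumptions):

import pandas as pd

# Hypothetical student-level file already tagged treatment/comparison (see earlier sketch).
linked = pd.read_csv("linked_students.csv")  # assumed: group, year, math_scale_score

# Mean CST performance by group and year.
print(linked.pivot_table(index="year", columns="group",
                         values="math_scale_score", aggfunc="mean"))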

Student outcomes examined in state evaluation

Students

• Improved student academic achievement on state mathematics and science assessment across the state and at the partnership level

Local Evaluation in California

Two Goals:

• Fulfill/support state evaluation requirements

• Fulfill commitments made in Local Evaluation Plan in response to RFA

Most important for local evaluation: teacher knowledge and instructional strategies, and measuring student knowledge with local assessments. Also, completion of an evaluation report to attach to the fall Federal Report.

Closing Thoughts and Considerations

• Feasibility and cost
• Getting evaluators on board may be part of grant requirements, but buy-in can also be built through the provision of technical assistance
• Evaluation and data systems expertise at the state level is crucial as a liaison to the individuals responsible for the evaluation
• Consider:
  – Payoff from data can be great, but it takes time and commitment
  – Contributions of evidence collected and best practices are valuable to the field
  – Less reinventing, more using lessons learned

Contact Information

• Patty O’Driscoll, [email protected]

• Phone: (707) 933-8219

• www.publicworksinc.org

