Value Added Model and Evaluations: Keeping It Simple
Polk County Schools – November 2015
The Statute: s. 1012.34(3)(a)1.
Performance of students.—At least one-third of a performance evaluation must be based upon data and indicators of student performance in accordance with subsection (7). This portion of the evaluation must include growth or achievement data of the teacher’s students or, for a school administrator, the students attending the school over the course of at least 3 years. If less than 3 years of data are available, the years for which data are available must be used.
(7)(b) Each school district shall measure student learning growth using the formulas approved by the commissioner under paragraph (a) and the standards for performance levels adopted by the state board under subsection (8) for courses associated with the statewide, standardized assessments administered under s. 1008.22 no later than the school year immediately following the year the formula is approved by the commissioner. For grades and subjects not assessed by statewide, standardized assessments, each school district shall measure student performance using a methodology determined by the district.
What is Value-Added?
Value-added models measure the influence of schools or teachers on the academic growth rates of students.
Value-added compares the change in achievement of a group of students from one year to the next to an expected amount of change based on their prior achievement history and other potential influences.
Comparing Student Performance: Traditional Achievement/Proficiency Models
- All students are expected to score at or above proficiency levels, regardless of individual characteristics
- Schools & teachers evaluated based on percentage of students attaining goals
- Same expectations for all schools & teachers
- Examples: School grades model; Polk 2013-14 evaluation manual for teachers of courses without state-developed VAM data
Florida’s Variables in the Value Added Model
- Up to two years of prior achievement scores
- Number of subject-relevant courses
- Disability status
- English language learner status
- Gifted status
- Mobility
- Attendance
- Difference from modal age (such as retention)
- Classroom characteristics: class size and homogeneity of prior test scores
Comparing Student Performance: Value Added Models
- Students’ individual characteristics, such as number of absences, prior test data, ELL status, etc., are used to create a cohort group of students statewide
- Students are examined in relation to their peers with the same characteristics in the cohort group
- Expectations of performance are differentiated for all students based on these characteristics
- Schools & teachers evaluated based on students’ difference from predicted individual score
- Examples: Florida’s value added model; district-developed models being considered for 2015-16 evaluations by Polk’s Teacher Evaluation Advisory Committee
What Scores Are Used in the Value Added Model?
For 2014-15, Value Added Models were calculated for:
- FSA ELA
- FSA Mathematics
- Grade 9 FSA Algebra 1 EOC
2013-14 and 2012-13 FCAT Reading and Mathematics scores were used as prior-year data.
Prior Achievement Scores
In subjects that are tested annually, such as ELA and math, it is easy to understand how:
- A prior reading test can be predictive of performance on an ELA test (e.g., 5th grade reading performance predicts 6th grade ELA performance)
- A prior math test can be predictive of performance on a math test (e.g., 4th grade math performance predicts 5th grade math performance)
Let’s Look at SIMPLE Prediction
• Basic prediction assumes the student who scored 375 in Year 1 will score similarly to all other students who scored 375.
• If all students who scored 375 in Year 1 score 400 in Year 2, the model would predict that this student will also score 400.
• If the student scores above the prediction, we have value-added.
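The simple prediction above can be sketched in a few lines of Python. The data and function name are hypothetical illustrations of the simple-prediction idea, not Florida's actual VAM formula:

```python
# A minimal sketch of the "simple prediction" idea: the predicted
# Year-2 score is the average Year-2 score of all prior students who
# earned the same Year-1 score. Hypothetical data, for illustration only.

def predict_year2(year1_score, history):
    """Predict a Year-2 score from (year1, year2) pairs of prior students."""
    matches = [y2 for (y1, y2) in history if y1 == year1_score]
    return sum(matches) / len(matches)

# Prior cohort: every student who scored 375 in Year 1 scored 400 in Year 2.
history = [(375, 400), (375, 400), (380, 410)]

predicted = predict_year2(375, history)   # 400.0
actual = 405                              # this student's actual Year-2 score
value_added = actual - predicted          # positive -> value was added
```

Here the student beat the prediction by 5 points, so the difference is positive: value was added.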
Predictions Are Not Always In an Upward Direction
• Some cohorts of students, such as those at the top of the scale, those with a significant number of absences, or those with a significant cognitive impairment, may have a negative prediction.
• This individualization is possible in VAM because of the comparison of students with their peers who have the same characteristics.
Understanding the Value Added Scale
A Zero is GOOD! A zero indicates that the predicted data and the actual data match (like achieving par on a golf course).
- A positive number: student growth exceeded what was projected.
- A negative number: students did not show as much growth as expected.
An Analogy

GOLF
- The player has 18 different holes
- The player earns a score at each
- Some holes are at par, some are under, some are over
- The overall score is an aggregation of the individual scores
- Par = meeting expectations

VALUE ADDED
- The teacher has several class periods
- The teacher has a score for each
- Some students achieve as expected, some underperform, some overperform
- The overall score is an aggregation of the individual scores
- Zero = meeting expectations
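The aggregation step of the analogy can be sketched as follows. Using a simple mean is an assumption for illustration; the slide does not specify the exact aggregation method:

```python
# Sketch of the golf analogy: each class period gets a score relative
# to expectations (0 = "par"), and the overall score aggregates them.
# Averaging is an illustrative assumption, not the stated method.

class_period_scores = [0.0, 1.2, -0.5, 0.3]  # hypothetical per-period scores

overall = sum(class_period_scores) / len(class_period_scores)
# overall > 0: students grew more than projected, on the whole
# overall < 0: students grew less than projected
```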
Let’s Look Closer at a SIMPLE Prediction
Understanding VAM Classification: 2014-15 Scale
- Unsatisfactory: below the district average by 2.5 or more standard errors (99% confidence interval)
- Needs Improvement: below the district average by 1.5 to 2.5 standard errors (89% confidence interval)
- Effective: within 1.5 standard errors of the district average (89% confidence interval)
- Highly Effective: above the district average by 2.5 or more standard errors (99% confidence interval)
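These bands can be sketched as a simple classification function. Note that the slide does not say how scores between 1.5 and 2.5 standard errors above the district average are labeled; treating them as Effective here is an assumption, as is the exact handling of the band boundaries:

```python
def classify_vam(score, district_avg, standard_error):
    """Map a VAM score to the 2014-15 rating bands described above.
    Boundary handling, and the treatment of scores 1.5-2.5 standard
    errors ABOVE the average (not spelled out on the slide), are
    assumptions made for illustration."""
    diff = (score - district_avg) / standard_error
    if diff <= -2.5:
        return "Unsatisfactory"
    if diff <= -1.5:
        return "Needs Improvement"
    if diff >= 2.5:
        return "Highly Effective"
    return "Effective"

classify_vam(0.0, 0.0, 1.0)    # "Effective" (zero = met expectations)
classify_vam(-3.0, 0.0, 1.0)   # "Unsatisfactory"
```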
Finding 2014-15 VAM Data
Navigate to OneDrive > Shared With Me > AAEYourSchoolNumber. The file is 3YrAggVAM0031.
2013-14 Collective Bargaining Agreement: Proficiency Models
Teachers of state-tested subject areas & grade levels without a state-created VAM
Percent of Students Scoring Proficient or Higher on State Test:
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective
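The same four percentage bands recur in most of the proficiency tables that follow (the AP/IB table uses different cutoffs). A minimal sketch of the band-to-rating mapping, with boundary handling inferred from the 31.01%-style cutoffs:

```python
def proficiency_rating(percent_proficient):
    """Map percent of students proficient to a rating per the 2013-14
    proficiency-model bands (0-31, 31.01-41, 41.01-70, 70.01-100).
    Boundary handling is inferred from the stated cutoffs; the AP/IB
    model uses different bands and is not covered here."""
    if percent_proficient <= 31:
        return "Unsatisfactory"
    if percent_proficient <= 41:
        return "Needs Improvement/Developing"
    if percent_proficient <= 70:
        return "Effective"
    return "Highly Effective"

proficiency_rating(31.0)   # "Unsatisfactory"
proficiency_rating(72.5)   # "Highly Effective"
```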
2013-14 Collective Bargaining Agreement: Proficiency Models
Teachers of students in grades 3-10 teaching non-state assessed courses (not AP/IB)
Percent of Students Scoring Proficient or Higher on closest state assessment (usually FSA ELA):
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective
2013-14 Collective Bargaining Agreement: Proficiency Models
Teachers of students in grades 11-12 teaching non-state assessed courses (not AP/IB)
Percent of Students Scoring Concordant or Higher/College-Ready or Higher (whichever is lower) on ANY college-ready assessment (SAT, ACT, PERT):
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective
2013-14 Collective Bargaining Agreement: Proficiency Models
Teachers of students in grades K-2
Percent of Students Scoring Proficient or Higher on ELA End of Year Exam:
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective
2013-14 Collective Bargaining Agreement: Proficiency Models
Teachers of AP or IB-assessed courses
Percent of Students Scoring Proficient or Higher on Related AP or IB Exam:
- 0%-10%: Unsatisfactory
- 10.01%-20%: Needs Improvement/Developing
- 20.01%-50%: Effective
- 50.01%-100%: Highly Effective
2013-14 Collective Bargaining Agreement: Proficiency Models
Teachers of Adult Students Assessed by TABE or CASAS
Percent of Students Who Improve by 1+ Levels on TABE or by 5+ Points on CASAS:
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective