Measuring Growth Without a Measuring Tape: What Teachers Need to Consider in Thinking About Teacher Effectiveness
Presented by: Sara Bryant, Measured Progress
Michigan Assessment Consortium
April 15, 2013, 1:00–2:30
Acknowledgment
The work described here has been developed for the Literacy Design Collaborative by Measured
Progress and the Stanford Center for Assessment, Learning, and Equity with funding
by the Bill & Melinda Gates Foundation.
Overview
• Objectives for today’s session
• Teacher Moderated Scoring Systems (TeaMSS)
• Literacy Design Collaborative (LDC)
• Teacher Effectiveness
• “Take 5s”
Session Objectives
• To learn about Measured Progress’ work on TeaMSS
• To learn about a partnership with the LDC
• To think about how this project might inform your own local work on teacher effectiveness models
Take 5s
The Big Question:
As I learn about Measured Progress’ work with LDC and the Gates Foundation, what
connections am I making to my own local work?
Project Components
Measured Progress
Scoring Professional Development
Teacher Moderated Scoring Systems (TeaMSS)
• Teachers scoring student tasks
• Common rubrics aligned to Common Core Standards (CCS)
• Common summative assessments (“tasks”) aligned to rubrics and CCS
• Other PD tools and resources to help teachers learn to score and become calibrated with others
PD Components
1. Grade student work with no rubric/guidance
2. Learn the intricacies of a common rubric
3. Learn how anchor sets are used as a scoring tool
4. Practice scoring rubric elements
5. Reflect on essential Scoring Principles
6. Practice, Practice, Practice
7. Score two final papers to look for calibration
8. Continue practicing with additional student tasks
Learning Objectives: Big Ideas
• Adopting a Mind-Set of “Learning to Score”
• Understanding and Using Rubrics
• Understanding and Using Anchor Sets
• Scoring
• Application to Classroom
Grading vs. Scoring: What’s the Difference?
Grading: reflects the performance of students relative to expectations at a particular point in time.
Scoring: uses fixed standards of quality that do not change over time.
Analytic vs. Holistic Scoring: What’s the Difference?
Holistic Scoring: balances characteristics of writing to arrive at a score appropriate to its overall quality.
Analytic Scoring: considers each criterion of assessment separately, identifying a single score for each criterion.
Take 5
When you think about your own experiences organizing and implementing common scoring of student work, what tools and trainings might enhance those experiences?
What tools and resources might help you learn more about scoring common student work?
Scoring Principles
1. Know the rubric.
2. Trust evidence, not intuition.
3. Match evidence to language in the rubric.
4. Weigh evidence carefully; base judgments on the preponderance of evidence.
5. Know your biases; leave them at the door.
6. Focus on what the student does, not on what the student does not do.
7. Isolate your judgment: one bad element does not equal a bad paper.
8. Resist seduction: one good element does not equal a good paper.
9. Recognize direct copying or plagiarism.
10. Stick to the rubric.
Literacy Design Collaborative
Examples on the following slides and more information about LDC can be found at:
www.literacydesigncollaborative.org
What is LDC?
• A framework for building literacy skills and core content knowledge - aligned to Common Core Standards (CCS)
• “Template Tasks” built on text-based essential questions and a genre of writing (e.g. essay)
• Common rubrics for argumentation, informational, and narrative writing
Template Task
“LDC ‘template tasks’ provide fill-in-the-blank shells that teachers use to create powerful
assignments. For example, Template Task 2 calls for student analysis that builds an argument.”
- www.literacydesigncollaborative.org/tasks
Template Tasks
“[Insert question] After reading _____ (literature or informational texts), write _________ (essay or substitute) that addresses the question and support your position with evidence from the text(s). L2 Be sure to acknowledge competing views.”
- www.literacydesigncollaborative.org/tasks
Rubric Scoring Elements
• Focus
• Controlling Idea
• Reading/Research
• Development
• Organization
• Conventions
• Content Understanding
Scoring Element Example
Controlling Idea
Score points: 1, 1.5, 2, 2.5, 3, 3.5, 4 (half-point scores fall between the descriptors below)
1: Attempts to establish a claim, but lacks a clear purpose. (L2) Makes no mention of counter claims.
2: Establishes a claim. (L2) Makes note of counter claims.
3: Establishes a credible claim. (L2) Develops claim and counter claims fairly.
4: Establishes and maintains a substantive and credible claim or proposal. (L2) Develops claims and counter claims fairly and thoroughly.
Scoring Element Example
Organization
Score points: 1, 1.5, 2, 2.5, 3, 3.5, 4 (half-point scores fall between the descriptors below)
1: Attempts to organize ideas, but lacks control of structure.
2: Uses an appropriate organizational structure for development of reasoning and logic, with minor lapses in structure and/or coherence.
3: Maintains an appropriate organizational structure to address specific requirements of the prompt. Structure reveals the reasoning and logic of the argument.
4: Maintains an organizational structure that intentionally and effectively enhances the presentation of information as required by the specific prompt. Structure enhances development of the reasoning and logic of the argument.
Scoring Element Example
Reading/Research
Score points: 1, 1.5, 2, 2.5, 3, 3.5, 4 (half-point scores fall between the descriptors below)
1: Attempts to reference reading materials to develop response, but lacks connections or relevance to the purpose of the prompt.
2: Presents information from reading materials relevant to the purpose of the prompt with minor lapses in accuracy or completeness.
3: Accurately presents details from reading materials relevant to the purpose of the prompt to develop argument or claim.
4: Accurately and effectively presents important details from reading materials to develop argument or claim.
Scoring Element Example
Content Understanding
Score points: 1, 1.5, 2, 2.5, 3, 3.5, 4 (half-point scores fall between the descriptors below)
1: Attempts to include disciplinary content in argument, but understanding of content is weak; content is irrelevant, inappropriate, or inaccurate.
2: Briefly notes disciplinary content relevant to the prompt; shows basic or uneven understanding of content; minor errors in explanation.
3: Accurately presents disciplinary content relevant to the prompt with sufficient explanations that demonstrate understanding.
4: Integrates relevant and accurate disciplinary content with thorough explanations that demonstrate in-depth understanding.
Take 5
When thinking about your common assessment work in your districts, what LDC processes and
structures might appeal to you and your colleagues?
Putting it All Together
• Local teacher development of modules
• Common modules and tasks used across districts and states
• Student work samples and common rubrics used to develop scoring professional development
• Teachers scoring student tasks across districts and states
Putting it All Together
Professional Dialogue
Calibration
Take 5
How might a common assessment model that includes common local modules, assessments, and teacher scoring be part of a Michigan teacher effectiveness model?
Final Thoughts
• Models such as LDC honor teacher involvement in the process of curriculum, instruction, and assessment.
• Scoring professional development allows teachers to become part of the game.
• Teacher dialogue about student work enhances teacher knowledge.
• Teacher effectiveness can be measured using processes such as these!
For More Information
Sara Bryant
[email protected]
Literacy Design Collaborative
www.literacydesigncollaborative.org
Measured Progress
www.measuredprogress.org