Page 1: New York State Academy for Teaching and Learning
fileserver.daemen.edu/~tlqp/PatriciaLoncto_LE/my_docu…

New York State Academy for Teaching and Learning
Learning Experience

Please complete the informational tables below and return this form attached to the Learning Experience.

Contact Information

Residing Address (including street, city, state, and zip-code): Patricia Loncto, 84 Lake Street, Youngstown, NY 14174
Home Phone: 716-745-3211
Cell Phone: 716-946-8480
Email Address: [email protected]

School Information
Grade Level Instructed: Undergraduate and graduate level programs
Content Area addressed within LE: Rubrics
School District in which the LE was implemented: Daemen College
Specific School within District: Department of Education
School Address (including street, city, state, and zip-code): 4380 Main Street, Amherst, NY 14226-3592; 4061 Creek Road, Youngstown, NY 14171

Title of Learning Experience: Constructing Rubrics

Interstate New Teacher Assessment and Support Consortium (INTASC) Standard assessed:
Standard 6 – Assessment: The candidate understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making.
6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Also aligns to Charlotte Danielson Rubrics for Teacher Evaluation – Domains

Bold and Underline the Standard/Performance Indicator Instructional Level being assessed.

Undergraduate Daemen / Graduate Daemen / In-service

Peer Review Dates:
July 8, 2013 – PICCS Slice by Slice Protocol (for student work only)
July 17, 2014 – TLQP Peer Review Protocol, Daemen College

Peer Review Focus Question(s):
1) Implementation of Learning Experience: How effective is this Learning Experience in guiding teachers to create valid, reliable, and student-friendly rubrics that will improve student achievement?
2) Relevant Research: How do educational theories justify the time it takes to create valid and reliable rubrics?
3) Turnkey Capabilities to Various Audiences (i.e., instructional levels and demographics): How well could a TLQP member use this Learning Experience as a professional development workshop or independent study?

5/5/2023 Pat Loncto Rubric LE final 1


TABLE OF CONTENTS

Learning Context – background information concerning the learning targets (page 3)
Purpose/Rationale for Learning Experience; Enduring Understandings, Essential Questions, Guiding Questions; Guiding Questions for Differentiated Instruction; Reflection Questions; Review of Literature Overview; Congruency Table; Class Background; Overview of What Students Need to Know; Key Contextual Vocabulary; Assessment Plan: Diagnostic, Formative and Summative Rubrics

Student Work – data related to the writer’s implementation of the Learning Experience (page 14)
Data Set 1: Diagnostic vs. Summative Results by Level; Data Set 2: Diagnostic vs. Summative Results per Student; Statistical Analyses of Rubric Diagnostic vs. Summative Results; Application of Rubric Knowledge

Procedure – step-by-step implementation information coordinating with the PowerPoint (page 17)

Resources and Materials Required for Instruction (page 19)

Differentiation Instructional Table; Re-engagement Strategy (page 20)

Time Required for Implementation and Assessment (page 21)

Reflection by the writer of the Learning Experience (page 22)

Appendix 1: Floor Plan, Rules and Procedures (page 23)

Appendix 2: EXERCISE Packet, Teacher Exemplar (page 24)

Appendix 3: Student Work Samples: Developing, Proficient, Distinguished (page 60)

Appendix 4: Individual Student Class Data related to the writer’s implementation (page 76)

Appendix 5: Slice by Slice Protocol – used for peer review of student work; Charrette Protocol – used to collaborate on the editing of a rubric (page 78)

Appendix 6: Learning Experience Planning Documents; Alignment to INTASC Standards, Danielson Framework for Teaching Rubrics, NYSUT Teacher’s Practice Rubric (page 82)

Appendix 7: Review of Literature – Theories and research that ground the rationale for creating and using rubrics (page 89)

Appendix 8: Peer Review Protocol for Professional Development Learning Experience – used for Peer Review of the Learning Experience (page 96)


LEARNING CONTEXT

Purpose/Rationale for Learning Experience: This Learning Experience improves student achievement by enhancing the candidates’ ability to construct valid and reliable assessment tools in the form of rubrics that:
- document and evaluate student learning outcomes based on New York State Learning Standards/Performance Indicators (PI)/Common Core Learning Standards (CCLS)
- inform teacher choices about student progress, and adjust instruction accordingly
- ensure that students self-assess their learning strengths and needs, and set personal learning goals

Enduring Understanding(s):
- Rubrics are instructional tools.
- Rubrics are assessment tools.
- Rubrics link assessment to learning objectives and outcomes (NYS Standards/PI/CCLS).

Essential Question(s):
- How do students show they have achieved the learning outcomes? (teacher-friendly)
- How well did they get it? (student-friendly)

Guiding Questions:
- What are the benefits of creating and using a rubric?
- When is it appropriate to use a rubric?
- What makes a rubric valid? (stress this question and continually analyze rubrics for this quality)
- What makes a rubric reliable? (stress this question and continually analyze rubrics for this quality)
- How does one determine the validity and reliability of rubrics found online?
- What influences a teacher’s decision to use a rubric for assessment of learning?
- How do rubrics benefit students?
- How do rubrics influence the effectiveness of instruction?
- How are rubrics related to objectives?
- How do rubrics indicate the achievement of objectives/learning outcomes?
- What is an effective and efficient process for developing rubrics?
- What is a weighted rubric?
- What is the value of using a weighted rubric to the teacher, and to the student?
- How is a weighted rubric constructed?
- What are some of the challenges in constructing and using a rubric?

Guiding Questions for Differentiated Instruction:
- How can rubric scores be translated into grades?
- How can students be involved in the process of constructing rubrics?
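One common answer to the first question above is to map rubric levels onto grade bands and average across dimensions. The sketch below is a hypothetical illustration only; the cut points are invented and are not taken from this Learning Experience.

```python
# Hypothetical mapping from 1-4 rubric levels to percentage-grade band
# midpoints; these cut points are invented for illustration.
LEVEL_TO_GRADE = {4: 95, 3: 85, 2: 75, 1: 65}

def rubric_to_grade(levels):
    """Average the mapped grades across a task's rubric dimensions."""
    grades = [LEVEL_TO_GRADE[lvl] for lvl in levels]
    return sum(grades) / len(grades)

print(rubric_to_grade([4, 3, 3]))  # (95 + 85 + 85) / 3
```

Any such mapping is a local policy decision; the point of the sketch is only that the translation must be decided explicitly rather than left implicit.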

Reflection Question(s):
Diagnostic: What is the one question I have about rubrics that must be answered before I can move on to constructing rubrics properly?
Formative:
- How do I discover when a rubric is needed vs. when a different type of assessment tool could be sufficient (i.e., checklist, anecdotal record, multiple choice test with answer key, etc.)?
- How do my objectives and outcomes connect to the valid and reliable rubric I created?
Summative:
- What misconceptions about rubrics/assessment did I uncover?
- What is the one question I have about rubrics that must be answered before I can move on to constructing rubrics properly?


Review of Literature Overview (see Appendix 7 for details)

Basic premise for developing this Learning Experience: A scoring rubric is a tool used as a formative and summative assessment strategy to judge and evaluate the continuous intellectual, social, and physical development of the learner. (Brookhart 2013)

Essential Questions for Literature Review:
o Implementation of Learning Experience: How effective is this Learning Experience in guiding teachers to create valid, reliable, and student-friendly rubrics that will improve student achievement?
o Relevant Research: How do educational theories/researchers justify the time it takes to create valid and reliable rubrics?
o Turnkey Capabilities to Various Audiences (i.e., instructional levels and demographics): How well could a TLQP member use this Learning Experience as a professional development workshop or independent study?

The research theories behind the “why” of rubrics include (see Appendix 7 for specific references):

o Cognitive Learning Theory
o Constructivist Approach
o Social Development Theory

References for valid and reliable rubrics:
Brookhart, S. M. (2013). How to create and use rubrics. Alexandria, VA: ASCD.
Goodwin, B., & Hubbell, E. (2013). The 12 touchstones of good teaching: A checklist for staying focused every day. Alexandria, VA: ASCD.
Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130-144. Retrieved from www.sciencedirect.com


Congruency Table
Instructional Level: Undergraduate level (Daemen #327) and Graduate level (Daemen #518)

Interstate New Teacher Assessment and Support Consortium (INTASC) Standards:

Standard 6 – Assessment: The candidate understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making.

INTASC Standard / Key Indicators

Instructional Task: State observable candidate behaviors directly linked to each Standard. Specify mode of instruction (individual, small/large group).

Learning Objectives: Replicate objectives referenced in the Learning Context. Learning Objectives are measurable and observable.

Candidate’s Work: Name self-made teaching aids and commercial worksheets, performances, or products providing concrete evidence of learning specified in each Standard.

Assessment Tool: Name tool(s) used to assess candidates’ work, such as a checklist/rubric. Link the rubric’s dimensions/attributes to each Standard.

Standard 6 – Assessment of Learning
6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Summative Task: Candidate individually constructs a rubric for a summative assessment aligned to the NYS Learning Standards and Performance Indicators to support, verify, and document learning.

The candidate individually reflects in writing on learning by re-answering the diagnostic questions after the lesson and comparing the new responses to the diagnostic responses; the candidate analyzes all responses and verifies personal growth and needs.

Summative Objective: To construct a rubric that leads to the gathering of purposeful data documenting what students know and are able to do, based on what can be heard, observed, or read concerning the learning outcomes, scoring at least level 3 or above on all rubric dimensions.

To express personal growth and needs as support for candidate’s further study and investigation.

Summative Work: Candidate-constructed rubric aligned to the task and NYS Learning Standards/PI/CCLS (TASKS 5, 6).

Completed What do you know now? summative assessment and reflection worksheet.

Summative Tool: A Rubric About Rubrics, with dimensions:
- Instructional Value
- Dimensions and Language of the Standards
- Level Descriptors

Instructor anecdotal reflection notes ensuring the instructor’s continuous growth.


Congruency Table continued

INTASC Standard / Key Indicators

Instructional Task: State observable candidate behaviors directly linked to each Standard. Specify mode of instruction (individual, small/large group).

Learning Objectives: Replicate objectives referenced in the Learning Context. Learning Objectives are measurable and observable.

Candidate’s Work: Name self-made teaching aids and commercial worksheets, performances, or products providing concrete evidence of learning specified in each Standard.

Assessment Tool: Name tool(s) used to assess candidates’ work, such as a checklist/rubric. Link the rubric’s dimensions/attributes to each Standard.

Standard 6 – Assessment
6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Formative Task: With instructor support, the candidate completes a variety of exercises concerning rubric development to verify progress.

The candidate continually self-assesses the quality of the constructed rubric (individually, with peers, and with the instructor) during creation and prior to submitting the rubric to the instructor, to document learning progress.

Formative Objective: To scaffold knowledge and skills needed to construct valid and reliable rubrics.

To determine the quality of the construction of the rubric using the Rubric Checklist.

To determine the quality of the rubric compared to levels 3 and 4 on A Rubric About Rubrics.

To engage learners in their own growth and to guide teacher’s and learner’s decision making concerning valid and reliable rubric creation.

Formative Work: Completed Constructing Rubrics Teaching to the Standards Class EXERCISE Packet (TASKS 2-4).

Completed Rubric Checklist.

A Rubric About Rubrics score for candidate-constructed rubric aligned to NYS Standards/PI/CCLS.

Formative Tools: Instructor anecdotal notes to inform teaching methods and assignments for candidates.

Rubric Checklist.

A Rubric About Rubrics, with dimensions:
- Instructional Value
- Dimensions and Language of the Standards
- Level Descriptors


Congruency Table continued

INTASC Standard / Key Indicators

Instructional Task: State observable candidate behaviors directly linked to each Standard. Specify mode of instruction (individual, small/large group).

Learning Objectives: Replicate objectives referenced in the Learning Context. Learning Objectives are measurable and observable.

Candidate’s Work: Name self-made teaching aids and commercial worksheets, performances, or products providing concrete evidence of learning specified in each Standard.

Assessment Tool: Name tool(s) used to assess candidates’ work, such as a checklist/rubric. Link the rubric’s dimensions/attributes to each Standard.

Standard 6 – Assessment
6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Diagnostic Task: The candidate individually self-assesses, in writing, prior knowledge about constructing rubrics in order to document that prior knowledge.

Diagnostic Objective: To complete a diagnostic tool that provides an indication of the candidate’s prior knowledge of the learning outcomes in order to guide the teacher’s and learner’s decision making, instruction and support, or enrichment concerning valid and reliable rubric creation.

Diagnostic Work: Completed What do you know? diagnostic assessment worksheet (TASK 1).

Diagnostic Tools: What do you know? worksheet, not officially scored.

Instructor anecdotal notes to inform teaching methods and assignments for candidates.


Class Background
- Senior students enrolled in pre-service education programs at Daemen College during student teaching.
- Teacher Leadership Quality Partnership members enrolled for graduate credit at Daemen College.

Overview of what students need to know / be able to do / understand in order to succeed

Prior to the Learning Experience:
- Knowledge of NYS Learning Standards/PI/CCLS
- Knowledge of basic diagnostic, formative and summative assessment strategies
- Basic skill in lesson plan development

During the implementation of the LE:
- Skill to construct a valid and reliable rubric
- Skill to reflect on and modify the candidate’s practice based on student, peer, and self-evaluative data
- Understand that rubrics are instructional tools as well as assessment tools
- Understand when to use a rubric and when not to use a rubric
- Understand key vocabulary enough to recognize the meaning when applied to practice

After the implementation of the LE:
- Use the candidate-constructed rubric to score student work, with three benchmark papers for the distinguished, proficient and developing levels of competency
- Collect a classroom set of student work for peer review at each level of competency

Key Contextual Vocabulary for understanding content within this Learning Experience: (Note: depending on specific background of candidate population and purpose of the implementation of this Learning Experience, these vocabulary terms may need to be explicitly taught to the candidates within the learning opportunities.)

Anecdotal Notes are recorded specific observations of individual student behaviors, knowledge, skills and attitudes as they relate to the outcomes in the programs of study. Such notes provide cumulative information on student learning and direction for further instruction. Anecdotal notes are brief, objective and focused on specific outcomes. Notes taken during or immediately following an activity are generally the most accurate. http://www.learnalberta.ca/content/mewa/html/assessment/anecdotalnotes.html

Assessment Tool(s) are items that provide concrete evidence of learning. They can be recall-based, performance-based, product-based, and process-based (reflections). Assessment tools must be aligned to the stated Standards/PI/CCLS, that is, they must show what each student knows and can do in relation to the performance indicators that the lesson sets out to accomplish. They result from engagement in the Learning Opportunities related to the performance indicators.

Candidate is an undergraduate or graduate student in the Department of Education at Daemen College.

Diagnostic Assessment is a method or tool used to ascertain, prior to instruction, each student’s strengths, needs, knowledge, understanding, and skills. Diagnostic assessments identify what the student knows, understands, or can do before the learning, thereby helping the teacher plan instruction. When compared to the summative assessment for the specified learning, the diagnostic assessments also become evidence of growth as a result of learning.


Formal assessments have data which support the conclusions made from the test. We usually refer to these types of tests as standardized measures. These tests have been tried before on students and have statistics which support conclusions such as “the student is reading below average for his age.” The data are mathematically computed and summarized. Scores such as percentiles, stanines, or standard scores are most commonly given from this type of assessment. http://www.scholastic.com/teachers/article/formal-versus-informal-assessments

Formal assessments, also known as standardized assessments, measure overall student achievement as compared to other students to identify comparable strengths and weaknesses with peers. Formal assessments are data-driven and have been conducted previously with other students to determine statistics that support the test conclusions. They are administered under regulated test-taking conditions, and the data are mathematically computed and reviewed, offering scores as percentiles, stanines, or rankings. A primary example of a formal assessment is the Scholastic Aptitude Test, or SAT. http://www.ehow.com/info_8602405_difference-between-formal-summative-assessment.html
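The percentile scores mentioned above have a simple arithmetic core: a percentile rank is the percent of a norm group scoring below a given raw score. A minimal sketch, with an invented norm group:

```python
# Hypothetical illustration of a percentile rank: the percent of a norm
# group scoring strictly below a given raw score. Data are invented.
def percentile_rank(score, norm_group):
    """Percent of the norm group scoring strictly below the given score."""
    below = sum(1 for s in norm_group if s < score)
    return 100 * below / len(norm_group)

norm_group = [52, 60, 61, 67, 70, 75, 78, 81, 88, 93]
print(percentile_rank(75, norm_group))  # 5 of 10 scores fall below 75
```

Published tests use more careful conventions (e.g., crediting half of tied scores), so treat this as the idea rather than any test publisher's exact formula.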

Formative Assessment is a purposefully planned collection of evidence of student progress, gathered through embedded learning strategies designed to meet the Learning Standards/Performance Indicators, which teachers and/or students use to ascertain what students currently know, understand, and are able to do during instruction. The purpose of formative assessment is to measure student progress in increments of achievement in order to adjust instruction, or for students to adjust their current learning tactics.

Grade means to assign a value that is evaluative and summative; Score, by contrast, means to measure progress formatively for the purpose of growth.

Informal assessments are not data driven but rather content and performance driven. For example, running records are informal assessments because they indicate how well a student is reading a specific book. Scores such as 10 correct out of 15, percent of words read correctly, and most rubric scores are given from this type of assessment. http://www.scholastic.com/teachers/article/formal-versus-informal-assessments

Informal assessments, also known as criterion-referenced measures or performance-based measures, are driven by content and performance, rather than data. A homework assignment score of 14 out of 15 and a group project rubric score of "Excellent" are examples of informal assessments. Informal assessments are used to inform instruction and dictate further practice needed, rather than to measure learning compared to peers, which formal assessments provide. http://www.ehow.com/info_8602405_difference-between-formal-summative-assessment.html

Instructional Tool can be defined as any document, material, or item used to facilitate learning. Specifically, a rubric is an instructional tool that can inform teachers of the measures of student learning success and impact teacher and student decisions for modifying/adjusting instructional strategies.

Interstate New Teacher Assessment and Support Consortium (INTASC) Standards, Danielson Framework for Teaching and NYSUT Teacher’s Practice Rubrics outline what all teachers should know and be able to do to be effective in today's learning contexts. The Interstate Teacher Assessment and Support Consortium (INTASC) is a consortium of state education agencies and national educational organizations dedicated to the reform of the preparation, licensing, and on-going professional development of teachers. Danielson and NYSUT Rubrics are related to teacher growth and evaluation.


Learning Outcome is the specification of the knowledge, skills, understandings, and abilities that students have attained as a result of their involvement in a particular set of educational experiences.

New York State Standards/Performance Indicators (NYS Standards/PI) are specific criteria/objectives for identifying what students are expected to know, understand, and be able to do in disciplines other than English Language Arts (Literacy) and Mathematics.

New York State Common Core Learning Standards (CCLS) are specific criteria/objectives for identifying what students are expected to know, understand, and be able to do in Literacy and Mathematics.

Rubric is a scale that describes levels of performance for each of the important aspects of learning in terms of quality not quantity. Rubrics are a formative assessment tool when scores are not given but rather student and teacher use the levels of the rubric criteria for conversation about the strengths and needs of the student at a particular time. Rubrics are a summative assessment tool when scores are given to evaluate student achievement of learning outcomes.

Reliable means the rubric score is consistent across several different scorers.
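Consistency across scorers can be checked quantitatively. The sketch below is a hypothetical illustration using the simplest such statistic, percent exact agreement between two scorers applying the same rubric to the same papers; the ratings are invented.

```python
# Hypothetical illustration of rubric reliability: percent of exact
# agreement between two scorers rating the same papers on a 1-4 rubric.
# Scorer data are invented for the example.
def percent_agreement(ratings_a, ratings_b):
    """Fraction of papers on which two scorers assigned the same level."""
    matches = sum(1 for a, b in zip(ratings_a, ratings_b) if a == b)
    return matches / len(ratings_a)

scorer_1 = [4, 3, 3, 2, 4, 1]  # levels assigned by the first scorer
scorer_2 = [4, 3, 2, 2, 4, 1]  # levels assigned by the second scorer
print(percent_agreement(scorer_1, scorer_2))  # the scorers agree on 5 of 6 papers
```

More robust statistics, such as Cohen's kappa, correct for chance agreement; percent agreement is shown here only because it is the simplest to read.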

Student is a grade pre-K-12 learner.

Summative Assessment is an evaluation to comprehensively assess each student’s learning for attainment of learning outcomes and the effectiveness of an instructional segment or program.

Understanding is the ability to make connections and bind together knowledge into something that makes sense of things, rather than seeing only unclear, isolated, or unhelpful facts. Understanding is necessary so that the information can be wisely and effectively used. Knowledge does not equal understanding.

Valid means the rubric assesses what is central to the attainment of student knowledge, understandings, and skills related to the Standards/PI/CCLS.

Weighted rubric is an analytic rubric in which certain concepts/criteria are judged more heavily than others. http://www.teachervision.fen.com/teaching-methods-and-management/rubrics/4525.html
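The weighting itself is plain arithmetic: each dimension's level is multiplied by its weight before combining. A minimal sketch, with invented dimension names and weights:

```python
# Hypothetical sketch of a weighted analytic rubric: each dimension's
# level (1-4) is multiplied by its weight before summing. Dimension
# names and weights are invented for illustration.
def weighted_score(levels, weights):
    """Weighted average of per-dimension rubric levels (weights sum to 1)."""
    return sum(levels[dim] * w for dim, w in weights.items())

weights = {"Content": 0.5, "Organization": 0.3, "Conventions": 0.2}
levels = {"Content": 4, "Organization": 3, "Conventions": 2}
print(weighted_score(levels, weights))  # 4*0.5 + 3*0.3 + 2*0.2, about 3.3
```

Because the weights sum to 1, the result stays on the rubric's own 1-4 scale, which keeps the weighted score interpretable next to unweighted scores.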

ASSESSMENT PLAN

Diagnostic Tool: What do you know? (worksheet)
Prior to teaching the lesson, candidates explain their present knowledge concerning assessment, assessment moments, rubric creation, and the connection between using a rubric for instruction and for personal growth. Candidates also reflect, score their assessment, and express their current needs. The instructor collects and scores the assessment. This information is used to modify and adjust the lesson based on the candidates’ starting point, as determined by an analysis of the responses.


Rubrics Diagnostic/Summative Scoring Tool
To be used for quick scoring of each question on the Diagnostic and Summative Assessments and for further analysis of student understanding. (INTASC 6a)

4 – Heart Pounding: All-encompassing answer that gets to the heart of the concept with concise detail, using precise best-practice language.

3 – Heart Beating: Answer contains enough detail to correctly explain the concept and uses a sprinkling of educational language to describe ideas.

2 – Pulse: Answer contains generalized statements using layman’s terms, AND/OR thinking reveals some misconceptions.

1 – Needs Resuscitation: Deficient understanding of the concept area is evident in incomplete or meaningless statements, AND/OR thinking reveals serious misconceptions.

Formative Tools/learning opportunities: Learning is scaffolded by requiring candidates to complete Challenge exercises that help them analyze their perceptions and skill concerning the validity, reliability, creation, and use of rubrics. The rubrics examined are authentic rubrics created by past pre-service candidates and current classroom teachers. Reflection, class discussion, self-assessment and peer assessment of work are continual throughout the tasks. At the mid-term, candidates have the opportunity to receive feedback on their personally created rubrics from peers during a peer review, prior to the rubric’s use with students. A score is given for progress on the rubric creation at the time of the review. Differentiated tasks, which are not scored, are provided for candidates whose responses on the diagnostic tool demonstrated enough proficiency to move to the next level of rubric learning, and/or whose desire to know more motivates them to complete the tasks.

Summative Assessment: Candidates create a rubric for an authentic student assignment. Following this lesson and that creation, candidates use the rubric with their students, score the assignment using the rubric, and present the resulting student work and scored rubric to the instructor. Candidates also retake the diagnostic tool, this time called What do you know now?, to measure the increase or decrease in understanding as a result of instruction and class assignments. Both summative assessments are scored and graded using the scoring tools A RUBRIC ABOUT RUBRICS and the Rubrics Diagnostic/Summative Scoring Tool.
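The diagnostic-to-summative comparison described above reduces to a per-question subtraction. A hypothetical sketch, with per-question 1-4 scores invented for one candidate:

```python
# Hypothetical sketch of comparing one candidate's diagnostic and
# summative scores (1-4 per question, as on the Rubrics Diagnostic/
# Summative Scoring Tool). All numbers are invented for illustration.
diagnostic = [1, 2, 2, 1]  # "What do you know?" scores, before instruction
summative = [3, 4, 3, 2]   # retake scores, after instruction

# Positive values indicate an increase in understanding on that question.
growth = [after - before for before, after in zip(diagnostic, summative)]
print(growth)                     # per-question change
print(sum(growth) / len(growth))  # average growth across questions
```

Collected across a class, the same per-question differences are what the Data Sets in the Student Work section summarize by level and by student.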

A Rubric About Rubrics and alignment to INTASC Standards:
Standard 6 – Assessment: The candidate understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making.
6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.


If the candidate understands assessment, he/she constructs and uses valid and reliable strategies and tools for continuous examination of student intellectual development and growth concerning the learning outcomes. A Rubric About Rubrics monitors the candidate’s development in creating rubrics, an assessment tool aligned to INTASC Standard 6, similar to the way the candidate should demonstrate his/her capacity to monitor students in the attainment of the Standards/PI/CCLS in the rubric he/she constructs for the purposes of guiding the teacher’s and learner’s decision making.

Rationale for the diagnostic assessment being retaken as a summative assessment: Analysis of the diagnostic task (What do you know?) also models how information collected prior to learning is used to modify and adjust lesson strategies based on the students’ starting point. The summative retake, requesting the same information (What do you know now?), models how to evaluate intellectual growth from the diagnostic to the summative moment of instruction.

Use of Rubric Score in context of other semester grades/report cards: See current syllabus for updates on assignment values throughout the course.

Candidate’s Role in the assessment process: Candidates continually self-assess using A Rubric About Rubrics during the creation of their rubric. At the mid-term, candidates peer-review each other’s rubrics and receive a tentative score, which can be changed at the end of the term if the rubric is edited for improvement. Candidates have the option to continue learning about rubrics by choosing to complete the differentiated enrichment tasks, which are not scored.


A RUBRIC ABOUT RUBRICS
Levels: 4 Highly Effective | 3 Effective | 2 Developing | 1 Ineffective

DIMENSION: Dimensions and Language of Standards Assessed
Is the rubric valid? If yes: The language used to construct criteria within the rubric dimensions is linked to school problem-of-practice data and Standards/PI/CCLS for continuous formal and informal assessment of learning attainment.

4 Highly Effective: Language used to describe criteria for each dimension is precisely and obviously linked to all learning outcomes’ knowledge, skills, and understandings being assessed by what is seen in the task(s).

3 Effective: Language used to describe the criteria relates to all learning outcomes’ knowledge, skills, and understandings being assessed in the task(s).

2 Developing: Connection to all learning outcomes to be assessed cannot be determined from the language used to describe the criteria. AND/OR Criteria count/number observable features (e.g., answers all 3 problems correctly) for getting the task(s) done without describing how the feature impacts the knowledge, skills, and understandings necessary for attainment of the learning outcomes. AND/OR Criteria score extraneous features of task(s) unrelated to targeted learning outcomes (e.g., neatness, behavior, tardiness).

1 Ineffective: Language is unrelated to learning outcomes, making it unclear to the observer how the learning outcomes are being assessed by the task(s) and/or product.

DIMENSION: Level Descriptors
Is the rubric reliable? If yes: The descriptions of quality at the various levels allow for reliably consistent measurement of success across different scorers to ensure an accurate assessment of learning attainment.

4 Highly Effective: Descriptors in each level are specific enough to make measuring the level of learning outcomes’ knowledge, skills, and understandings reliably consistent across several different scorers beyond those present in the classroom.

3 Effective: Descriptors in each level are specific enough to make measuring the level of learning outcomes’ knowledge, skills, and understandings reliably consistent for the teacher and students to arrive at identical or contiguous scores.

2 Developing: Descriptors rely on opinion or lack specificity about what is seen, leading to the possibility of inconsistent scoring across observers during measurement of learning. AND/OR Descriptors merely count observable features, making the dimension a checklist of items to be completed with no description of the quality of the feature (e.g., has 3 sentences, answers 2 out of 5 math problems correctly).

1 Ineffective: Descriptors contain gaps in delineation across the rubric levels when enumerating the continuum of quality (e.g., a non-consecutive number of levels, or the work falls between levels), thus requiring the observer to infer and make subjective judgments about the learner’s level of competency.

DIMENSION: Instructional Value
Can students use the rubric to move their learning forward? If yes: The student knows and recognizes what to do next to make changes to improve throughout the learning.

4 Highly Effective: Student-created criteria and/or criteria analyzed with the teacher are written in student-friendly language and given to students prior to engagement in learning to guide student self-assessment as well as teacher assessment throughout the learning.

3 Effective: Teacher-generated criteria, written in student-friendly language, are clarified with students prior to engagement in the learning so as to guide student self-assessment as well as teacher assessment throughout the learning.

2 Developing: Teacher-generated criteria use educational language which would confuse students, OR use oversimplified language which omits essential information for guiding student self-assessment with or without teacher assistance. AND/OR Students become aware of the rubric too late in the learning to guide self-assessment formatively.

1 Ineffective: The teacher-generated rubric is used only by the teacher, not students, thereby having teacher evaluation as its purpose rather than student independent learning through self-assessment.

Comments and Score: Total Possible Score (12 out of 12). Score: _____
Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.
INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.
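The rubric’s structure (three dimensions, each scored 1–4, for a total of 12 points) can be sketched as a small data structure. The dimension and level names below come from the rubric itself; the scoring helper and the example scores are illustrative assumptions, not part of the original document.

```python
# Minimal sketch of "A Rubric About Rubrics" as a data structure.
# Dimension and level names are from the rubric; the helper is illustrative.

DIMENSIONS = [
    "Dimensions and Language of Standards Assessed",  # validity
    "Level Descriptors",                              # reliability
    "Instructional Value",
]
LEVELS = {4: "Highly Effective", 3: "Effective", 2: "Developing", 1: "Ineffective"}
MAX_SCORE = len(DIMENSIONS) * max(LEVELS)  # 3 dimensions x 4 points = 12

def total_score(scores):
    """Sum one 1-4 level score per dimension into a total out of 12."""
    assert set(scores) == set(DIMENSIONS), "score every dimension exactly once"
    assert all(s in LEVELS for s in scores.values()), "levels are 1-4"
    return sum(scores.values())

# Hypothetical example: effective validity, developing reliability,
# highly effective instructional value.
example = {
    "Dimensions and Language of Standards Assessed": 3,
    "Level Descriptors": 2,
    "Instructional Value": 4,
}
print(total_score(example), "/", MAX_SCORE)  # 9 / 12
```

The one-score-per-dimension check mirrors how the rubric is totaled: each dimension contributes exactly one level score toward the 12-point maximum.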


STUDENT WORK

DATA SET 1: Diagnostic vs. Summative “What do you Know/Now Know” results by Proficiency Level (Total Students: 12)

Assessment    Developing   Proficient   Distinguished
Diagnostic        10            2             0
Summative          5            4             3

[Bar chart: “Diagnostic vs Summative by Level”. X-axis: Levels of Proficiency (Developing, Proficient, Distinguished); y-axis: Number of Students (0–10); series: Diagnostic, Summative.]

The diagnostic vs. summative data by proficiency level show that many students grew as a result of formative instruction. Fifty percent of the students who scored at the developing level on the diagnostic assessment increased their summative assessment scores to the proficient or distinguished level after participating in formative instruction activities. NOTE: The numbering of questions on the diagnostic and summative assessments differs; however, the data reflect consistent cross-referencing of the same questions, making the data accurate.

DATA SET 2: Diagnostic vs. Summative “What do you Know/Now Know” results per Student (Out of 40 points)

[Bar chart: “Rubric Diagnostic vs Summative results per student”. X-axis: student number (1–8, 10–13); y-axis: score (0–40); series: Diagnostic, Summative.]


Student   Diagnostic   Summative
1             13           21
2             11           20
3             21           32
4             16           25
5             14           34
6             18           29
7             21           24
8             32           34
10            11           39
11            17           25
12            22           28
13            28           31

Key: Developing (0–27 points); Proficient (28–33 points); Distinguished (34–40 points)
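Applying the key above to the per-student scores in code reproduces the level counts in Data Set 1; a minimal sketch, with the scores transcribed from the table above:

```python
from collections import Counter

# Per-student (diagnostic, summative) scores from Data Set 2.
scores = {
    1: (13, 21), 2: (11, 20), 3: (21, 32), 4: (16, 25),
    5: (14, 34), 6: (18, 29), 7: (21, 24), 8: (32, 34),
    10: (11, 39), 11: (17, 25), 12: (22, 28), 13: (28, 31),
}

def level(points):
    """Map a 0-40 score onto the key: 0-27 / 28-33 / 34-40."""
    if points >= 34:
        return "Distinguished"
    if points >= 28:
        return "Proficient"
    return "Developing"

diag = Counter(level(d) for d, s in scores.values())
summ = Counter(level(s) for d, s in scores.values())
print(diag)  # matches Data Set 1: 10 Developing, 2 Proficient, 0 Distinguished
print(summ)  # matches Data Set 1: 5 Developing, 4 Proficient, 3 Distinguished
```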


The diagnostic vs. summative data per student indicate that even though 5 students remained at the developing level on the summative assessment, all students showed growth in knowledge of rubrics as a result of formative instruction, as shown by the increase from diagnostic to summative scores in the bar graph above. Those remaining at the developing level scored 3–9 points higher on the summative assessment after participating in formative instruction activities; in fact, they finished only 3–8 points short of proficiency. Five students benefited greatly from formative instruction: their scores were in the developing range on the diagnostic assessment, yet on the summative assessment 3 scored at the proficient level and 2 at the distinguished level. Of the 2 students who scored in the proficient range on the diagnostic, 1 remained proficient on the summative assessment with a 3-point increase, and 1 rose to the distinguished level.

Statistical Analyses of Rubric Diagnostic vs. Summative Assessment Results

Statistic            Diagnostic   Summative
Mean (Average)           19           29
Median                   18           29
Mode                     11           25
Standard Deviation        7            6

t-test (p-value): 0.0004079

The statistical analyses of the rubric diagnostic vs. summative assessment results confirm that formative instruction was effective: the mean increased 10 points, from a developing to a proficient level, and the median followed suit with an increase of 11 points. This shows that students were able to answer rubric-based questions more accurately once formative instruction was provided. The mode also shifted, from a low developing score of 11 to a high developing score of 25. The standard deviation indicates that diagnostic scores clustered within 7 points of the mean, in the developing range, whereas summative scores ranged from a high developing level of 23 to a distinguished level of 35. Hence the summative score increase is most likely a result of effective formative instruction. This is further supported by the t-test p-value of 0.0004079, which is below the standard alpha value of 0.05, indicating that the diagnostic and summative results differ significantly and that the gains between the 2 assessments are most likely not due to chance.
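The summary statistics above can be reproduced from the Data Set 2 per-student scores; a minimal sketch using only the Python standard library. The reported p-value of 0.0004079 is consistent with a paired test; here the paired t statistic is simply checked against the α = 0.05 critical value rather than computing an exact p-value.

```python
# Reproduce the summary statistics from Data Set 2 (students 1-8, 10-13).
# Standard library only; Python 3.8+ assumed for multimodal statistics.mode.
import math
import statistics as st

diagnostic = [13, 11, 21, 16, 14, 18, 21, 32, 11, 17, 22, 28]
summative  = [21, 20, 32, 25, 34, 29, 24, 34, 39, 25, 28, 31]

print(f"mean:   {st.mean(diagnostic):.1f} -> {st.mean(summative):.1f}")   # 18.7 -> 28.5 (reported as 19, 29)
print(f"median: {st.median(diagnostic)} -> {st.median(summative)}")       # 17.5 -> 28.5 (reported as 18, 29)
print(f"mode:   {st.mode(diagnostic)} -> {st.mode(summative)}")           # 11 -> 25, as reported
print(f"stdev:  {st.stdev(diagnostic):.1f} -> {st.stdev(summative):.1f}") # 6.5 -> 5.7 (reported as 7, 6)

# Paired t statistic on per-student gains (summative - diagnostic).
gains = [s - d for d, s in zip(diagnostic, summative)]
t = st.mean(gains) / (st.stdev(gains) / math.sqrt(len(gains)))
# 2.201 is the two-tailed critical value for df = 11 at alpha = 0.05 (t table).
print(f"t = {t:.2f}:", "significant" if abs(t) > 2.201 else "not significant")
```

A paired test is the natural choice here because each student contributes both a diagnostic and a summative score, so the test operates on each student’s gain rather than on the two groups independently.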

Application of Rubric Knowledge

DATA SET 3: Student Constructed Rubric Scores (Total and by Dimension) Using "A Rubric About Rubrics"

Rubric Scores

Dimension     Developing   Proficient   Distinguished
Validity           9            2             1
Reliability       10            2             0


[Bar charts: “Rubric Scores (Total and by Dimension)” (series: Total score, Validity, Reliability), “Rubric Validity”, and “Rubric Reliability”. X-axes: Levels of Proficiency (Developing, Proficient, Distinguished); y-axes: Number of Students.]

Although candidates showed growth in rubric knowledge on the summative assessment, they did not demonstrate transfer of that knowledge to the skill of developing rubrics, as shown by the developing-level scores on the rubrics they created for their learning experiences. Their understanding that a rubric assesses learning outcomes is demonstrated by higher scores for validity (rubric alignment to the state standards addressed in their LEs) than for reliability; however, overreliance on quantitative measures without quality descriptors resulted in a high concentration of developing-level rubrics. Knowledge did not translate into practice. Perhaps experience and authentic situations for rubric use impact the competency for creating valid and reliable rubrics.

NOTE: INSTRUCTIONAL VALUE Dimension on "A Rubric About Rubrics" was not used for assessing the candidates. This dimension was added after the July 9, 2013 Assessment/Slice Protocol with a PICCS peer review.


PROCEDURE
For materials to use during the Procedure and for Differentiated Enrichment Tasks see:
- Exercise Packet within this LE document
- Corresponding Assessment Class Constructing Rubrics PowerPoint

The Procedure may be modified for use during a Professional Development series. The time frame will vary depending on the range of instructional objectives and the time allotted for instruction and personal work. The most productive professional development schedule is three two-hour sessions with mentoring between sessions, spread over two months to give participants time to reflect and practice constructing rubrics.

Summative Objective:
Construct a valid and reliable rubric that leads to the gathering of purposeful data to document learning about what students know and are able to do from what can be heard, observed, or read concerning learning outcomes, scoring at least a level 3 or above on all rubric dimensions. The rubric must:
- document and evaluate student learning outcomes based on New York State Learning Standards/Performance Indicators (PI)/Common Core State Standards (CCLS)
- inform teacher choices about student progress to adjust instruction accordingly
- ensure students self-assess their learning strengths and needs, and set personal learning goals
- express personal growth and needs as support for the candidate’s further study and investigation

Formative Objective:
- To scaffold knowledge and skills needed to construct valid and reliable rubrics.
- To determine the quality of construction of the rubric using the Rubric Checklist.
- To determine the quality of the rubric compared to levels 3 and 4 on the A Rubric About Rubrics.
- To engage learners in their own growth and to guide the teacher’s and learner’s decision making concerning valid and reliable rubric creation.

Diagnostic Objective:
To complete a diagnostic tool that provides an indication of the candidate’s prior knowledge of the learning outcomes in order to guide the teacher’s and learner’s decision making, instruction and support, or enrichment concerning valid and reliable rubric creation.

Instructional Tools:
- Assessment Class PowerPoint with instructor notes for commentary
- Rubric EXERCISE Packet

Instructional Method Cycle for each step:
- Teacher delivers mini-lecture using PowerPoint slides.
- Candidate completes TASK.
- Group debriefs TASK; mini-lecture when appropriate.
- Personal mentoring available upon request.

5/5/2023 Pat Loncto Rubric final 17

Page 18: New York State Academy for Teaching and Learningfileserver.daemen.edu/~tlqp/PatriciaLoncto_LE/my_docu…  · Web viewis a grade pre-K-12 ... thus requiring the observer to infer

Step-by-step Procedure: Follow steps 1-10 using the coordinating PowerPoint slides and Rubric EXERCISE Packet.

1. Diagnostic Assessment and Reflection to activate and assess prior knowledge: PowerPoint Slides 1-2. Refer to Rubric EXERCISE Packet; write all answers in the packet. TASK 1: Complete the What do you know? Diagnostic Assessment. Hold until all TASKS are completed.

2. Teacher mini-lecture: PowerPoint Slides 3-17. Listen to introductory remarks by the instructor using PowerPoint slides on this lesson’s Enduring Understandings, Essential Question, Guiding Questions, and Checklists vs. Rubrics.

3. Independent Practice Candidate Tasks: PowerPoint Slides 18-19. TASK 2: Complete Formative Challenge Questions in EXERCISE Packet. Challenge A – modeled “think aloud” with instructor; Challenge B – pair “think aloud”. Debrief reliability and validity: What did you notice in the Challenge Questions?

4. PowerPoint Slide 20. TASK 2: Challenge C in EXERCISE Packet. Debrief reliability.

5. PowerPoint Slides 21-22. TASK 2: Challenge D in EXERCISE Packet. Debrief comparison of draft and final edited rubric.

6. PowerPoint Slide 23. TASK 3: Review sample rubric designs in EXERCISE Packet; make notations on samples to indicate your personal design preferences when creating a rubric. Debrief preferences for rubric design from Challenge D in EXERCISE Packet.

7. PowerPoint Slide 24. TASK 3: Contemplate. Reflect. Debrief using the Contemplation Questions and Reflection found in EXERCISE Packet and PowerPoint slide.

8. PowerPoint Slides 25-27. TASK 4: Read information on creating rubrics in EXERCISE Packet. Highlight information you want explained by the instructor. Debrief confusions about constructing rubrics.

9. SUMMATIVE ASSESSMENT – HOMEWORK and Reflection, completed on personal time for the next session. PowerPoint Slide 28. TASK 5: Construct a rubric. Submit to instructor with your self-assessed score from using the A Rubric About Rubrics (found in EXERCISE Packet) recorded in the upper right corner. Debrief directions for constructing an original rubric; collection by instructor; personal, peer, and instructor scoring.

5/5/2023 Pat Loncto Rubric final 18

Page 19: New York State Academy for Teaching and Learningfileserver.daemen.edu/~tlqp/PatriciaLoncto_LE/my_docu…  · Web viewis a grade pre-K-12 ... thus requiring the observer to infer

Ongoing Reflection TASK: Self-assess your rubric and check box when criteria are met; measure constructed rubric against A Rubric About Rubrics; adjust accordingly prior to submission to instructor.

10. Closure and Reflection: PowerPoint Slides 29-31. TASK 6: Complete the What do you know now? SUMMATIVE ASSESSMENT in EXERCISE Packet. Submit to instructor. Debrief comparison of Diagnostic and Summative Assessment and reflection.

Differentiated Enrichment TASKS found in Exercise Packet: To be used when the instructor determines the candidate’s knowledge base is advanced for the current pacing sequence, or the candidate requests acceleration and enrichment.

Differentiated Enrichment TASK A: Calculate the point value for each attribute, the maximum rubric score, the points scored on each rubric, and the overall percentage score.

Differentiated Enrichment TASK B: Design a score sheet for translating rubric scores into grades. Guiding Question: How can rubric scores be translated into grades?

Differentiated Enrichment TASK C: Review “More on RUBRICS” to address frequently asked questions. What new thoughts come to mind that could impact teaching and learning in your classroom? Construct a rubric with student input. Guiding Question: How can students be involved in the process of constructing rubrics?
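For Enrichment TASKS A and B, the score-to-grade arithmetic can be sketched in code; a minimal example in which the weights, point values, and grade bands are illustrative assumptions, not values prescribed by the packet:

```python
# Translate rubric dimension scores into a percentage and letter grade.
# Weights and grade bands below are illustrative assumptions.

def rubric_percentage(scores, weights=None, max_level=4):
    """scores: one 1-4 score per dimension; weights: optional multipliers."""
    weights = weights or [1] * len(scores)
    earned = sum(s * w for s, w in zip(scores, weights))
    possible = sum(max_level * w for w in weights)
    return 100 * earned / possible

def letter(pct):
    """Map a percentage onto hypothetical grade bands."""
    bands = [(90, "A"), (80, "B"), (70, "C"), (65, "D")]
    return next((g for cutoff, g in bands if pct >= cutoff), "F")

# A weighted rubric: the second dimension counts double, so the
# maximum is (4*1 + 4*2 + 4*1) = 16 points.
pct = rubric_percentage([3, 2, 4], weights=[1, 2, 1])
print(f"{pct:.2f}% -> {letter(pct)}")  # (3 + 4 + 4) / 16 points
```

A weighted rubric (Diagnostic question 10) simply multiplies each dimension’s earned and possible points by its weight before converting to a percentage.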

Extensions: The following possible scaffolding learning opportunities could be developed to replace or augment the learning described in this Learning Experience:
- Write a rubric strip as a whole group with teacher think aloud.
- Write a rubric strip in small group.
- Edit a rubric using the Charrette Protocol (see Appendix 5).

RESOURCES AND MATERIALS REQUIRED FOR INSTRUCTION
References:
- Brookhart, S. M. (2013). How to create and use rubrics. Alexandria, VA: ASCD.
- Goodwin, B., & Hubbell, E. (2013). The 12 touchstones of good teaching: A checklist for staying focused every day. Alexandria, VA: ASCD.
- Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130-144. Retrieved from www.sciencedirect.com
- Miller, A. K. (2013). 4 tips to get more out of rubrics. Retrieved from http://www.andrewkmiller.com/2013/04/4-tips-to-get-more-out-of-rubrics
- Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.). Alexandria, VA: ASCD.

Supplies:
- Projection for PowerPoint
- Computer
- Distance Learning Lab, optional

Student Materials {Self Made Teaching Aids (SMTA) and Commercially made}


See Rubrics EXERCISE Packet in Appendix 2.

Differentiated Instruction Table

Modification Type: Environmental and Management
Specific Modification: Distance Learning Lab or Computer Lab
Rationale: Independent study as an option for absent students or students wishing to take the unit as an independent study.
Benefits: Allows for differentiation in time and place for student access to learning.

Modification Type: Instructional
Specific Modification: Email access to instructor for formative feedback while completing exercises and summative task
Rationale: Constructing rubrics is complicated and can be overwhelming. Some students learn this skill by slowly progressing through steps with guidance.
Benefits: Provides formative support.

Modification Type: Content/Material
Specific Modification: Choice of discipline and focus for which to create a rubric
Rationale: Learning works best when the student has choice and when the task is authentic.
Benefits: Students may select a focus for the rubric for which there is an authentic use in his/her classroom.

Modification Type: Tasks
Specific Modification: Differentiated tasks are provided in the Rubrics EXERCISE Packet
Rationale: Tasks can be broken into segments for learning. For accelerated students or students who have interests beyond the scope of the unit.
Benefits: Accommodates time periods or candidate needs. Allows for differentiation according to interest or competency. Pacing fits individual needs.

Re-engagement Strategy: Explain how students who did not gain proficiency on the summative LE/LS objectives will be provided a differentiated approach through re-engagement/reteaching in order to gain proficiency: Candidates have opportunities for personal mentoring during and after the Learning Experience in order to satisfactorily create a valid and reliable rubric. The emphasis is on learning how to create a valid and reliable rubric, NOT on getting a grade. Candidate dedication to learning determines the amount and timing of requests for face-to-face or electronic mentoring.


TIME REQUIRED

Planning: One hour to prepare materials

Implementation During 2-Hour Class:

Time         Activity
4:30-4:45    Class Housekeeping
4:45-4:55    Diagnostic (Task 1)
4:55-5:00    EQ/EU/Checklist vs. Rubric
5:00-5:10    Reliability and Validity: Challenge A and B (Task 2)
5:10-5:15    Challenge C
5:15-5:30    Challenge D
5:30-5:45    Break
5:45-6:00    Task 3
6:00-6:10    Task 4
6:10-6:30    Homework
6:30-6:45    Summative / Task 6

Assessment (per student): Formative assessment scoring and feedback vary according to student need. Total time per student can range from 0-5 hours; however, the time is not spent in one sitting but over time, with several revisions to the summative assessment prior to completion.

Assessment of diagnostic and summative reflections takes about 5 minutes per student.Assessment of summative rubric takes about 15 minutes per student.

Schedule / unit plan: The Teaching to the Standards Course meets at Daemen College during the Fall and Spring semesters. The summative assessment for the course is a Learning Experience to include student work scored with a rubric. Instruction on writing rubrics (this Learning Experience) is one three-hour class with a Learning Experience and rubric peer review at mid-term and final submission at the conclusion of the semester. Students receive informal mentoring upon request and formal feedback at the mid-term.

NOTE: If this Learning Experience is used for Professional Development, the time frame will vary depending on the range of instructional objectives and the time allotted for instruction and personal work. The most productive professional development schedule would be three two-hour sessions with mentoring between sessions, spread over two months in order to give participants time to reflect and practice constructing rubrics.


REFLECTION
I began to formally develop this Learning Experience as a review class on rubrics for Daemen College pre-service students attending the college course “Teaching to the Standards” in 2004. Prior to that time I conducted several professional development courses throughout NYS on rubric writing because it is an interest, a passion, or maybe a compulsion of mine.

Detail fascinates me, sometimes to a fault, so I put this trait to productive use by perfecting workshop sessions on rubric development. I saw the benefit of using quality rubrics in my own classroom, both for me and for my students. However, I had to develop the rubrics myself and in collaboration with others because the rubrics found online and in professional books usually did not give me valid and reliable data. This led me to share my journey and expertise with others in the hope that other teachers could populate education with quality rubrics that save teachers time and help students perfect their work.

Most difficult was collecting thoughtful diagnostic and summative reflective data. Workshop time is short, and most participants want to jump into writing the rubric to just get it done so it can be used. Taking time to complete a diagnostic survey is not the participants’ priority, and it shows in incomplete and underdeveloped answers. Developing a tool to collect pre- and post-instruction knowledge, understanding, and attitudes required me to continually revise the tool as I watched participants struggle to complete it and/or scribble an answer without much intent. Next I realized I needed to quantify the data to determine participant growth – enter the need for a short-answer rubric.

The second challenge this round has been revising the A Rubric About Rubrics to account for teacher-created rubrics that are based on quantity, not quality. As my rubric was previously written, a designer could receive a valid and reliable score, yet his/her rubric would not be effective at assessing the quality of attaining the Standard, nor would it have value for student self-assessment. I found new value in writing my own exemplar for the pre- and post-assessment questionnaire in order to clarify my expectations in my own mind. Scoring the student work was time consuming because I did not really know what I was looking to find.

The process of locating benchmark papers was another strategy I used to update the A Rubric About Rubrics, as I began to identify criteria in student work that would not fit into any rubric level as written in previous versions. This is a number-one signal that a rubric needs to be adjusted. I believe this need arose because the audience for this instruction changed from veteran teachers to pre-service teachers who do not have the pedagogical vocabulary, content knowledge, experience, or operational understanding of Standards-based assessment in the classroom.

In fact I am wondering how I functioned this long with such an inferior rubric placed in a Learning Experience about writing rubrics! Maybe it proves that being an educator is a journey, and as the field of education advances so must a teacher’s practice grow and change. That is my personal experience anyway.

What you see in this Learning Experience is the best I can do without more collaborative input before I try again. To be fair to the reader and to those who want to write units of instruction: this Learning Experience developed in my head, in my personal classroom, and through professional development experience for about 10 years. The actual writing and teaching of Constructing Rubrics has gone through 20 revisions over another 10 years.

My dedication to perfecting the Rubric Learning Experience has been worth the time and effort because my rubric “groupies” have a level of competence that tells me I am leaving a legacy that will never end. As education’s pedagogical knowledge grows more precise, these professionals have the capacity to impact its growth at the same time they impact student learning. This topic of constructing rubrics is the centerpiece of my career, couched in the peer review format as my protocol of choice for improving education.


APPENDICES

Appendix 1: Floor Plan, Rules and Procedures

FLOOR PLAN

RULES and PROCEDURES


Appendix 2: EXERCISE Packet, Teacher Exemplar

Constructing Rubrics
EXERCISE Packet

Contains student learning opportunity instructions and handouts to be used throughout the Learning Experience.

Use the Procedure in the Learning Experience and the Assessment Class Constructing Rubrics PowerPoint to guide the instruction.

A good rubric: The criteria for assessment represent what is important, NOT what is easiest to count, see, and score.


Learning Experience TASKS

Consensogram: How much do you know about rubrics?

TASK 1: Complete What do you know? Diagnostic Assessment.

Listen to introductory remarks by instructor: Enduring Understandings, Essential Question, Guiding Questions, Checklists vs Rubrics

TASK 2: Complete Formative Challenge Questions. A – modeled “think aloud” with instructor; B – pair “think aloud”; C-D – independently. Participate in large group debrief: What did you notice in the Challenge Questions?

TASK 3: Review sample rubric designs; make notations on samples to indicate your personal design preferences when creating a rubric. Contemplate. Reflect.

TASK 4: Read information on creating rubrics. Highlight information you want explained by instructor.

TASK 5: SUMMATIVE ASSESSMENT - Construct a rubric. Submit to instructor with your self-assessed score from using A Rubric About Rubrics recorded in the upper right corner.

Ongoing TASK: Self-assess your rubric and check box when criteria are met; measure constructed rubric against A Rubric About Rubrics; adjust accordingly prior to submission to instructor.

TASK 6: Complete What do you know now? SUMMATIVE ASSESSMENT. Submit to instructor.


Consensogram – a quick diagnostic assessment prior to the start of the first class.

Design a chart with four columns to show the frequency distribution of responses to the question “How much do you know about rubrics?” Each candidate places a dot sticker in the column that indicates his/her starting knowledge base, measuring the group’s perceptions and allowing individuals to view their responses in relation to the entire group. The completed chart will indicate the degree of knowledge on the issue, and the instructor can adjust the lesson based on the knowledge of the candidates. Dots provide anonymity. The four columns are:

- Could teach another how to design a rubric.
- Could design a quality rubric for myself.
- Need individual guidance to design a rubric.
- Would not even try to design a rubric.

At the beginning of the first instructional session the Consensogram is discussed by the instructor as a think aloud.

The Consensogram may be given at various times during the course of the Learning Experience to see if the data changes. All candidates can then reflect on why the data changed noting growth and in some cases regression. Metacognitive analysis of these occurrences is valuable for instructional pacing and redirection as well as candidate decisions for next steps within his/her learning path.


TASK 1: What do you know? DIAGNOSTIC ASSESSMENT
Answer the following questions to the best of your ability with concise statements. No more than five minutes is given for this task. Please attempt to answer all questions with fluency. Do not labor over responses.

TASK 6: What do you know now? SUMMATIVE ASSESSMENT
Use a different writing tool, highlighter, or another method to edit Diagnostic answers according to knowledge gained.

1. How would you know the students achieved the learning outcome(s)?

Moment Purpose: State the purpose for each of the following assessment Moments:

2. Summative

3. Formative

4. Diagnostic

5. When is a rubric an appropriate assessment/scoring tool?

6. What makes a quality rubric?

7. How do rubrics influence the effectiveness of instruction?

8. How do rubrics benefit students?

9. Why is it important that numbers for rubric levels are consecutive?

10. What is a weighted rubric?

11. Use the Rubric about Rubrics to score a previously created rubric; record the score_______


Use this rubric and the Rubric about Rubrics to score your Diagnostic and Summative Assessments.

Refer to the teacher exemplar to guide the scoring. Do not add to the diagnostic answers after seeing the teacher exemplar, but use the exemplar to guide further learning during the instruction segments.

Rubrics Diagnostic/Summative Scoring Tool
To be used for quick scoring of each question on Diagnostic and Summative Assessments and further analysis of student understanding. (INTASC 6a)

4 (Heart Pounding): All-encompassing answer that gets to the heart of the concept with concise detail using precise best-practice language.

3 (Heart Beating): Answer contains enough detail to correctly explain the concept and uses a sprinkling of educational language to describe ideas.

2 (Pulse): Answer contains generalized statements using layman's terms. AND/OR Thinking reveals some misconceptions.

1 (Needs Resuscitation): Deficient understanding of the concept area is evident in incomplete or meaningless statements. AND/OR Thinking reveals serious misconceptions.

REFLECTION AFTER SCORING: Write answer below question.
What misconceptions about rubrics/assessment did I uncover? What is the one question I have about rubrics that must be answered before I can move on to constructing rubrics properly?


Teacher Exemplar: Diagnostic/Summative below
NAME:
TASK 1: What do you know? DIAGNOSTIC ASSESSMENT
Answer the following questions to the best of your ability with concise statements. No more than five minutes is given for this task. Please attempt to answer all questions.

TASK 6: What do you know now? SUMMATIVE ASSESSMENT
Use a different writing tool, highlighter, or another method to edit Diagnostic answers according to knowledge gained.

1. How would you know the students achieved the learning outcome(s)? I would design a rubric with 4 levels of competency aligned to the learning outcomes (knowledge, skills, understandings). The students and I would use the tool to monitor student progress as demonstrated in student work exhibiting the criteria related to attainment of those learning outcomes.

Moment Purpose: State the purpose for each of the following assessment Moments:

2. Summative: Evaluation to comprehensively assess student learning for attainment of learning outcomes and the effectiveness of an instructional segment or program.

3. Formative: For teachers and students to check student progress in increments of achievement toward attainment of learning outcomes, in order to adjust instruction or for students to adjust their learning tactics.

4. Diagnostic: Assess student knowledge, understanding, and skills prior to beginning a Learning Experience, thereby helping the teacher plan instruction. When compared to the summative assessment for the specified learning, the diagnostic assessment also becomes evidence of growth as a result of learning.

5. When is a rubric an appropriate assessment/scoring tool? A rubric is appropriate for authentic processes, performances, and products found in the real world.

6. What makes a quality rubric? A quality rubric is valid, that is, it assesses what is central to the attainment of student knowledge, understandings, and skills related to the Standards/PI/CCLS, AND it is reliable, that is, the rubric score is consistent across several different scorers.

7. How do rubrics influence the effectiveness of instruction? Following diagnostic assessment and during formative assessment of student learning the rubric scores signal when the teacher should modify and adjust instruction based on student needs.

8. How do rubrics benefit students? During formative self-assessment by students and teacher assessment of student learning, rubric scores signal when students need to adjust their learning tactics and/or when the teacher must provide extra support/enrichment.

9. Why is it important that numbers for rubric levels are consecutive? When rubric levels are not consecutively numbered, there are no criteria for the absent number levels, and therefore no score can be given that number value.

10. What is a weighted rubric? A rubric in which certain concepts/criteria are judged more heavily than others.

11. Use the Rubric about Rubrics to score a previously created rubric; record the score_______.
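The reliability standard used throughout this Learning Experience — scores that are consistent across several different scorers, with identical or contiguous (adjacent) levels — can be checked with a quick calculation. The sketch below is an illustration only; the `agreement_rates` helper and the two scorers' level assignments are invented.

```python
def agreement_rates(scorer_a, scorer_b):
    """Return (exact, adjacent) agreement rates between two scorers.

    Exact: both scorers gave the identical level.
    Adjacent: levels are identical or contiguous (differ by at most one).
    """
    pairs = list(zip(scorer_a, scorer_b))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

# Invented levels (1-4) assigned by two scorers to ten student papers.
teacher = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
student = [4, 3, 2, 2, 4, 2, 3, 3, 4, 3]
exact, adjacent = agreement_rates(teacher, student)
print(f"exact: {exact:.0%}, identical or contiguous: {adjacent:.0%}")
# -> exact: 70%, identical or contiguous: 100%
```

High adjacent agreement with lower exact agreement suggests the level descriptors are close but could be made more specific, which is exactly what the Level Descriptors dimension of the Rubric about Rubrics asks scorers to examine.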


A RUBRIC ABOUT RUBRICS
DIMENSIONS | 4 Highly Effective | 3 Effective | 2 Developing | 1 Ineffective

Dimensions and Language of Standards Assessed

Is the rubric valid? If yes: The language used to construct criteria within the rubric dimensions is linked to school problem-of-practice data and Standards/PI/CCLS for continuous formal and informal assessment of learning attainment.

Language used to describe criteria for each dimension is precisely and obviously linked to all learning outcomes’ knowledge, skills, and understandings being assessed by what is seen in the task(s).

Language used to describe the criteria relates to all learning outcomes’ knowledge, skills, and understandings being assessed in the task(s).

Connection to all learning outcomes to be assessed cannot be determined from the language used to describe the criteria.

AND/OR Criteria count/number observable features (e.g., answers all 3 problems correctly) for getting the task(s) done without describing how the feature impacts the knowledge, skills, and understandings necessary for attainment of the learning outcomes.

AND/OR Criteria score extraneous features of task(s) unrelated to targeted learning outcomes (e.g., neatness, behavior, tardiness).

Language is unrelated to learning outcomes making it unclear to the observer how the learning outcomes are being assessed by the task(s) and/or product.

Level Descriptors

Is the rubric reliable? If yes: The descriptions of quality at the various levels allow for reliably consistent measurement of success across different scorers to ensure an accurate assessment of the learning attainment.

Descriptors in each level are specific enough to make measuring the level of learning outcomes’ knowledge, skills and understandings reliably consistent across several different scorers beyond those present in the classroom.

Descriptors in each level are specific enough to make measuring the level of learning outcomes’ knowledge, skills and understandings reliably consistent for the teacher and students to arrive at identical or contiguous scores.

Descriptors rely on opinion or lack specificity about what is seen leading to the possibility of inconsistent scoring across observers during measurement of learning.

AND/OR Descriptors merely count observable features, making the dimension a checklist of items to be completed with no description of the quality of the feature (e.g., has 3 sentences, answers 2 out of 5 math problems correctly).

Descriptors contain gaps in delineation across the rubric levels when enumerating the continuum of quality (e.g., a non-consecutive number of levels, or the work falls between levels), thus requiring the observer to infer and make subjective judgments about the learner's level of competency.

Instructional Value

Can students use the rubric to move their learning forward? If yes: The student knows and recognizes what to do next to make changes to improve throughout the learning.

Student-created criteria, and/or criteria analyzed with the teacher, are written in student-friendly language and are given to students prior to engagement in learning to guide student self-assessment as well as teacher assessment throughout the learning.

Teacher-generated criteria, written in student-friendly language, are clarified with students prior to engagement in the learning so as to guide student self-assessment as well as teacher assessment throughout the learning.

Teacher-generated criteria use educational language which would confuse students,

OR use oversimplified language which omits essential information for guiding student self-assessment with or without teacher assistance.

AND/OR Students become aware of the rubric too late in the learning to guide self-assessment formatively.

The teacher-generated rubric is used only by the teacher, not students, making teacher evaluation its purpose rather than independent student learning through self-assessment.

Comments and Score: Total Possible Score (12 out of 12) Score:
Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.
INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.


RUBRICS TO CRITIQUE


Fact Sheet – For your information

WHEN ARE THE MOMENTS TO ASSESS?
Assessment Moments, Stakes, Purposes, and Examples

Moments: When do you assess?
Stakes: What is the level of risk for the student?
Purposes: Why assess?
Examples: How do you assess?

DIAGNOSTIC (before teaching, to determine what the student already knows)

Stakes: Low. No grade given.

Purposes: to gather data and diagnose students' knowledge and skills before learning; to plan for instruction; to place children and secure additional services.

Examples: concept map; quiz or test; on-demand task; on-demand application of skill; reflection; anecdotal notation.

FORMATIVE (while teaching, to give student and teacher opportunities to self-correct)

Stakes: Low. No grade given unless formative leads to summative. Teacher choice.

Purposes: to gather data and give feedback; to monitor or adjust instruction and services; as students, to monitor and self-correct learning tactics (the procedures used when trying to learn something).

Examples: review quiz or test; portfolio item addition; teacher-student conference; journal or log; think aloud; skill application through role play; observation checklist; peer review; reflection (combined with student, peer, and teacher checklists, rubrics, and anecdotal notation).

SUMMATIVE (after teaching, for student evaluation and to self-monitor, self-manage, and self-modify student learning and teacher performance in the future)

Stakes: High. Grade given.

Purposes: to gather data and evaluate; to make decisions regarding grades, promotion, graduation; to reflect on student learning and teacher performance.

Examples: test; portfolio submission; demonstration of a complete skill in an authentic/near-authentic situation; presentation; project; reflection (combined with teacher and possibly student or peer evaluative checklists, rubrics, and anecdotal notation).


TASK 2: Complete the following Challenge exercises.

Remember that rubrics are always works in progress. What you see here are not end products.

CHALLENGE A
Highlight the row in the Computation rubric below that will make scoring more "reliable", i.e., consistent across several different scorers. Be prepared to explain why.

Dimension | 4 | 3 | 2 | 1
Computation: Extent to which student shows mastery of computation by accurately executing all procedures, correctly applying and labeling all visual representations (charts, tables, graphs, etc.) of the problem, and demonstrating the correct use of available technology or manipulatives.

Row 1

All aspects of the solution are completely accurate by executing all procedures.

Multiple representations and labeling verify the solution.

Computations are essentially accurate; however, some of the steps are not visible.

All visual representations and labeling are complete and accurate.

Minor computational errors are clearly visible.

Representations, labeling, and inefficient procedures impede success in arriving at a correct solution.

Errors in computation are serious enough to flaw the solution.

Mathematical representations are inaccurate and labeled incorrectly.

Computation: The extent to which the student uses mathematical strategies to arrive at a correct solution.

Row 2

Chooses a direct, efficient computation strategy.

Computations methodically arrive at correct solutions.

Chooses an appropriate computation strategy that meanders to a solution.

Computations that arrive at correct solution can be followed.

Selects a plausible, yet somewhat inefficient, computation strategy, however, is unable to arrive at a correct solution.

Selects an inappropriate computation strategy making the solution irrelevant.

Computation
Row 3
4: completely accurate | 3: generally accurate | 2: minor inaccuracies | 1: major inaccuracies


CHALLENGE B

Remember that rubrics are always works in progress. What you see here are not end products.

Learning Experience Rubric for Assessment | Possible Points | Score
Reflective Journal (writings with depth – thought) | 1–4 | 4
Completion of Pre and Post tests; and Reflective Journal | 1–6 | 6
Daily participation (working together willingly and effectively):
    Day one to eight | 1–2 | 16
    Day nine | 1–4 | 4
Letter to the agency (Research, choice) | 5–10 | 10
Creation of CD-ROM:
    Accurate information | 5–20 | 20
    Appropriate action | 5–20 | 20
    Link to next screen | 5–15 | 15
    Design | 2–5 | 5

Will the samples in the CHALLENGES above lead to “reliable” scoring, i.e. consistent scoring across several different scorers? Why/why not?

Do the samples above assess what is "valid", i.e., do they assess what is central to the understanding of student knowledge and skills related to the Standards/PI? How do you know?


CHALLENGE C: Mechanics Rubric: Poor and Good Examples. Why is the Poor Example less "reliable" than the Good Examples for evaluating student work?

POOR Remember that rubrics are always works in progress. What you see here are not end products.

GOOD Remember that rubrics are always works in progress. What you see here are not end products.

GOOD Remember that rubrics are always works in progress. What you see here are not end products.


Grammar, spelling, capitalization, punctuation

Writer makes minimal mechanical errors.

Writer makes fewer than 5 mechanical errors.

5-9 mechanical errors. 10 or more mechanical errors.

Grammar, spelling, capitalization, punctuation

Correct grammar, spelling, capitalization, and punctuation are used throughout the writing piece and therefore convey the writer’s ideas accurately with immediate understanding by reader.

Minor errors in capitalization, punctuation, and spelling do not interfere with the meaning or idea of the writing piece.

Errors in capitalization, punctuation, and spelling create considerable confusion in one’s ability to understand the writing piece and interrupt reading fluency.

Essay unreadable because errors in capitalization, punctuation, and spelling frustrate the reader and comprehension stops.

Grammar, spelling, capitalization, punctuation

Correct grammar, spelling, capitalization, and punctuation assist the reader in understanding the writing piece.

Minor errors in capitalization, punctuation, and spelling allow the reader to understand the writing piece.

Errors in capitalization, punctuation, and spelling inhibit and interrupt the reader’s understanding of the writing piece.

Errors in capitalization, punctuation, and spelling prevent the reader from understanding the writing piece.


CHALLENGE D
Remember that rubrics are always works in progress. What you see here are not end products.

Compare the draft and the final rubric for the Essay Rubric. What changes occurred between the first and second drafts? How did the changes improve the rubric with regard to "reliability and validity"?

Draft Essay Rubric by Jean Henesey

Cultural Essay Rubric by Jean Henesey (Final revised edits)

Remember that rubrics are always works in progress. What you see here are not end products.

Dimensions | 4 | 3 | 2 | 1 | TOTAL


Dimensions | 4 | 3 | 2 | 1
Fact/Opinion: The essay shows an excellent understanding of ways to enable people of other cultures to live comfortably in a new community.

The essay shows a good understanding of ways to enable people of other cultures to live comfortably in a new community.

The essay shows a partial understanding of ways to enable people of other cultures to live comfortably in a new community.

The essay shows a lack of understanding of ways to enable people of other cultures to live comfortably in a new community.

Details The written details are well organized and written so the reader gets a picture in his mind.

The written details are organized and complete.

Some of the written details are organized.

The written details are disorganized and incomplete.

Required Elements

The essay contains all 3 required elements.

The essay contains 2 of the required elements.

The essay contains 1 of the required elements.

The essay contains none of the required elements.

Mechanics: Correct capitalization, punctuation, and spelling were used throughout the essay.

There are minimal errors in capitalization, punctuation, and spelling. They do not affect the ability to understand the essay.

There are some errors in capitalization, punctuation, and spelling which affects the ability to understand the essay.

There are many errors in capitalization, punctuation, and spelling.


Fact/Opinion

ELA: WR.3.1.c, WR.3.1.h, WR.3.3.b, WR.3.3.c

2X

Ideas in the essay show ways to enable people of other cultures to feel welcomed by giving juicy, factual, and important details from the story.

Ideas in the essay show ways to enable people of other cultures to feel welcomed by giving factual and important details from the story.

Ideas in the essay are from the story but are unimportant when describing ways to enable people of other cultures to feel welcomed in a new community.

Ideas are unimportant and not from the story.

Details/ Organization

ELA: WR.3.1.g

2X

The written details are well organized and the reader gets a picture in his mind.

The written details are organized and complete so the reader understands what is written.

Some of the written details are organized; however, the reader gets confused.

The written details are disorganized and incomplete.

Mechanics

2X

Correct capitalization, punctuation, and spelling were used throughout the essay.

Minor errors in capitalization, punctuation, and spelling do not interfere with the meaning or idea of the essay.

Errors in capitalization, punctuation, and spelling affect the ability to understand the essay.

Essay cannot be read because of errors in capitalization, punctuation, and spelling.

Appearance

1X

The handwriting is readable at a glance.

The handwriting is readable.

The handwriting has to be reread.

The handwriting is unreadable.

TOTAL: GRADE:
28 possible points: A = 28-26, B = 25-23, C = 22-20, D = 19-18, F = 17
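The weighting in this rubric (2X for the first three dimensions, 1X for Appearance) gives 2×4 + 2×4 + 2×4 + 1×4 = 28 possible points, which matches the grade bands listed with the rubric. A minimal sketch of the weighted-total computation follows; it is an illustration only, with the dimension levels for the sample essay invented, while the weights and grade bands are taken from the rubric above.

```python
# Dimension weights (2X, 2X, 2X, 1X) from the Cultural Essay Rubric above.
WEIGHTS = {
    "Fact/Opinion": 2,
    "Details/Organization": 2,
    "Mechanics": 2,
    "Appearance": 1,
}

def weighted_total(levels):
    """Multiply each dimension's level (1-4) by its weight and sum."""
    return sum(WEIGHTS[dimension] * level for dimension, level in levels.items())

def letter_grade(points):
    """Translate a point total into a grade using the bands above."""
    if points >= 26:
        return "A"
    if points >= 23:
        return "B"
    if points >= 20:
        return "C"
    if points >= 18:
        return "D"
    return "F"

# Invented levels for one sample essay.
levels = {"Fact/Opinion": 4, "Details/Organization": 3, "Mechanics": 4, "Appearance": 2}
total = weighted_total(levels)  # 2*4 + 2*3 + 2*4 + 1*2 = 24
print(total, letter_grade(total))  # -> 24 B
```

The multipliers make heavily weighted dimensions dominate the grade, which is the point of a weighted rubric: Appearance can swing the total by at most 3 points, while Mechanics can swing it by 6.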

Reflection: How would you continue to improve this rubric for its reliability and/or validity?


RUBRIC Sample Forms and Formats
Remember that rubrics are always works in progress. What you see here are not end products.

TASK 3: Review the following sample rubric designs; make notations on samples to indicate your personal design preferences when creating a rubric. Contemplate. Reflect.


Treasures - Descriptive Writing Rubric – Grade 1
Remember that rubrics are always works in progress. What you see here are not end products.

Criteria | 4 Brilliant Diamond | 3 Shiny Gem | 2 Unpolished Stone | 1 Rough Rock | Score

Descriptive Language: The extent to which student uses descriptive words. (museum card and journal)

Vivid, precise words create memorable pictures.

Clear interesting words bring description to life.

Words are adequate and appeal to the senses.

Vague, dull, or misused words.

Possible score 8

Ideas and Content: The extent to which the student connects information to the object. (museum card)

Accurate description, incorporating unique details related to the object.

Accurate details relate to object.

Inaccurate details confuse understanding about the object.

Rambling, inaccurate details that are unrelated to the object.

Possible score 4

Reflection: The extent to which the student connects personal experiences and feelings to the text. (journal)

Focused thoughts related to writer and the main ideas of the text.

Thoughts relate to the writer and the text.

Thoughts are unrelated to the writer's personal experience/feelings

OR inaccurate details about the connection to the text confuse the reader's understanding.

Rambling thoughts that are unrelated to the writer

AND unrelated to the text.

Possible score 4

Conventions: The extent to which student uses correct spelling, punctuation, capitalization.

Excellent control, few or no errors, making the message able to be read and understood quickly.

A few errors that do not interfere with readability and do not distract from understanding the message.

Errors interfere with readability

OR understanding the message.

Due to the number of errors, the message cannot be read

AND cannot be understood.

Possible score 4

Total Score (20/20):
Kari Schmitt


Interview Magnets Rubric - Kindergarten
Remember that rubrics are always works in progress. What you see here are not end products.

Dimension | 4 | 3 | 2 | 1 | Score

Knowledge gained

The extent to which student understands concepts of attract and repel characteristics of magnets.

(maximum 8)

Student can accurately tell the interviewer 4 or more facts about magnets independently using scientific language.

Student can accurately tell the interviewer 3 or more facts about magnets independently.

Student can accurately tell interviewer two or more facts about magnets with no more than two prompts.

Even with prompting, student is unable to give more than one fact about magnets.

Classification

The extent to which student can sort objects into categories.

(maximum 4)

When classifying objects is able to create and label own categories related to magnetism.

All objects are correctly sorted.

Is able to classify objects when provided labels and criteria related to magnetism.

All objects are correctly sorted.

Is able to classify objects when provided with labels and prompts.

Objects are correctly sorted with prompts.

Even with labels and prompting student is unable to classify or sort correctly.

Speaking

The extent to which student can share what they know and have learned about a topic.

(maximum 4)

Student can be understood immediately by engaging the listener with appropriate voice level and speaking in complete sentences.

Student can be understood when listener concentrates on what is being said.

Difficult to understand student because voice volume is low

OR student uses incomplete sentences. Listener must ask student to repeat the answers.

Difficult to understand student because voice volume is low

AND student uses incomplete sentences. Listener must ask student to repeat answers.

Journal questions (maximum 6) | Total
ELA Speaking Standards/PI: K.3.1; MST: K.4.3.1e, K.4.3.1f
Lewiston Porter Primary Bldg. Report Card Scoring (translation of rubric scores): E = 22-19, P = 18-14, N = 13

Josh Suita


Fractions Rubric – Grade 5
Remember that rubrics are always works in progress. What you see here are not end products.

Dimension | 4 | 3 | 2 | 1 | Score
Part A calculation test score (six questions/one point each): 1. 2. 3. 4. 5. 6.

Short Answer Part B: The extent to which student explains the mathematical ideas and processes used to solve the fraction problem. (24)

-Communicates through writing or illustrations: a recognition of what a fraction is, equivalent fractions, and the proper method of comparing fractions.

-Reaches conclusions by showing step by step process using mathematical language.

- Reader effortlessly understands writer’s thinking.

-Leaves the reader with no questions.

-Communicates through writing or illustrations: a recognition of what a fraction is, equivalent fractions, and the proper method of comparing fractions.

-Reaches reasonable conclusions by showing step by step process to reach solution.

-Reader must concentrate while reading answer.

-Unclear communication leaving reader with questions about process used to solve problem. Must ask student questions for reader understanding of what is written.

-Fails to address some points relevant to the solution; faulty reasoning, weak conclusions; leaves out important steps to the solution.

Communication merely restates the item or copies given data.

-Fails to address points relevant to the solution, leading to invalid conclusions; omits a significant part of the solution.

1.

2.

3.

4.

5.

6.

Fairness Reflection: The conceptual extent to which student connects use of fractions to fairness. (4)

Responds to reflection questions clearly with thoughtful general statements which are supported by strong, accurate evidence.

Responds to reflection questions with thoughtful general statements which are supported by accurate evidence.

Digresses from reflection questions which may or may not be supported or which may sometimes contain minor inaccurate evidence.

Reflection is vague, unclear, and contains significantly inaccurate evidence.

Rubric Total (28)

Final Score: Part A plus Part B (34)
Mike DiCamillo

Continued on next page


Continued from rubric above by Mike DiCamillo
Translation of Rubric Scores for grading with the Fractions Rubric

Letter Grade | Percentage | Points Earned
A+ | 100-97 | 34-33
A | 96-93 | 32
A- | 92-90 | 31
B+ | 89-87 | 30
B | 86-83 | 29
B- | 82-80 | 28
C+ | 79-77 | 27
C | 76-73 | 25-26
C- | 72-70 | 24
D | 69-65 | 23
F | Below 65 | 22 and below
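The translation table above maps a combined score — Part A (up to 6 points) plus Part B (up to 28 points), 34 in all — to a letter grade. A minimal sketch of that lookup follows; it is an illustration only, with `fractions_grade` a hypothetical helper and the sample scores invented, while the point bands are copied from the table above.

```python
# (minimum combined points, letter grade), highest band first, copied from
# the translation table above; 22 points and below earn an F.
GRADE_BANDS = [
    (33, "A+"), (32, "A"), (31, "A-"),
    (30, "B+"), (29, "B"), (28, "B-"),
    (27, "C+"), (25, "C"), (24, "C-"),
    (23, "D"),
]

def fractions_grade(part_a, part_b):
    """Combine Part A (max 6) and Part B (max 28) and translate to a grade."""
    points = part_a + part_b
    for minimum, grade in GRADE_BANDS:
        if points >= minimum:
            return grade
    return "F"

print(fractions_grade(5, 26))  # 31 combined points -> A-
```

Listing the bands highest-first and returning at the first threshold met keeps the translation table and the code in the same order, so the two can be checked against each other line by line.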

Primary Child-Friendly Rubric – Grade 1: Descriptive Writing Rubric
Remember that rubrics are always works in progress. What you see here are not end products.

Dimension | 4 | 3 | 2 | 1
I used adjectives to describe my treasure.
I used capitals and end marks correctly.


Kari Schmitt, Camille Plewa


A YEAR VIEWED FROM SPACE SCORING RUBRIC – Earth Science
Remember that rubrics are always works in progress. What you see here are not end products.

Dimension | Level 4 Above and beyond | Level 3 Complete and correct | Level 2 Almost there | Level 1 On your way | Level 0 Begin again | Level X Begin | Total

Communication Skills: Response uses communication skills to present ideas in the following format: Written: sentence structure, grammar, spelling.

Students accomplish Level 3 AND enhance communication in some significant way, such as:
- Using additional images or diagrams effectively
- Using additional formats of communication effectively

Student communicates ideas clearly with few or no technical errors.

Student may have several technical errors BUT they do not prevent the audience from understanding the message.

Student's communication is unclear OR many technical errors seriously distract the audience from understanding the message.

Student message is indiscernible, illegible or irrelevant.

Student had no opportunity to respond.

Understanding Concepts: What to look for: Response identifies and describes science concepts relevant to a particular problem or issue.

Student accomplishes Level 3 AND goes beyond in a significant way, such as:
- Using relevant information not provided in class to elaborate on your response
- Using a diagram to clarify scientific concepts
- Relating the response to other science concepts

Student accurately and completely explains and uses relevant science concepts.

Student explains or uses scientific concepts but may include irrelevant concepts or contain errors.

Student incorrectly explains or uses scientific concepts.

Student response is indiscernible, illegible, or irrelevant.

Student had no opportunity to respond.

Analyzing Data: What to look for: Response accurately summarizes data, detects patterns and trends, and draws valid conclusions based on the data used.

Student accomplishes Level 3 AND goes beyond in a significant way, such as:
- Explaining unexpected results
- Judging the value of the investigation
- Suggesting additional relevant investigation

Student analyzes and interprets data correctly and completely AND student’s conclusion is compatible with analysis of the data.

Student notes patterns or trends correctly BUT conclusion is incomplete.

Student attempts an interpretation BUT ideas are illogical OR ideas show a lack of understanding.

Data is indiscernible, illegible, or irrelevant.

Student had no opportunity to respond.

Delores Anderson

5/5/2023 Pat Loncto Rubric final 44

Page 45: New York State Academy for Teaching and Learning

Fluency Rubric Primary Grades

Remember that rubrics are always works in progress. What you see here are not end products.

4
- Reads primarily in larger, meaningful phrases
- Fluent, phrased reading with a few word-by-word slow-downs for problem solving
- Expressive interpretation is evident at places throughout the reading
- Attention to punctuation and syntax
- Rereading for problem solving may be present, but reading is generally fluent

3
- A mixture of word-by-word reading and fluent, phrased reading (expressive interpretation)
- Evidence of attention to punctuation and syntax
- Rereading for problem solving may be present

2
- Mostly word-by-word reading, but some two-word phrasing and even a couple of three- or four-word phrases (expressive interpretation)
- Evidence of awareness of syntax and punctuation, although not consistently so
- Rereading for problem solving may be present

1
- Very little fluency
- All word-by-word reading with some long pauses between words
- Almost no recognition of syntax or phrasing (expressive interpretation)
- Very little evidence of awareness of punctuation
- Perhaps a couple of two-word phrases, but generally disfluent
- Some word groupings awkward

Adapted from Fountas & Pinnell, Guided Reading, by Alice Grabowski


CONTEMPLATION:

1. What are the benefits of creating and using a rubric?

2. When is it appropriate to use a rubric?

3. What makes a rubric valid?

4. What makes a rubric reliable?

5. How does one determine the validity and reliability of rubrics found online?

6. What influences a teacher’s decision to use a rubric for assessment of learning?

7. How do rubrics benefit students?

8. How do rubrics influence the effectiveness of instruction?

9. How do rubrics indicate the achievement of objectives/learning outcomes?

10. What is an effective and efficient process for developing rubrics?

11. What is a weighted rubric?

12. What is the value of using a weighted rubric to the teacher, and to the student?

13. How is a weighted rubric constructed?

14. What are some of the challenges in constructing and using a rubric?

SCORE another educator’s rubric from this class and write brief feedback comments.

REFLECTION: How do I discover when a rubric is needed vs. when a different type of assessment tool could be sufficient (e.g. a checklist, anecdotal record, or multiple-choice test with answer key)?


TASK 4: DESIGNING RUBRICS HINTS [references Giselle Martin-Kniep, Learner-Centered Initiatives, Ltd., Wiggins and McTighe (2005), Brookhart (2013), Goodwin and Hubbell (2013)]

Read the following on constructing rubrics. Highlight information you want explained.

RUBRICS:
- are about QUALITY, not quantity.
- can be translated into numerical or letter grades.
- require experience to construct and use effectively.
- provide students with consistent self-assessment tools.
- have instructional value.
- can be viewed as rubric strips with the potential for use multiple times with a variety of tasks; strips can be mixed and matched into a rubric table, saving the teacher time during rubric construction.
- are written in student-friendly language, or in teacher-friendly language with a corresponding rubric in student-friendly language.
- use the language of the standards.
- use the lowest inference/subjectivity possible without becoming overly descriptive to the point of a checklist.
- are constructed with student input.

1. Have the students construct the rubric with you before beginning the task. Have students look at several exemplars to describe what a quality task looks like. However, construct a draft rubric for yourself first to help you guide the discussion.

2. Use the language of the Standards/PI in the descriptions of quality.

3. Describe what you see. Avoid saying what you don’t see. Poor ex. “Does not stay on task.” Good ex. “Engages in activities unrelated to work.”

4. Use consecutive numbers for levels. Poor ex. “7 5 3 1” Good ex. “4 3 2 1”. “0” is usually reserved for no product to score.

5. The number of performance levels should be an even number if students are using it for self-assessment. Poor ex. 5 or 3 levels. Good ex. 4 or 2 levels. An odd number allows the evaluator to choose the middle of the road rather than commit to a decision between competent and below-average performance.

6. Use a weighted rubric to control a passing grade or to give more importance to a dimension. For example: one dimension counts triple.

7. If you are grading and find a product scoring between levels on the rubric, you should not invent a new score because there is no description for it. Poor ex. “3+”. You should choose the level where the preponderance of evidence leads you.

8. Use the rubric as a guide for feedback and revision. Delay grading (or revise the grade); instead, confer with the student on the descriptions of quality for which he/she could aim.

9. Do not use a rubric where increases in scores are based on quantity and can be counted. Poor ex. “gives five reasons”. This becomes a checklist of things to be done, not a guide for quality.

10. Do not use a rubric when an exemplar cannot be found in the real world. Poor ex. “book report” Good ex. “book review”

11. Replace level numbers with student-friendly words that provide a mental model for the quality expected. Good ex. Nobel Prize (4), Best Seller (3), Editing (2), Drafting (1). Use student-friendly language throughout for ease of student self-assessment.

12. Post rubric in the classroom. Once developed, it is a contract and cannot be changed until the next set of students uses it.
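The weighting idea in hint 6 can be sketched in a few lines of code. This is an illustrative sketch only; the dimension weights and the level scores below are hypothetical examples, not part of the original hints.

```python
# Weighted rubric total: multiply each dimension's level score (4..1) by its
# weight, then sum. Weights and scores here are hypothetical.

def weighted_total(scores, weights):
    """Sum of level scores multiplied by their dimension weights."""
    return sum(score * weight for score, weight in zip(scores, weights))

# A 3-dimension rubric where the first dimension counts triple (hint 6):
weights = [3, 1, 1]
max_total = weighted_total([4, 4, 4], weights)  # all dimensions at level 4
student = weighted_total([3, 4, 2], weights)    # 3*3 + 4*1 + 2*1

print(max_total, student)  # 20 15
```

Because the weighted dimension dominates the total, a low score there keeps the overall result down even when the other dimensions are strong, which is the control over passing grades that hint 6 describes.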


TASK 5: Construct a rubric. Check boxes as you complete each step until all boxes are checked. Do not take shortcuts. This method saves time in the end because it allows your brain to compartmentalize the criteria you want to assess.

Teacher Process in Constructing 4-Level Rubrics
Directions: Check the box as you complete each step.

1. □ Make a VOCABULARY LIST on the My Performance Indicator Vocabulary List graphic organizer, using language from the Standards/Performance Indicators and core curriculum you have identified for the series of lessons/task.

2. □ Describe what you expect to see in QUALITY student work at the bottom of the My Performance Indicator Vocabulary List graphic organizer.

3. □ Cluster your expectations into like-categories and give them a NAME. This name will become a rubric DIMENSION. You should have no more than 6 dimensions for your rubric, and you could have as few as two. Be sure to refer to your vocabulary list.

4. □ List and define each dimension in FIRST block per row. Be sure to refer to your vocabulary list.

5. □ Arrange the dimension rows in PRIORITY order with the most important dimension in the first row.

6. □ Specifically describe what you expect to see for each dimension at 4 levels of quality by following the steps in the bullets below. Write the levels in the following order, 3-4-2-1 for one dimension before going to another dimension. Start at 3 because that is what you expect the students to be able to achieve at this grade level for these Standards/PI.

3 = Good; 4 = Beyond good; 2 = Needs work; 1 = Try again

□ Gather four pieces of scrap paper, one for each level of a specific dimension.

□ On one piece, write the qualities of a GOOD process, product, performance, or understanding for the dimension identified. To do this, close your eyes and imagine. Describe what you see. Try to avoid thinking about what you don’t see. Use your vocabulary list. After you finish, put away that piece of paper. This is IMPORTANT because you will have a more thorough rubric if you brainstorm each level independently. Later you can make the language parallel.

□ On another piece, write the qualities of a BEYOND GOOD process, product, performance, or understanding using your vocabulary list. After you finish, put away that piece of paper.

□ On the next piece of paper, write the qualities of a NEEDS WORK process, product, performance, or understanding that has many problems or errors. Describe what you see. Do not worry if the words you are using do not resemble the ones you used for the other two levels of the rubric. Use your vocabulary list. After you finish, put away that piece of paper.

□ On the last piece of paper, write the qualities of a TRY AGAIN process, product, performance, or understanding that reveals strong misinformation or misunderstanding. Use your vocabulary list.

□ Uncover all four pieces of paper and edit your levels so that all four levels of a dimension refer to the same criteria. Type them into the appropriate rows on the Rubric graphic organizer. How you label the levels, with words or numbers, is your choice, but always start with the highest level of achievement at the left. Children tend to read the first criteria column and stop. You want that column to tell the learner what quality looks like just in case they stop reading.

□ Write the next dimension, and so on. When finished, make the rubric pleasing to the eye for ease of reading. Add a graphic near the title for easy identification.


My NYS Standards/PI/CCLS Vocabulary List

Directions: List the key vocabulary for the targeted learning by referring to the selected NYS Standards/PI/CCLS and the subject’s Core Curriculum (http://www.emsc.nysed.gov/deputy/Documents/learnstandards.htm). Also add other content-focused vocabulary from personal experience.

Name:
Standards Area:
Grade Level:
Title of Unit:

List of NYS Standards/PI/CCLS stated verbatim from the NYS documents:

In the chart below, list the words in the NYS Standards/PI/CCLS selected that are significant. Pay particular attention to words that are part of that discipline’s literacy (Tier 3), for example: plot in ELA, inquiry in science, strategy in math, pitch in music.

Next, list other significant Tier 2 and Tier 3 words from personal experience when teaching this topic. Focus on words that relate to what you expect to see in rubric 3 or 4 level work representing achievement of the stated NYS Standards/Performance Indicators for this Unit of Instruction.

Key Phrases | Nouns | Verbs | Adjectives

List quality expectations in student work based on Standards/PI/CCLS:


RUBRIC

DIMENSION | 4 | 3 | 2 | 1


Ongoing task: Self-assess the rubric and check the box when criteria are met.

Checklist for Self-Assessment in Constructing Rubrics
Directions: Check the box as you assess your constructed rubric for each step below.

1. □ Students participated in the development of the rubric.

2. □ Rubric is titled.

3. □ The levels are even in number and the numbers are replaced with student- friendly mental model descriptors.

4. □ The rubric has dimensions that are defined.

5. □ The dimensions of the rubric are prioritized or placed in a purposeful order.

6. □ When quantitative terms are used, they are supported with quality attributes. Good example: “has 3 or more errors, making the product unable to be used.” Poor example: “has 3 or more errors.”

7. □ When adjectives must be used, they are clarified by an impact statement making scoring more reliable and consistent across several different scorers. Good example: “Examples are relevant, making logical connections for the reader to follow.” Poor Example: “Examples are relevant to the topic.”

8. □ The top level of the rubric is above the expected standard, but represents higher quality rather than simply more work.

9. □ The rubric uses student-friendly language that students will understand or can be taught to understand, OR includes a teacher-friendly rubric with a corresponding student-friendly version.

10. □ Continually use A Rubric About Rubrics to self-assess the constructed rubric.

11. □ Have the students use the rubric to assess work in progress. Solicit feedback from students on how well the rubric helps them, keep notes, and revise as necessary for the next group of student work.

12. □ Score student work, ask a peer teacher to score the same work, and compare results, looking for consistent scoring. Revise the rubric as necessary to improve consistency for the next group of student work.


A RUBRIC ABOUT RUBRICS

DIMENSIONS | 4 Highly Effective | 3 Effective | 2 Developing | 1 Ineffective

Dimensions and Language of Standards Assessed

Is the rubric valid? If yes: The language used to construct criteria within the rubric dimensions is linked to school problem-of-practice data and Standards/PI/CCLS for continuous formal and informal assessment of learning attainment.

Language used to describe criteria for each dimension is precisely and obviously linked to all learning outcomes’ knowledge, skills, and understandings being assessed by what is seen in the task(s).

Language used to describe the criteria relates to all learning outcomes’ knowledge, skills, and understandings being assessed in the task(s).

Connection to all learning outcomes to be assessed cannot be determined from the language used to describe the criteria.

AND/OR Criteria count/number observable features (i.e. answers all 3 problems correctly) for getting the task(s) done without describing how the feature impacts the knowledge, skills, and understandings necessary for attainment of the learning outcomes.

AND/OR Criteria score extraneous features of task(s) unrelated to targeted learning outcomes (i.e. neatness, behavior, tardiness).

Language is unrelated to learning outcomes making it unclear to the observer how the learning outcomes are being assessed by the task(s) and/or product.

Level Descriptors

Is the rubric reliable? If yes: The descriptions of quality at the various levels allow for reliably consistent measurement of success across different scorers to ensure an accurate assessment of learning attainment.

Descriptors in each level are specific enough to make measuring the level of learning outcomes’ knowledge, skills and understandings reliably consistent across several different scorers beyond those present in the classroom.

Descriptors in each level are specific enough to make measuring the level of learning outcomes’ knowledge, skills and understandings reliably consistent for the teacher and students to arrive at identical or contiguous scores.

Descriptors rely on opinion or lack specificity about what is seen leading to the possibility of inconsistent scoring across observers during measurement of learning.

AND/OR Descriptors merely count observable features, making the dimension a checklist of items to be completed with no description of the quality of the feature (i.e. has 3 sentences, answers 2 out of 5 math problems correctly).

Descriptors contain gaps in delineation across the rubric levels when enumerating the continuum of quality (i.e. non-consecutive number of levels, or the work falls between levels) thus requiring the observer to infer and make subjective judgments about the learner’s level of competency.

Instructional Value

Can students use the rubric to move their learning forward? If yes: The student knows and recognizes what to do next to make changes to improve throughout the learning.

Student-created criteria, and/or criteria analyzed with the teacher, are written in student-friendly language and given to students prior to engagement in learning to guide student self-assessment as well as teacher assessment throughout the learning.

Teacher-generated criteria, written in student-friendly language, are clarified with students prior to engagement in the learning so as to guide student self-assessment as well as teacher assessment throughout the learning.

Teacher-generated criteria use educational language which would confuse students,

OR use oversimplified language which omits essential information for guiding student self-assessment with or without teacher assistance.

AND/OR Students become aware of the rubric too late in the learning to guide self-assessment formatively.

The teacher-generated rubric is used only by the teacher, not students, thereby having teacher evaluation as its purpose rather than student independent learning through self-assessment.

Comments and Score
Total Possible Score: 12 out of 12. Score:

Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.

INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.


Differentiated TASKS:

Differentiated Enrichment TASK A: Calculate the point value for each attribute, the maximum rubric score, the points scored on each rubric, and the overall percent (%) score.
Guiding Question: How can rubric scores be translated into percentage grades?

Pre-Test: Rubric 1

Attribute | Level 4 | Level 3 | Level 2 | Level 1
A (Weight 2) Point Score: ___   X
B (Weight 2) Point Score: ___   X
C (Weight 1) Point Score: ___   X

The Maximum Rubric Point Score: ______

Point Score: ______

Percentage Score: ______

Pre-Test: Rubric 2

Attribute | Level 4 | Level 3 | Level 2 | Level 1
A (Weight 3) Point Score: ___   X
B (Weight 2) Point Score: ___   X
C (Weight 1) Point Score: ___   X
D (Weight .25) Point Score: ___   X

The Maximum Rubric Point Score: ______

Point Score: ______

Percentage Score: ______
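For readers who want to check their Task A answers, the arithmetic can be sketched as follows. This is a sketch under the assumption that each attribute is scored on levels 4 through 1 and weighted as shown in the two pre-test rubrics; the level scores in the last line are hypothetical.

```python
# The maximum point score of a weighted 4-level rubric is 4 (the top level)
# times the sum of the attribute weights.

def max_rubric_score(weights):
    return 4 * sum(weights)

def percentage(points, weights):
    """Points earned as a percent of the maximum rubric point score."""
    return round(100 * points / max_rubric_score(weights), 1)

rubric1 = [2, 2, 1]        # Pre-Test Rubric 1: attributes A, B, C
rubric2 = [3, 2, 1, 0.25]  # Pre-Test Rubric 2: attributes A, B, C, D

print(max_rubric_score(rubric1))  # 20
print(max_rubric_score(rubric2))  # 25.0

# Hypothetical scores: levels 3, 4, 2 on Rubric 1 -> 3*2 + 4*2 + 2*1 = 16
print(percentage(16, rubric1))    # 80.0
```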


Differentiated Enrichment TASK B: Review the Sample below. Design a score sheet for translating rubric scores into grades using this template.
Guiding Question: How can rubric scores be translated into grades?

SAMPLE: Translating Rubric Scores into Grades

Points Accrued (assumes 36 is the total possible score on the rubric chart for this assessment) | Percentage Equivalent Grade | School District Letter Grade (assumes a scoring translation from percent to letter grade defined by the school)

36 | 100% | A+
35, 34 | 97%, 94% | A
33 | 92% | A-
32 | 89% | B+
31, 30 | 86%, 83% | B
29 | 81% | B-
28 | 78% | C+
27 | 75% | C
26 | 72% | C-
25 | 69% | D+
24 | 67% | D
X | X | D-
23 | 64% | F

Directions: To calculate a percentage number grade when you have an analytic score, divide the number of points earned by the Total Possible Points Score. i.e. Student earns 30 points out of a total possible score of 36 points; divide 30 by 36 to get 83%. If necessary, convert the % to a letter grade based on school district ranges.

Steps:
1. Write each possible point accrued on a separate line in column one.
2. Calculate the percentage for each point accrued.
3. Place percentage value(s) in column two, writing all percent values contained within each District letter grade range from column three on the same line in column two.
4. X indicates that there is no mathematical equivalent for the letter grade range with the total points accrued for this rubric.
5. Be certain to provide verbal feedback with a personal instructive written comment either on the rubric, or individualize the response to the student by highlighting prewritten statements, applicable to the task and his/her performance, on a separate comment sheet. i.e. Highlight statement: “Your final draft improved from level 2 to 3 because you restructured your organization.”

Sometimes it is necessary to adjust total points accrued because, when the number of total possible points is low, the scores can cause grades to be too low, giving an inaccurate picture of achievement. Therefore, consider these techniques to increase total point values to be more reflective of broader achievement:

Weight the dimension(s) on the rubric giving those with more significance higher point values by multiplying the column value by a predetermined weight. i.e. Dimension “Organization” might use a multiplier of 2, so level 4 is worth 8, level 3 is worth 6, and so forth. However “Mechanics” level 4 is worth 4, and so forth.

Add rubric total raw score to objective test/quiz total raw score to determine total points accrued before calculating percentages. i.e. Rubric total possible raw score is worth 32 points, test possible raw score is worth 20 points, therefore calculations are counted down from a possible score of 52.

Combine the total point values from each of a series of tasks which use the same rubric before calculating percentages. i.e. Total point value from 3 persuasive essays scored with the same rubric.
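The sample conversion above can be expressed as a short script. This is a sketch, not part of the original handout: the percent calculation follows the directions (points earned divided by total possible), while the letter-grade cutoffs are hypothetical stand-ins for whatever ranges a district defines.

```python
# Convert rubric points to a percent, then map the percent to a letter grade.

def percent_score(points, total=36):
    """Percent grade: points earned divided by total possible (rounded)."""
    return round(100 * points / total)

# (minimum percent, letter) pairs, highest first. Hypothetical cutoffs;
# replace with your district's ranges.
CUTOFFS = [(97, "A+"), (93, "A"), (90, "A-"), (87, "B+"), (83, "B"),
           (80, "B-"), (77, "C+"), (73, "C"), (70, "C-"), (67, "D+"),
           (65, "D"), (0, "F")]

def letter_grade(percent):
    for minimum, letter in CUTOFFS:
        if percent >= minimum:
            return letter
    return "F"

pct = percent_score(30)        # the handout's example: 30 of 36 -> 83
print(pct, letter_grade(pct))  # 83 B
```

With these cutoffs, the handout's example of 30 points out of 36 lands at 83%, a B, matching the sample table.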


Translating Rubric Scores into Grades template

Points Accrued (count down from the total possible score for this assessment) | Percentage Equivalent Grade | School District Letter Grade (assumes a scoring translation from percent to letter grade defined by the school)

A+
A
A-
B+
B
B-
C+
C
C-
D+
D
D-
F

Directions: To calculate percentage number grade when you have an analytic score: divide the number of points earned by the Total Possible Points Score. i.e. Student earns 30 points out of a total possible score of 36 points; divide 30 by 36 to get 83%. If necessary, convert the % to a letter grade based on school district ranges.

Steps:
1. Write each possible point accrued on a separate line in column one.
2. Calculate the percentage for each point accrued.
3. Place percentage value(s) in column two, writing all percent values contained within each District letter grade range from column three on the same line in column two.
4. X indicates that there is no mathematical equivalent for the letter grade range with the total points accrued for this rubric.
5. Be certain to provide verbal feedback with a personal instructive written comment either on the rubric, or individualize the response to the student by highlighting prewritten statements, applicable to the task and his/her performance, on a separate comment sheet. i.e. Highlight statement: “Your final draft improved from level 2 to 3 because you restructured your organization.”

Sometimes it is necessary to adjust total points accrued because, when the number of total possible points is low, the scores can cause grades to be too low, giving an inaccurate picture of achievement. Therefore, consider these techniques to increase total point values to be more reflective of broader achievement:

Weight the dimension(s) on the rubric giving those with more significance higher point values by multiplying the column value by a predetermined weight. i.e. Dimension “Organization” might use a multiplier of 2, so level 4 is worth 8, level 3 is worth 6, and so forth. However “Mechanics” level 4 is worth 4, and so forth.

Add rubric total raw score to objective test/quiz total raw score to determine total points accrued before calculating percentages. i.e. Rubric total possible raw score is worth 32 points, test possible raw score is worth 20 points, therefore calculations are counted down from a possible score of 52.

Combine the total point values from each of a series of tasks which use the same rubric before calculating percentages. i.e. Total point value from 3 persuasive essays scored with the same rubric.


Differentiated Enrichment TASK C: Review “More on RUBRICS” to address frequently asked questions. What new thoughts come to mind that could impact teaching and learning in your classroom? Construct a rubric with student input.
Guiding Question: How can students be involved in the process of constructing rubrics?

More on RUBRICS: Jackie Goodwine & Barbara Johnson, Greece Central Schools; Audrey Korokeyi, Rochester City Schools; Kathy Miner, Clyde-Savannah Schools

Q: What deserves a rubric?

Use rubrics for authentic processes, performances, and products found in the real world.

Q: What is best practice regarding rubric use and development?

Rubric Development
- Teacher identifies important criteria prior to developing a rubric with children.
- Teacher and student refine/revise the rubric during and after use, but only use the edited rubric the next time the task is assigned, or in extreme circumstances when the original rubric becomes unusable.

Rubric Use
- Rubric is accompanied by several examples of student work for each level on the rubric.
- Rubric is used before, during, and after students work.
- Teacher models the use of the rubric for self-assessment (as a whole class, small group, and/or in individual conferences).
- Students are expected to:
  - Use the rubric to self-assess
  - Give feedback to each other using the rubric (when developmentally appropriate)
- Teacher uses the rubric to give students feedback during and after the students’ work.
- Rubric is posted in the classroom and referred to during work sessions.
- Rubric is given to students as a working tool.
- Rubric is shared with parents.

Q: Why involve students?

When students are involved in the process they...
- think about quality
- learn more about the product, performance, and process that they will complete
- internalize much of what is on the rubric
- are able to understand the language of the rubric
- are able to use the rubric more easily
- feel ownership of the rubric
- see the rubric as an instructional tool that they can use
- see the rubric as an assessment tool that they can use
- become partners in the assessment process

Q: What roles can students play?


Students can be involved in the development and use of rubrics when they are asked to:
- identify attributes of quality for a product, process, or performance.
- cluster attributes.
- draft a rubric.
- think about and discuss the importance and weighting of attributes.
- give feedback on the strengths and weaknesses of a rubric.
- refine a rubric to make it more useful.
- use a rubric to self-assess a product, process, or performance.
- use a rubric to assess anonymous student work.
- use a rubric for peer assessment.

Q: What are some of the potential problems of involving students?

Asking students to use a rubric on their own work when they are completely unfamiliar with it can be anxiety producing.

When a rubric is used alone, without models, it may not fully convey to students what quality looks like.

A rubric may be constraining in terms of allowing students to use their imagination.


4 Tips to Get More Out of Rubrics

Miller, Andrew K. (2013). 4 tips to get more out of rubrics. Retrieved from http://www.andrewkmiller.com/2013/04/4-tips-to-get-more-out-of-rubrics

When I work with educators on their professional development needs, rubrics frequently come up as something that teachers want to understand better and be able to find quickly and easily from a variety of sources for immediate use in the classroom. Often, however, rubrics found on the Internet are not of good quality—they may not be grounded in learning targets or the language may be too vague and confusing for students. The good news is that there are many great resources and tips out there to build your own rubrics. Here are some ideas to start.

Use Common Rubrics. For some students, school is one of the few stable routines in their young lives. Let’s support this safe and supportive culture by using common rubrics across subject areas or grade levels. This will help to ensure that each teacher is looking at student work objectively and lets students know that the expectations are the same regardless of the classroom. These common rubrics might be based in content or even 21st century skills. Students will appreciate these common expectations and common language around learning.

Decide Between Checklists or Rubrics. I used to fall into the trap of having too many numbers in my rubrics. I listed different numbers of sources, sentences, and so on, under each level, from approaching to exceeding a standard. Numbers don’t indicate quality. Focus on quality indicators when creating rubrics. However, if students need to have a specific number of something as a nonnegotiable, then create a checklist for them. Ask yourself, is this better on a checklist or a rubric?

Use Them! Rubrics are useless unless you use them. Why do students often throw them away or lose them? Because they don’t see the value in them as a learning tool! It is critical to have students use a rubric for an entire curriculum unit, project, or even over the course of a year. Use rubrics to set goals, provide peer- and self-assessment, and reflect on learning. Through intentional and meaningful use, rubrics can become a tool that students see as invaluable.

Focus on Learning Targets. Unless you are truly assessing creativity, it may not be appropriate to list creativity on the rubric. Similarly, neatness may not be appropriate if it isn’t directly related to the content or core discipline you intend to assess. Make sure you focus on learning targets, which could be standards or specific criteria, when you create the rubric. Articulate what the learning will look like in terms of approaching, meeting, and exceeding standards.

"[A]s learners cannot actually 'construct' their own learning (because, in Foucault's pithy phrase, they can't know what they don't know), the role of teachers cannot be reduced to that of guide and facilitator rather than as a source of strategies and expertise."—Johan Muller and Michael Young, "Three Scenarios for the Future"

My biggest recommendation is to collaborate with others to create rubrics that are specific to your school, district, and learning targets. Whether they are state, Common Core, or 21st century standards, some of the best rubrics can be developed in-house. Use these tips as well as books from ASCD to support your work in building the best rubrics.

5/5/2023 Pat Loncto Rubric final 58


Teacher Exemplar: Diagnostic/Summative below [references Giselle Martin-Kniep, Learner-Centered Initiatives, Ltd.; Wiggins and McTighe (2005); Brookhart (2013); Goodwin and Hubbell (2013)]

NAME:
TASK 1: What do you know? DIAGNOSTIC ASSESSMENT
Answer the following questions to the best of your ability with concise statements. No more than five minutes is given for this task. Please attempt to answer all questions.

TASK 6: What do you know now? SUMMATIVE ASSESSMENT
Use a different writing tool, highlighter, or another method to edit Diagnostic answers according to knowledge gained.

1. How would you know the students achieved the learning outcome(s)? I would design a rubric with 4 levels of competency aligned to the learning outcomes (knowledge, skills, understandings). The students and I would use the tool to monitor student progress as demonstrated in student work against the criteria for attainment of those learning outcomes.

Moment Purpose: State the purpose for each of the following assessment Moments:

2. Summative: Evaluation to comprehensively assess student learning for attainment of learning outcomes and the effectiveness of an instructional segment or program.

3. Formative: For teachers and students to check student progress in increments of achievement toward attainment of learning outcomes, in order to adjust instruction or for students to adjust their learning tactics.

4. Diagnostic: Assess student knowledge, understanding, and skills prior to beginning a Learning Experience, thereby helping the teacher plan instruction. When compared to the summative assessment for the specified learning, the diagnostic assessments also become evidence of growth as a result of learning.

5. When is a rubric an appropriate assessment/scoring tool? A rubric is appropriate for authentic processes, performances, and products found in the real world.

6. What makes a quality rubric? A quality rubric is valid, that is, it assesses what is central to the attainment of student knowledge, understandings, and skills related to the Standards/PI/CCLS. AND it is reliable, that is, the rubric score is consistent across several different scorers.

7. How do rubrics influence the effectiveness of instruction? Following diagnostic assessment and during formative assessment of student learning, rubric scores signal when the teacher should modify and adjust instruction based on student needs.

8. How do rubrics benefit students? During formative assessment, rubric scores from student self-assessment and teacher assessment signal when students need to adjust their learning tactics and/or when the teacher must provide extra support/enrichment.

9. Why is it important that numbers for rubric levels are consecutive? When rubric levels are not consecutively numbered, there are no criteria for the absent levels, and therefore no score can be given that number value.

10. What is a weighted rubric? A rubric in which certain concepts/criteria count more heavily than others toward the total score.

11. Use the Rubric about Rubrics to score a previously created rubric; record the score: _______.
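Two of the answers above, reliability (question 6) and weighted rubrics (question 10), come down to simple arithmetic. A sketch in Python; the ratings, weights, and criterion names below are hypothetical illustrations, not values from this document:

```python
def scores_contiguous(ratings, max_gap=1):
    """Reliability check: True when all scorers' ratings of the same
    work are identical or contiguous (within max_gap levels)."""
    return max(ratings) - min(ratings) <= max_gap

def weighted_total(weights, scores):
    """Weighted rubric: each criterion's level score is multiplied by
    its weight before totaling."""
    return sum(weights[c] * scores[c] for c in weights)

# Three scorers rate the same work on one dimension:
print(scores_contiguous([3, 3, 4]))   # True: consistent scoring
print(scores_contiguous([1, 3, 4]))   # False: descriptors need sharpening

# A weighted rubric where content counts double:
weights = {"content": 2, "organization": 1, "conventions": 1}
scores  = {"content": 3, "organization": 4, "conventions": 2}
print(weighted_total(weights, scores),
      "out of", weighted_total(weights, {c: 4 for c in weights}))
# → 12 out of 16
```

The contiguity check mirrors the rubric's own reliability language ("identical or contiguous scores"); in practice one would run it over many scored samples, not a single pair.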


A RUBRIC ABOUT RUBRICS
Levels: 4 Highly Effective | 3 Effective | 2 Developing | 1 Ineffective

DIMENSION: Dimensions and Language of Standards Assessed
Is the rubric valid? If yes: The language used to construct criteria within the rubric dimensions is linked to school problem-of-practice data and Standards/PI/CCLS for continuous formal and informal assessment of the learning attainment.

4 Highly Effective: Language used to describe criteria for each dimension is precisely and obviously linked to all learning outcomes' knowledge, skills, and understandings being assessed by what is seen in the task(s).
3 Effective: Language used to describe the criteria relates to all learning outcomes' knowledge, skills, and understandings being assessed in the task(s).
2 Developing: Connection to all learning outcomes to be assessed cannot be determined from the language used to describe the criteria. AND/OR Criteria count/number observable features (i.e., answers all 3 problems correctly) for getting the task(s) done without describing how the feature impacts the knowledge, skills, and understandings necessary for attainment of the learning outcomes. AND/OR Criteria score extraneous features of task(s) unrelated to targeted learning outcomes (i.e., neatness, behavior, tardiness).
1 Ineffective: Language is unrelated to learning outcomes, making it unclear to the observer how the learning outcomes are being assessed by the task(s) and/or product.

DIMENSION: Level Descriptors
Is the rubric reliable? If yes: The descriptions of quality at the various levels allow for reliably consistent measurement of success across different scorers to ensure an accurate assessment of the learning attainment.

4 Highly Effective: Descriptors in each level are specific enough to make measuring the level of learning outcomes' knowledge, skills, and understandings reliably consistent across several different scorers beyond those present in the classroom.
3 Effective: Descriptors in each level are specific enough to make measuring the level of learning outcomes' knowledge, skills, and understandings reliably consistent for the teacher and students to arrive at identical or contiguous scores.
2 Developing: Descriptors rely on opinion or lack specificity about what is seen, leading to the possibility of inconsistent scoring across observers during measurement of learning. AND/OR Descriptors merely count observable features, making the dimension a checklist of items to be completed with no description of the quality of the feature (i.e., has 3 sentences, answers 2 out of 5 math problems correctly).
1 Ineffective: Descriptors contain gaps in delineation across the rubric levels when enumerating the continuum of quality (i.e., non-consecutive number of levels, or the work falls between levels), thus requiring the observer to infer and make subjective judgments about the learner's level of competency.

DIMENSION: Instructional Value
Can students use the rubric to move their learning forward? If yes: The student knows and recognizes what to do next to make changes to improve throughout the learning.

4 Highly Effective: Student-created criteria and/or criteria analyzed with the teacher are written in student-friendly language and given to students prior to engagement in learning to guide student self-assessment as well as teacher assessment throughout the learning.
3 Effective: Teacher-generated criteria, written in student-friendly language, are clarified with students prior to engagement in the learning so as to guide student self-assessment as well as teacher assessment throughout the learning.
2 Developing: Teacher-generated criteria use educational language which would confuse students, OR use oversimplified language which omits essential information for guiding student self-assessment with or without teacher assistance. AND/OR Students become aware of the rubric too late in the learning to guide self-assessment formatively.
1 Ineffective: The teacher-generated rubric is used only by the teacher, not students, thereby having teacher evaluation as its purpose rather than student independent learning through self-assessment.

Comments and Score: Total Possible Score (12 out of 12). Score:
Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.
INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.
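Scored copies of this rubric later in the document record results in the shorthand "2-2=4/8": one 1-4 score per dimension, summed over the dimensions actually used (apparently two of the three, hence the 8-point maximum noted on the candidate copies). A small Python sketch of that arithmetic:

```python
# Each dimension of A Rubric About Rubrics is scored on a 1-4 scale.
MAX_LEVEL = 4

def rubric_shorthand(scores):
    """Format dimension scores in the appendices' shorthand,
    e.g. [2, 2] -> '2-2=4/8'."""
    total = sum(scores)
    maximum = MAX_LEVEL * len(scores)
    return "-".join(str(s) for s in scores) + f"={total}/{maximum}"

print(rubric_shorthand([2, 2]))  # → 2-2=4/8 (a Developing-level rubric)
print(rubric_shorthand([3, 3]))  # → 3-3=6/8
```

With all three dimensions scored, the same function yields totals out of 12, matching the "12 out of 12" noted above.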


Appendix 3: Student Work
NOTE: The numbering of questions on the diagnostic and summative assessments differs; however, the data reflect consistent cross-referencing of the same questions, making the data accurate.
NOTE: Handouts and worksheets have been changed to reflect comments for improvement received during a formal Peer Review. The data in these tables refer to the original tasks prior to editing. Previously created rubric score data is not included in this student work because that information was not required of these students at the time the original lessons were given.

Question recording scoring table for Rubric Diagnostic: What do you know?

Question   Student 2 (Developing)   Student 10 (Proficient)   Student 13 (Distinguished)
1                 2                        1                         3
2                 0                        1                         4
3                 0                        1                         2
4                 0                        1                         3
5                 2                        1                         1
6                 2                        1                         3
7                 1                        1                         1
8                 2                        2                         4
9                 0                        1                         3
10                2                        1                         4
TOTAL            11                       11                        28

Question recording scoring table for Rubric Summative: What do you know now?

Question   Student 2 (Developing)   Student 10 (Proficient)   Student 13 (Distinguished)
1                 2                        4                         3
2                 3                        4                         4
3                 0                        4                         2
4                 3                        3                         3
5                 1                        4                         2
6                 1                        4                         3
7                 2                        4                         2
8                 3                        4                         4
9                 2                        4                         4
10                3                        4                         4
TOTAL            20                       39                        31

Rubric about Rubrics Score

               Student 2 (Developing)   Student 10 (Proficient)   Student 13 (Distinguished)
Rubric Score        2-2 = 4/8                2-2 = 4/8                  3-3 = 6/8


Developing Student Candidate Diagnostic Assessment


Developing Student Candidate Summative Assessment


Developing Student Candidate Rubric


Developing Student Candidate Diagnostic and Summative Scores

Rubrics Diagnostic/Summative Scoring Tool
To be used for quick scoring of each question on Diagnostic and Summative Assessments and further analysis of student understanding. (INTASC 6a)

4 Heart Pounding: All-encompassing answer that gets to the heart of the concept with concise detail using precise best-practice language.
3 Heart Beating: Answer contains enough detail to correctly explain the concept and uses a sprinkling of educational language to describe ideas.
2 Pulse: Answer contains generalized statements using layman's terms. AND/OR Thinking reveals some misconceptions.
1 Needs Resuscitation: Deficient understanding of the concept area is evident in incomplete or meaningless statements. AND/OR Thinking reveals serious misconceptions.

Scores per Diagnostic question: 2,0,0,0,2,2,1,2,0,2 = 11
Scores per Summative question: 2,3,0,3,1,1,2,3,2,3 = 20
A RUBRIC ABOUT RUBRICS SCORE: 4
This candidate is developing because growth from diagnostic to summative assessment in content knowledge indicates misunderstandings about the creation and development of rubrics. The practical application of creating a rubric displays these misunderstandings as well, namely using quantity to distinguish levels of competency, describing what is not seen rather than what IS seen, using vague descriptors like "most" and "some", and scoring extraneous features rather than standards-based learning. These misunderstandings render this student-created rubric invalid and unreliable.


Developing Student Candidate: A RUBRIC ABOUT RUBRICS (scored copy; dimension and level-descriptor text is identical to A Rubric About Rubrics above)
Comments and Score: Total Possible Score when full rubric used (12 out of 12); Total Possible Score (8 out of 8). Score: 4
Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.
INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.


Proficient Student Candidate Diagnostic Assessment


Proficient Student Candidate Summative Assessment


Proficient Student Candidate Rubric


Proficient Student Candidate Diagnostic and Summative Scores


Rubrics Diagnostic/Summative Scoring Tool (identical to the scoring tool reproduced above): for quick scoring of each question on Diagnostic and Summative Assessments and further analysis of student understanding. (INTASC 6a)

Scores per Diagnostic question: 1,1,1,1,1,1,1,2,1,1 = 11
Scores per Summative question: 4,4,4,3,4,4,4,4,4,4 = 39
A RUBRIC ABOUT RUBRICS SCORE: 4
This candidate is proficient because growth from diagnostic to summative in content knowledge indicates a grasp of key understandings when creating and using rubrics. However, the practical application of creating a rubric is faulty with respect to two common misunderstandings about rubrics, leading to unreliable results: using quantity to distinguish levels of competency and describing what is not seen rather than what IS seen. Although the Standards are referenced only on a separate page, one can infer that the content of this rubric is standards-based. Had the candidate referenced the Standards within the rubric dimensions or on the rubric itself, and used the language of the standards in the descriptors, the rubric would have scored a 3 or 4 on validity.


Proficient Student Candidate: A RUBRIC ABOUT RUBRICS (scored copy; dimension and level-descriptor text is identical to A Rubric About Rubrics above)
Comments and Score: Total Possible Score when full rubric used (12 out of 12); Total Possible Score (8 out of 8). Score: 4
Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.
INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.


Distinguished Student Candidate Diagnostic Assessment


Distinguished Student Candidate Summative Assessment


Distinguished Student Candidate Rubric


Rubrics Diagnostic/Summative Scoring Tool (identical to the scoring tool reproduced above): for quick scoring of each question on Diagnostic and Summative Assessments and further analysis of student understanding. (INTASC 6a)

Distinguished Student Candidate Diagnostic and Summative Scores

Scores per Diagnostic question: 3,4,2,3,1,3,1,4,3,4 = 28
Scores per Summative question: 3,4,2,3,2,3,2,4,4,4 = 31
A RUBRIC ABOUT RUBRICS SCORE: 6
This candidate is distinguished because content knowledge increased slightly and the practical application of knowledge resulted in a valid and reliable student-created rubric. Referencing the Standards in the rubric dimensions could have raised this rubric to a 4 level for validity.


Distinguished Student Candidate: A RUBRIC ABOUT RUBRICS (scored copy; dimension and level-descriptor text is identical to A Rubric About Rubrics above)
Comments and Score: Total Possible Score when full rubric used (12 out of 12); Total Possible Score (8 out of 8). Score: 6
Purpose/intent of A Rubric About Rubrics: to be a universally applicable tool (across disciplines, contexts, instructional levels, at a moment in time) for reflecting on the effectiveness of a rubric that will be used to measure success throughout student learning.
INTASC 6a: The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.


Appendix 4: Individual Student Class Data – Student 9 could not be validly scored and therefore is not included in the data table.

Item responses to diagnostic and summative assessments: validity and reliability data for the student-designed rubric

Question recording scoring table for Rubric Diagnostic
(Columns are Students 1-8 and 10-13; Student 9 omitted per the note above. Original header annotations: Student 2 "Dev", Student 10 "P", Student 13 "DIS".)

Question   S1   S2   S3   S4   S5   S6   S7   S8   S10  S11  S12  S13
1           2    2    4    2    1    4    2    4    1    1    2    3
2           0    0    0    0    2    0    2    1    1    1    1    4
3           0    0    2    0    0    0    2    2    1    3    1    2
4           1    0    4    0    1    3    3    4    1    0    3    3
5           1    2    2    2    1    2    1    4    1    1    2    1
6           2    2    2    3    2    2    4    4    1    2    3    3
7           3    1    2    2    2    2    1    2    1    2    1    1
8           2    2    2    4    3    2    2    4    2    3    4    4
9           1    0    2    2    1    2    2    3    1    1    1    3
10          1    2    1    1    1    1    2    4    1    3    4    4
TOTAL      13   11   21   16   14   18   21   32   11   17   22   28

Question recording scoring table for Rubric Summative
(Columns are Students 1-8 and 10-13; Student 9 omitted per the note above.)

Question   S1   S2   S3   S4   S5   S6   S7   S8   S10  S11  S12  S13
1           1    2    4    2    2    2    2    3    4    4    3    3
2           2    3    4    1    4    2    2    1    4    1    3    4
3           2    0    4    2    4    4    4    2    4    1    2    2
4           2    3    4    3    4    3    3    4    3    2    3    3
5           1    1    3    2    4    2    1    4    4    2    2    2
6           3    1    2    4    4    4    4    4    4    4    4    3
7           3    2    1    2    2    2    1    4    4    3    1    2
8           3    3    3    4    4    3    2    4    4    3    4    4
9           3    2    3    1    2    3    1    3    4    1    2    4
10          1    3    4    4    4    4    4    4    4    4    4    4
TOTAL      21   20   32   25   34   29   24   34   39   25   28   31


A Rubric About Rubrics Scores
(Students 1-8 and 10-13; Student 9 omitted per the note above. Each entry lists the score per strip and the total out of 8.)

Student 1: 2-2 = 4/8. Descriptors are a counting of tasks, not quality vocabulary related to Standards. 2nd attribute is unrelated to Standard.
Student 2: 2-2 = 4/8. Relation to Standard is unrecognizable as stated. 2nd and 5th attributes unrelated to Standard.
Student 3: 2-2 = 4/8. Descriptors are a counting of tasks, not quality vocabulary related to Standards.
Student 4: 2-2 = 4/8
Student 5: 2-2 = 4/8
Student 6: 2-2 = 4/8
Student 7: 4-2 = 6/8
Student 8: 3-2 = 5/8
Student 10: 2-2 = 4/8. Quantitative rubric.
Student 11: 1-1 = 2/8. No teacher-friendly rubric to go with student-friendly.
Student 12: 2-3 = 5/8
Student 13: 3-3 = 6/8

NOTE: The numbering of questions on the diagnostic and summative assessments differs; however, the data reflect consistent cross-referencing of the same questions, making the data accurate.
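As a quick check on the data above, the per-student totals can be compared across the two administrations to show each student's gain. This is a minimal Python sketch, not part of the original lesson materials; it assumes the score columns correspond to Students 1-8 and 10-13 (Student 9 omitted, per the Appendix 4 note) and uses the TOTAL rows exactly as reported.

```python
# Hypothetical check script (not from the original packet).
# Totals are copied from the diagnostic and summative tables above;
# the student-to-column mapping is an assumption (Student 9 omitted).
students = [1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13]
diagnostic = [13, 11, 21, 16, 14, 18, 21, 32, 11, 17, 22, 28]
summative = [21, 20, 32, 25, 34, 29, 24, 34, 39, 25, 28, 31]

# Gain = summative total minus diagnostic total for each student.
gains = {s: post - pre for s, pre, post in zip(students, diagnostic, summative)}
for s in students:
    print(f"Student {s}: {gains[s]:+d}")
```

Under this mapping, every student in the table shows a positive gain from the diagnostic to the summative assessment.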


Appendix 5: Slice by Slice Protocol
Examining Assessment Tools Slice by Slice Protocol

Developed by Gene Thompson-Grove/Bush Educational Leadership Program, www.schoolreforminitiative.org; revised by Patricia Loncto and Catherine Sedota.

Purpose: To closely examine the assessment tool A Rubric About Rubrics with slices of corresponding student work and discuss implications. A slice is a sampling of student work across a broad sample of students (i.e. across grade levels and/or subjects) in order to gain new insights and perspectives on teaching and learning.

Roles: Facilitator, Presenter, Participants

Time: 47-60 minutes

Norms: Be respectful of the presenter, and of the student and his or her work. Contribute to substantive conversation. Try to keep your comments succinct, and monitor your own air time. Use warm and cool comments (see Tuning Protocol).

READ – 5 minutes: Individually read the protocol. Ask clarifying questions concerning the protocol.

DO Protocol – 32-45 minutes
1. 5-10 minutes - The presenter gives a brief description of the assessment's purpose and context, focus question, and protocol handouts, and answers a few clarifying questions, if necessary.

2. 3-5 minutes - Describing the Assessment Tool
The facilitator asks: "What do you see in the assessment tool?"
During this period the group gathers as much information as possible as the members describe what they see, avoiding judgments about the quality of the assessment tool or interpretations about what the assessment task asks students to do. If judgments or interpretations do arise, the facilitator should ask the person to describe the evidence on which they are based. It may be useful to list the group's observations on chart paper. If interpretations come up, they can be listed in another column for later discussion.

3. 10 minutes - Focusing on the Assessment
Each participant examines a slice of student work (sometimes with another colleague), reviewing it in depth using the presenter's focus questions to frame the conversation. Participants also write questions that emerge for them as they look at the student work slice and related assessment particulars.

4. 10-15 minutes - Interpreting the Assessment
The facilitator asks the following questions. Participants share responses and the evidence supporting their responses. (Questions vary depending on the central idea being examined for the presenter.)

From the candidates’ perspective, for what are they being assessed as they complete this assignment?

If this assessment was completed successfully by a candidate, what would it tell us about what this candidate knows, understands, and is able to do?


The focus question: How well does this assessment tool (Rubric about Rubrics) work to determine the validity and reliability of teacher-created rubrics across subject area disciplines, instructional levels (pre-K to graduate level), and geographical locations (rural, city, suburban)? Provide warm and cool feedback related to the focus question.

5. 5 minutes - Implications for Our Practice
The facilitator asks:

What are the implications of this work for teaching, learning, and assessment? What teaching and learning issues have been raised for you in terms of your own practice? What issues have been raised in terms of school-wide practices?

DEBRIEF/USES – 10 minutes
As a group, share what you have learned. Reflect on how well the protocol worked: what went well, and what could be improved (Plus-Delta). What uses do you see for this Protocol?

Summary of July 9, 2013 Slice/Slice Protocol PICCS review: Experienced educators suggested another strip be added to A Rubric About Rubrics. The impact on student self-assessment was found to be valuable for a rubric; therefore, a strip labeled "Instructional Value" was added after the review. Since this strip did not exist at the time this student work was collected, it was not used to assess the student work in this packet.


Charrette Protocol for Rubric Creation
Adapted by Patricia Loncto from the protocol developed by Kathy Juarez, Piner High School, Santa Rosa, California; revised by Gene Thompson-Grove, January 2003, and by Kim Feicke, October 2007.

INTRODUCTION
Purpose: The Charrette is a term and process borrowed from the architectural community. Its purpose is to improve a piece of work. Charrettes are used to "kick up" the level of performance. Individuals or teams call for a Charrette when they are "stuck": when the members of the team have reached a point in the process where they cannot easily move forward on their own. They bring their current ideas, or the actual work in progress, to the Charrette, and then ask the PLC to "work on the work" with them.

Using the Protocol: Charrettes are not normally held after the completion of a project. Instead, they are held in a low stakes/no stakes environment, where the requesting team has much to gain from the process and virtually nothing to lose. In short, Charrettes are used to scrutinize and improve work while it is still in progress, before it is ever placed in a high stakes environment. They can be used whenever an individual or small group has a design problem or issue.

One other consideration: the Charrette is used only when there is sufficient trust present in a group, and when the prevailing atmosphere is one of cooperation rather than competition. Underlying the successful use of the Charrette are these fundamental beliefs:

1. Individuals or groups working together can usually produce better work than individuals or groups working in isolation (“none of us is as smart as all of us”), and

2. There is no piece of work that with more time, thought and effort couldn’t be improved (“with learning there is no finish line”).

3. The emphasis is on improving the work, which now belongs to the entire group. The atmosphere is one of “we’re in this together,” and our single purpose is “to make a good thing even better.”

Roles
1. The facilitator/moderator – keeps the PLC focused, observes the protocol, and occasionally summarizes the discussion or asks questions. At the conclusion of the time allotted, briefly summarizes what was gained and opens discussion for next steps.
2. Recorder – records information as it is being created.
3. Technician – sets up the technology and locates electronic resources as needed during the Charrette (i.e., synonyms, definitions, primary reference documents).
4. Each participant – contributes ideas, follows norms.

Time: 20-60 minutes. Varies according to the time allotted for the session or when the rubric or conversation reaches a natural stopping point. No more than 1 hour per session is advised. Better to take a break and let thoughts settle.

Materials
Computer
Projection technology
Rubric electronic file under consideration
Word Wall previously constructed for topic under consideration

Norms
Have a Goldilocks mindset that searches for the "just right" idea without feeling personally attacked when another suggestion replaces your suggestion. No suggestion is wrong, and it might inspire the "just right" – say it.


All opinions are valid; they just might not be accepted as "just right" for the task by the majority. Selections of "just right" depend on what everyone can live with rather than what everyone wants.

READ: Individually read the protocol. Ask clarifying questions concerning the protocol. Topic for protocol: How can we make this better?

DO Protocol: The requesting team/individual presents the rubric "work in progress" and states what it needs.

Using a projection unit, collaboratively edit the rubric with words/phrases that describe each of the levels of competency, using the word wall as a guide or asking the technician to locate synonyms.

Once the dimensions are decided, begin with level 3 for one dimension at a time; next create level 4, then level 1 or 2 for that dimension. If ideas for levels or dimensions other than the one under consideration come to mind, jot the ideas in the appropriate box but resist the temptation to begin developing that box. (This process is described in the PICCS Resource Guide, Appendix 13.)

DEBRIEF (large group) Where do you want to go from here?

USES (large group) Brainstorm the uses for Charrette Protocol.


Appendix 6: Rubric Learning Experience Planning Documents for Alignment to INTASC Standards and 2013 Danielson Framework for Teaching Rubric

Finding the Big Ideas
Grade level: Undergraduate and Post Graduate
Directions: Fill in all the blanks with the same word for the topic of a unit. WRITE your answer for each question in the space under the question. Write the first ideas that come to your mind. Select a unit that you enjoy and like to teach. You may find that you are not quite as clear as you might be about what students should leave understanding. If you really feel "stuck", look at your Standards/PI, look at the Core Curriculum and other resources for ideas.
Purpose of finding the Big Idea: to discover what makes this topic worth your time to teach and the student's effort and time to learn.
NOTE: This design tool with prompts has been adapted from UbD.

Why study creating rubrics? So what? (topic)

To have a tool that defines quality and can be used for self improvement as well as grading.

What makes the study of creating rubrics universal? (topic)

Rubrics are the assessment tool of the times.

If the unit on creating rubrics is a story, what’s the moral of the story? (topic)

If the goal of education is independent learners then the learner needs to know what quality looks like.

What’s the Big Idea implied in the skill or process of creating rubrics? (topic)

Search for evidence of quality.

What larger concept, theme, or issue underlies creating rubrics? (topic)

Evidence of learning.

What couldn’t we do if we didn’t understand creating rubrics? (topic)

Help students become independent learners, give reliable and valid grades, adjust teaching for excellence.

How is creating rubrics used and applied in the larger world? (topic)

Evidence of achievement is what gets a wage; and for some, brings personal satisfaction.

What is real-world insight about constructing rubrics? (topic)

Teachers must give students a grade.

What is the value of studying creating rubrics? (topic)

To ensure valid and fair grading and cultivate independent learners.


MY Performance Indicator Vocabulary List

Name: Pat Loncto    Standards Area: INTASC Standards
Grade Level: Post-degree    Title of Workshop: Constructing Rubrics

Brief description of lesson: This lesson improves student achievement by enhancing the candidates' ability, grounded in the INTASC Standards, to construct appropriate quality assessment tools in the form of valid and reliable rubrics that will:
document and evaluate student learning outcomes based on New York State Learning Standards/Performance Indicators (PI)/Common Core State Standards (CCLS)
inform teacher choices about student progress and adjust instruction accordingly
ensure students self-assess their learning strengths and needs, and set personal learning goals

INTASC Standards and Performance Indicators:
Interstate New Teacher Assessment and Support Consortium (INTASC) Standard assessed: Standard 6 – Assessment
The candidate understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making.

6a The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Key Phrases: Anecdotal Notes; Assessment Tool; Learning outcome; Interstate New Teacher Assessment and Support Consortium Standards; New York State Standards/Performance Indicators; New York State Common Core State Standards (CCLS); Summative, formative, diagnostic assessment

Nouns: Assessment; Candidate; Criteria; Learner; Rubric; Student; Understanding

Verbs: Assess; Construct; Reflect; Self-assess

Adjectives: Formal; Informal; Reliable; Valid; Weighted

SUMMATIVE ASSESSMENT statement: Construct a rubric for a summative assessment. Score with A Rubric About Rubrics. Reflect on growth.


LESSON SKETCH
Discipline: EDU Teaching to the Standards Class; Professional Development

Title: Constructing Rubrics

Grade Level: Undergraduate and Post Graduate Levels

Teacher: Patricia Loncto

Enduring Understanding(s): Rubrics are instructional tools. Rubrics are assessment tools. Rubrics link assessment to learning outcomes (NYS Standards/PI/CCLS).

Essential Question(s): Did students achieve the learning outcomes? (teacher-friendly) Did they get it? (student-friendly)

Length of Periods: 2 hrs

Lesson Components: Session 1 plus Homework
Student Guiding Question(s): What questions direct this lesson and connect to the essential question?

Student Guiding Questions for Differentiated Instruction and Enrichment

What are the benefits of creating and using rubrics? How do rubrics influence the effectiveness of instruction? How do rubrics benefit students? What influences a teacher’s decision to use a rubric for assessment of learning?

What makes a rubric valid? What makes a rubric reliable? How does one determine the validity and reliability of rubrics found online?

How do rubrics indicate the achievement of objectives/learning outcomes? What is an effective and efficient process for developing rubrics? What is a weighted rubric? What is the value of using a weighted rubric to the teacher, and to the student? How is a weighted rubric constructed? What are some of the challenges in constructing and using a rubric?

How can rubric scores be translated into grades? How can students be involved in the process of constructing rubrics?

INTASC Standards/PI What do you want your students to know and/or be able to do by the end of this learning?

Interstate New Teacher Assessment and Support Consortium (INTASC) Standard assessed: Standard 6 – Assessment
The candidate understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making.
6a The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Assessment Tool(s)
How do you obtain evidence of each student's learning?

Diagnostic assessment: What do I know?
Observation with anecdotal notes
Challenge exercises
Rubric
Checklist
A Rubric About Rubrics
Summative assessment: What do I know now?
Reflection

Skills What steps/procedures do you want your students to learn during this lesson?

Construct a rubric. Self-assess.

Learning Opportunities
What are the students doing during this lesson?

Refer to Rubric PowerPoint to guide TASK progression. Use Rubric EXERCISE packet to write answers.
TASK 1: Complete What do you know? Diagnostic Assessment. Listen to introductory remarks by instructor.
TASK 2: Complete Formative Challenge Questions:
A – modeled "think aloud" with instructor; B – pair "think aloud"; C-D – independently.
Participate in large group debrief: What did you notice in the Challenge Questions?
TASK 3: Review sample rubric designs; make notations on samples to indicate your personal design preferences when creating a rubric. Contemplate in large group, small group, or individually. Reflect.
TASK 4: Read information on creating rubrics. Highlight information you want explained by instructor.
Remaining TASKS are homework assignments completed on personal time for next session.
TASK 5: Summative Assessment - Construct a rubric.
Ongoing TASK: Self-assess the rubric and check box when criteria are met; measure constructed rubric against A Rubric About Rubrics; adjust accordingly.
TASK 6: Complete What do you know now? SUMMATIVE ASSESSMENT and reflection.

Differentiated instruction and enrichment TASKS
Differentiated Enrichment TASK A: Calculate the point value for each attribute, the maximum rubric score, the points scored on each rubric, and the overall percent (%) score.
Differentiated Enrichment TASK B: Design a score sheet for translating rubric scores into grades. Guiding Question: How can rubric scores be translated into grades?
Differentiated Enrichment TASK C: Review "More on RUBRICS" to address frequently asked questions. What new thoughts come to mind that could impact teaching and learning in your classroom? Construct a rubric with student input. Guiding Question: How can students be involved in the process of constructing rubrics?
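The arithmetic asked for in Differentiated Enrichment TASKS A and B can be sketched concretely: attribute point values, a maximum rubric score, a percent score, and a score sheet that turns percents into grades. The sketch below is a hypothetical illustration, not part of the workshop packet; the function names, the doubled weight on the first attribute, and the grade bands are all assumptions chosen for the example.

```python
# Hypothetical worked example for TASKS A and B (not from the packet).

def rubric_percent(scores, weights, max_level=4):
    """Weighted points earned divided by the maximum possible points."""
    earned = sum(s * w for s, w in zip(scores, weights))
    maximum = sum(max_level * w for w in weights)
    return 100.0 * earned / maximum

def to_grade(percent):
    """One possible score sheet: percent bands mapped to letter grades."""
    bands = [(90, "A"), (80, "B"), (70, "C"), (65, "D")]
    for cutoff, grade in bands:
        if percent >= cutoff:
            return grade
    return "F"

# A 4-level rubric with three attributes; the first attribute is weighted
# double, so its point value is 8 and the maximum rubric score is 16.
scores = [3, 4, 2]    # level earned on each attribute
weights = [2, 1, 1]   # attribute weights
pct = rubric_percent(scores, weights)    # (6 + 4 + 2) / 16 = 75.0
print(f"{pct:.1f}% -> {to_grade(pct)}")  # prints "75.0% -> C"
```

A weighted rubric laid out this way makes the relative importance of each attribute explicit before the work begins, which is one concrete answer to the guiding question about translating rubric scores into grades.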

Teaching Strategies NOTES: What does the teacher need to remember during this lesson?
Copy packets
Highlighters
Post-its

Student Reflection Opportunity/Question
How do students connect their learning to their personal lives?

Diagnostic: What is the one question I have about rubrics that must be answered before I can move on to constructing rubrics properly?
Formative: How do I discover when a rubric is needed vs. when a different type of assessment tool could be sufficient (i.e. checklist, anecdotal record, multiple choice test with answer key…)?
Summative: What is the one question I have about rubrics that must be answered before I can move on to constructing rubrics properly?

Creating Rubrics LE alignment to: Interstate New Teacher Assessment and Support Consortium (INTASC) Standards
Assessment Evidence

Standard:
Interstate New Teacher Assessment and Support Consortium (INTASC) Standard assessed: Standard 6 – Assessment. The candidate understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making. 6a The teacher balances the use of formative and summative assessment as appropriate to support, verify, and document learning.

Evidence:
Candidate constructs a rubric to be used as formative and summative assessments that support, verify, and document the candidate's learning, for purposes of documenting his/her students' learning, aligned to learning outcomes.

Candidate uses assessment strategies such as self-assessment, reflection, and peer and instructor conferencing to assess his/her knowledge about rubric creation before, during, and at the end of the lesson to monitor personal learning progress and to guide the instructor's teaching decisions.

Candidates = participants who complete exercises


Creating Rubrics LE alignment to 2013 Danielson Framework for Teaching Rubric
The use of a valid and reliable rubric can impact a teacher's competency at the "Highly Effective" level in the following Domains and Elements within the Danielson Framework Rubric. Evidence explaining the congruency between a valid and reliable rubric and the Danielson criteria follows each element below.

1a Knowledge of Content and Pedagogy: Teacher displays extensive knowledge of the important concepts in the discipline and the ways they relate both to one another and to other disciplines. Teacher demonstrates understanding of prerequisite relationships among topics and concepts and understands the link to necessary cognitive structures that ensure student understanding. Teacher's plans and practice reflect familiarity with a wide range of effective pedagogical approaches in the discipline and the ability to anticipate student misconceptions.
Evidence: A valid rubric uses the discipline concepts as the basis for the criteria described. When students use the rubric to assess learning, their misconceptions are revealed by the level of understanding at which they score.

1c Setting Instructional Outcomes: All outcomes represent high-level learning in the discipline. The outcomes are clear, are written in the form of student learning, and permit viable methods of assessment. Outcomes reflect several different types of learning and, where appropriate, represent opportunities for both coordination and integration. Outcomes are differentiated in whatever way is needed for individual students.
Evidence: A valid rubric explicitly states the quality of learning for the targeted outcomes. The revelation of the outcomes for the student and the teacher uncovers the need and direction for differentiation.

1f Designing Student Assessments: Teacher's plan for student assessment is fully aligned with the instructional outcomes and has clear criteria and standards that show evidence of student contribution to their development. Assessment methodologies have been adapted for individual students, as needed. The approach to using formative assessment is well designed and includes student as well as teacher use of the assessment information. Teacher intends to use assessment results to plan future instruction for individual students.
Evidence: A valid rubric aligns to instructional outcomes and has increased reliability when students contribute to the development because the teacher is assured that students understand the criteria for success. Valid rubrics used as formative assessment become instructional tools for student and teacher decisions during the learning process.

2b Establishing a Culture of Learning: The classroom culture is a cognitively vibrant place, characterized by a shared belief in the importance of learning. The teacher conveys high expectations for learning by all students and insists on hard work. Students assume responsibility for high quality by initiating improvements, making revisions, adding detail and/or assisting peers in their precise use of language.
Evidence: A valid rubric conveys high expectations at the 4 level. When students contribute to the rubric construction and formatively assess their work, they assume responsibility for their learning.

3d Using Assessment in Instruction: Assessment is fully integrated into instruction through extensive use of formative assessment. Students appear to be aware of, and there is some evidence that they have contributed to, the assessment criteria. Questions and assessments are used regularly to diagnose evidence of learning by individual students. A variety of forms of feedback, from both teacher and peers, is accurate and specific and advances learning. Students self-assess and monitor their own progress. Teacher successfully differentiates instruction to address individual students' misunderstandings.
Evidence: See Evidence above. When students participate in the creation of a valid rubric and when the rubric is used formatively during instruction by the student, peers, and the teacher, the resulting feedback advances learning because the feedback links to accurate and specific information about the quality of advancement toward the learning outcomes.


Creating Rubrics LE alignment to 2012 NYSUT Teacher’s Practice Rubric

The use of a valid and reliable rubric can impact a teacher's competency in the following Elements within the NYSUT Rubric. Alignment evidence explaining the congruency between a valid and reliable rubric and the NYSUT Element follows each element below.

Element II.4: Articulates learning objectives/goals with learning standards
Evidence: Valid rubric dimensions and levels of success aligned with learning standards articulate expectations for all students.

Element III.1A: Aligns instruction to standards
Evidence: Use of valid rubrics actively and cognitively promotes student-to-student and student-to-teacher interactions through conversations about progress toward learning target standards. Student use of rubrics helps them become aware of the learning standards so they are able to make connections between different learning experiences and learning standards, thereby having the capacity to become responsible for their own learning.

Element III.3A: Articulates measures of success
Evidence: Valid rubrics articulate high expectations for all students at the 4th level of competency. Students have a clear understanding of measures of success when teachers analyze the rubric criteria with students, emphasizing and articulating how their success will be measured.

Element III.6A: Uses formative assessment to monitor and adjust pacing
Evidence: Use of valid rubrics is a way for teachers to monitor and assess student progress: check for student understanding, seek and provide feedback, and adapt instruction pace, focus, or delivery to meet student needs revealed by rubric scores.

Element V.1A: Designs and/or selects assessments to establish learning goals and inform instruction
Evidence: Valid rubrics are tools that can be used to diagnose entry-level student competencies and establish learning goals, used during formative assessment for student self-assessment of learning progress in order to evaluate instructional effectiveness and modify instruction, and used again summatively to measure and record student achievement of standards-based knowledge, skills, and understandings.

Element V.5A: Communicates purposes and criteria
Evidence: Valid rubric descriptions of levels of success articulate assessment criteria to students and provide parameters for success.


Appendix 7: Review of Literature
Topic: The use of scoring rubrics as "Best Practice" to facilitate valid judgment of learning competencies and modification of teaching.

Essential Question for the Review of Literature/Theories: How do educational theories/researchers justify the time it takes to create valid and reliable rubrics?

Basic premise for developing this Learning Experience: A scoring rubric is a tool used as a formative and summative assessment strategy to judge and evaluate the continuous intellectual, social, and physical development of the learner.

__________________________________________________________________________________
The theories and research behind the "why" of rubrics include:
o Cognitive Learning Theory
o Constructivist Approach
o Social Development Theory

Cognitive Learning Theory is concerned with understanding thought processes and assumes that humans are logical beings who make choices to solve problems with intelligence and conscious thought. The learning that results is a behavioral change based on the acquisition of new information added to existing knowledge, enabling the person to make the appropriate modifications to accommodate that information.

Key theorists: Educational psychologists such as Jean Piaget and William Perry developed a cognitive approach that focused on mental processes rather than observable behavior. Albert Bandura's emphasis on the capacity of agents to self-organize and self-regulate eventually gave rise to his later work on self-efficacy. Researchers A. L. Brown, J. D. Ferrara, and Xiaodong Lin highlight the impact of metacognitive processes on independent learning and motivation.

Common principles:
Knowledge is actively constructed by learners based on their existing cognitive structures. Learners use such factors as past learning and cultural background to organize their experience and to select and transform new information.

Knowledge is therefore actively constructed by the learner rather than passively absorbed; it is essentially dependent on the standpoint from which the learner approaches it.

Expectations of personal efficacy determine whether coping behavior will be initiated, how much effort will be expended, and how long it will be sustained in the face of obstacles and aversive experiences.

Persistence in activities that are subjectively threatening but in fact relatively safe produces, through experiences of mastery, further enhancement of self-efficacy and corresponding reductions in defensive behavior.

Implications for teaching: Because knowledge is actively constructed, learning is presented as a process of active discovery. The role of the teacher is to facilitate discovery by providing the necessary resources in a safe environment and by guiding learners as they attempt to assimilate new knowledge to old and to modify the old to accommodate the new. Teachers must thus take into account the learner's current knowledge and cultural background when deciding how to construct the opportunities for integrating new material and how to illuminate the path to success. This knowledge is also valuable when conferencing with students about learning progress.

5/5/2023 Pat Loncto Rubric final 94

Learners are not motivated by extrinsic factors such as rewards and punishment, but rather intrinsically, through personal investment on the part of the learner (Perry 1999, 54). Learners must face up to the limitations of their existing knowledge and accept the need to modify or abandon beliefs. According to Bandura (1977), existing expectations of personal efficacy are derived from factors such as performance accomplishments and persuasion.

Because learning is largely self-motivated in the cognitivist framework, cognitivists such as Brown and her colleagues (1983) suggest that methods requiring students to monitor their own learning have the potential to facilitate learners' self-regulation of their learning through a metacognitive process. Metacognitive research implies that the learner's internal metacognitive functioning, that is, the ability to accurately monitor, reflect, evaluate, and adjust, is promising and encouraging in support of learners' improved independent learning abilities as well as their motivation to learn (Lin, 1994).

Benefit of using a rubric: Asking students to explain their successes and gaps in learning as described in the rubric can assist them in assimilating new ideas into their existing vision of success and provide expectations for successful accomplishment, a metacognitive exercise. Likewise, providing students with sets of criteria in a rubric from which to construct their learning makes it easier for them to highlight certain competencies and persist toward mastery. For instance, the use of rubrics enables students to monitor progress privately with the teacher (who understands their cultural background, past experiences, and learning style) or with a trustworthy peer who can draw attention to any recurring difficulties needing modification for success within a safe environment.

References:

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191-215. doi:10.1037/0033-295X.84.2.191

Brown, A. L., Bransford, J. D., Ferrara, R. A., & Campione, J. C. (1983). Learning, remembering and understanding. In Flavell, J. H., & Markman, E. M. (Eds.), Handbook of child psychology: Cognitive development (pp. 77-166). New York: Wiley.

Lin, X.D. (1994). Metacognition: Implications for research in hypermedia-based learning environment. In Proceedings of Selected Research and Development Presentations at the 1994 National Convention of the Association for Educational Communications and Technology Sponsored by the Research and Theory Division (pp. 16-20). Nashville, TN.

Perry, William G. (1999). Forms of ethical and intellectual development in the college years. San Francisco: Jossey-Bass Publishers.

Piaget, Jean. (1968). Six psychological studies (Anita Tenzer, Trans.). New York: Vintage Books.


The Constructivist Approach suggests looking for what students can generate and demonstrate, not for what students can repeat or imitate.

Key theorists: Jerome Bruner states: "To instruct someone… is not a matter of getting him to commit results to mind. Rather, it is to teach him to participate in the process that makes possible the establishment of knowledge. We teach a subject not to produce little living libraries on that subject, but rather to get a student to think … for himself, to consider matters… to take part in the process of knowledge-getting. Knowing is a process not a product" (1966, p. 72). In his more recent work, Bruner (1986, 1990, 1996) expanded his theoretical framework to encompass the social and cultural aspects of learning, which relates to the Social Development Theory (Smith, 2002).

Thelma Cey (2001) in a paper for the University of Saskatchewan catalogues several educational experts such as: Maor (1999) who speaks of the changing role of the teacher in a constructivist classroom where the teacher becomes the facilitator or coach who does not possess all the knowledge; Murphy (1997) who discusses how important it is for the teacher to utilize errors as a way of providing feedback for the learner’s understanding; and Petraglia (1998) who claims that a constructivist environment gives the best hope for the educator to intervene in the learning that is occurring in contrast to the educator being in charge of the act of learning.

Common principles:

o Learning is an active process in which learners select and transform information, construct hypotheses, and make decisions to construct new ideas or concepts.

o Instruction must be structured so that it can be easily grasped by the student (spiral organization).

o Instruction should be designed to facilitate extrapolation and/or fill in the gaps (going beyond the information given).

o The teacher is a facilitator of learning who does not possess all the knowledge.

o Feedback on learning occurs during the process of learning.

Implications for teaching: The instructor and student engage in an active dialogue that encourages students to discover principles by themselves. The task of the instructor is to translate information to be learned into a format appropriate to the learner's current state of understanding so that students continually build upon what they have already learned. Bruner (1966) states that a theory of instruction should address four major aspects: (1) predisposition towards learning, (2) the ways in which a body of knowledge can be structured so that it can be most readily grasped by the learner, (3) the most effective sequences in which to present material, and (4) the nature and pacing of rewards and punishments. Good methods for structuring knowledge should result in simplifying, generating new propositions, and increasing the manipulation of information.

Benefit of using a rubric: A rubric provides a framework to help learners reflect (metacognition) on their perceptions of success in contrast to the teacher's expectations. Once the teacher and student compare the level of competency during the formative stages of learning, the teacher can build upon the student's strengths while guiding the student to discover new ways to manipulate the information or the learning process. The teacher may also discover a personal need for modification in pacing or in the sequence of presenting material, and thereby adjust classroom instruction.


References:

Bruner, J. S. (1966). Toward a theory of instruction. Cambridge, Massachusetts: Belknap Press.

Cey, Thelma. (2001). Moving towards constructivist classrooms. Saskatoon, Saskatchewan. Retrieved from http://www.usask.ca/education/coursework/802papers/ceyt/ceyt.pdf

Maor, D. (1999). Teachers-as-learners: the role of a multi-media professional development program in changing classroom practice. Australian Science Teachers Journal, 45(3), 45-51.

Murphy, E. (1997). Constructivism from philosophy to practice. Retrieved from http://www.stemnet.nf.ca/~elmurphy/cle.html

Petraglia, Joseph. (1998). The real world on a short leash: the (mis)application of constructivism to the design of educational technology. ETR&D, 46(3), 53-65.

Smith, M. K. (2002). Jerome S. Bruner and the process of education. The encyclopedia of informal education. Retrieved from http://infed.org/mobi/jerome-bruner-and-the-process-of-education

Social Development Theory explains human behavior in terms of continuous reciprocal interaction among cognitive, behavioral, and environmental influences, thus intersecting with Cognitive Learning Theory and the Constructivist Approach.

Key theorists: Lev Vygotsky (1978) states: "Every function in the child's cultural development appears twice: first, on the social level, and later, on the individual level; first, between people (interpsychological) and then inside the child (intrapsychological). This applies equally to voluntary attention, to logical memory, and to the formation of concepts. All the higher functions originate as actual relationships between individuals" (p. 57).

According to Vygotsky (1978) “much important learning by the child occurs through social interaction with a skillful tutor. The tutor may model behaviors and/or provide verbal instructions for the child. Vygotsky refers to this as cooperative or collaborative dialogue. The child seeks to understand the actions or instructions provided by the tutor (often the parent or teacher) then internalizes the information, using it to guide or regulate their own performance.”

“A second aspect of Vygotsky's theory is the idea that the potential for cognitive development depends upon the "zone of proximal development" (ZPD): a level of development attained when children engage in social behavior. Full development of the ZPD depends upon full social interaction. The range of skill that can be developed with adult guidance or peer collaboration exceeds what can be attained alone (instructionaldesign.org).”

Vygotsky's theory corresponds to Bandura's work on social learning. Bandura (1977) states: "Learning would be exceedingly laborious, not to mention hazardous, if people had to rely solely on the effects of their own actions to inform them what to do. Fortunately, most human behavior is learned observationally through modeling: from observing others one forms an idea of how new behaviors are performed, and on later occasions this coded information serves as a guide for action" (p. 22).

Common principles:

o Full cognitive development requires social interaction.

o The highest level of observational learning is achieved by first organizing and rehearsing the modeled behavior symbolically, then enacting it overtly.

o Coding modeled behavior into words, labels, or images results in better retention than simply observing.

Implications for teaching: Because social learning theory encompasses attention, memory, and motivation, it spans both cognitive and social theories. The component processes underlying Bandura's observational learning include retention through symbolic coding and organization, self-observation and reflection, accuracy of feedback, and self-reinforcement (instructionaldesign.org). Support can take the form of teacher demonstration, model assignments/projects, peer collaboration, and reflection opportunities, all of which can provide motivation for perseverance and an increase in self-efficacy.

Benefit of using a rubric: The emphasis on social learning having a central role in cognitive development presents several rationales for using rubrics. The organization of the dimensions and corresponding detailed levels of performance in a rubric guides students through self-observation and reflection, gives immediate feedback which can be discussed in collaboration with peers and the teacher, and is accompanied by benchmarks allowing for observation of the criteria in model work. The repeated use of the rubric reinforces the retention of criteria for success. The collaborative assessment of learning with peers and teachers impacts student motivation for perseverance and self-efficacy. As students progress to higher achievement levels with social support, they begin to believe in their possibility of success and thus are motivated to persevere.

References:

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.

Bandura, A. (1977). Social learning theory. New York: General Learning Press.

Culatta, Richard. (2013). Social learning theory (Albert Bandura). Retrieved from http://www.instructionaldesign.org/theories/social-learning.html

Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press. Retrieved from http://www.simplypsychology.org/vygotsky.html

_________________________________________________________________________


RESEARCHER INFORMATION – Annotated Bibliography

Jonsson, Anders, & Svingby, Gunilla. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Malmo, Sweden. Educational Research Review, 2(2), 130-144. Retrieved from www.sciencedirect.com

The paper focuses on three questions:
1. Does the use of rubrics enhance the reliability of scoring?
2. Can rubrics facilitate valid judgment of performance assessments?
3. Does the use of rubrics promote learning and/or improve instruction?

Abstract: "Several benefits of using scoring rubrics in performance assessments have been proposed, such as increased consistency of scoring, the possibility to facilitate valid judgment of complex competencies, and promotion of learning. This paper investigates whether evidence for these claims can be found in the research literature. Several databases were searched for empirical research on rubrics, resulting in a total of 75 studies relevant for this review. Conclusions are that: (1) the reliable scoring of performance assessments can be enhanced by the use of rubrics, especially if they are analytic, topic-specific, and complemented with exemplars and/or rater training; (2) rubrics do not facilitate valid judgment of performance assessments per se. However, valid assessment could be facilitated by using a more comprehensive framework of validity when validating the rubric; (3) rubrics seem to have the potential of promoting learning and/or improve instruction. The main reason for this potential lies in the fact that rubrics make expectations and criteria explicit, which also facilitates feedback and self-assessment."

Brookhart, S. M. (2013). How to create and use rubrics. Alexandria, Virginia: ASCD.

The Preface of this book is worth reading, particularly for those educators who have heard of, used, and developed rubrics for years. Throughout the chapters of her book, Susan Brookhart corrects the myths and misconceptions that surround the development and use of rubrics, which have become a "norm" for those untrained in developing valid and reliable rubrics. This book, in combination with the work of McTighe and Wiggins in Understanding by Design (2005), the research by Anders Jonsson and Gunilla Svingby, and the process for developing rubrics outlined in this Learning Experience, can challenge and redirect educators when creating and using rubrics for the benefit of student learning, instruction, and grading.

Goodwin, B., & Hubbell, E. (2013). The 12 touchstones of good teaching: A checklist for staying focused every day. Alexandria, Virginia: ASCD.

A chapter in this book discusses how "rubrics are an essential tool for delineating the criteria that distinguishes between novice and mastery-level work." Goodwin and Hubbell recommend guidelines that support the work of Brookhart, Wiggins, and McTighe; the research by Anders Jonsson and Gunilla Svingby; and the process for developing rubrics outlined in this Learning Experience.

Miller, Andrew K. (2013). 4 tips to get more out of rubrics. Retrieved from http://www.andrewkmiller.com/2013/04/4-tips-to-get-more-out-of-rubrics

This post originally appeared on the ASCD community blog. Andrew K. Miller is an educator and consultant. He is a national faculty member for ASCD and the Buck Institute for Education. A unique aspect of his comments is the recommendation to "collaborate with others to create rubrics". His recommendations support the work of Brookhart, Wiggins, and McTighe; the research by Anders Jonsson and Gunilla Svingby; and the process for developing rubrics outlined in this Learning Experience. The full blog precedes the Teacher Exemplar in this Learning Experience.


All references used in this Learning Experience:

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191-215. doi:10.1037/0033-295X.84.2.191

Bandura, A. (1977). Social learning theory. New York: General Learning Press.

Brookhart, S. M. (2013). How to create and use rubrics. Alexandria, Virginia: ASCD.

Brown, A. L., Bransford, J. D., Ferrara, R. A., & Campione, J. C. (1983). Learning, remembering and understanding. In Flavell, J. H., & Markman, E. M. (Eds.), Handbook of child psychology: Cognitive development (pp. 77-166). New York: Wiley.

Bruner, J. S. (1966). Toward a theory of instruction. Cambridge, Massachusetts: Belknap Press.

Cey, Thelma. (2001). Moving towards constructivist classrooms. Saskatoon, Saskatchewan. Retrieved from http://www.usask.ca/education/coursework/802papers/ceyt/ceyt.pdf

Culatta, Richard. (2013). Social learning theory (Albert Bandura). Retrieved from http://www.instructionaldesign.org/theories/social-learning.html

Goodwin, B., & Hubbell, E. (2013). The 12 touchstones of good teaching: A checklist for staying focused every day. Alexandria, Virginia: ASCD.

Jonsson, Anders, & Svingby, Gunilla. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130-144. Retrieved from www.sciencedirect.com

Lin, X. D. (1994). Metacognition: Implications for research in hypermedia-based learning environment. In Proceedings of Selected Research and Development Presentations at the 1994 National Convention of the Association for Educational Communications and Technology Sponsored by the Research and Theory Division (pp. 16-20). Nashville, TN.

Maor, D. (1999). Teachers-as-learners: The role of a multi-media professional development program in changing classroom practice. Australian Science Teachers Journal, 45(3), 45-51.

Miller, Andrew K. (2013). 4 tips to get more out of rubrics. Retrieved from http://www.andrewkmiller.com/2013/04/4-tips-to-get-more-out-of-rubrics

Murphy, E. (1997). Constructivism from philosophy to practice. Retrieved from http://www.stemnet.nf.ca/~elmurphy/cle.html

Perry, William G. (1999). Forms of ethical and intellectual development in the college years. San Francisco: Jossey-Bass Publishers.

Petraglia, Joseph. (1998). The real world on a short leash: The (mis)application of constructivism to the design of educational technology. ETR&D, 46(3), 53-65.

Piaget, Jean. (1968). Six psychological studies (Anita Tenzer, Trans.). New York: Vintage Books.

Smith, M. K. (2002). Jerome S. Bruner and the process of education. The encyclopedia of informal education. Retrieved from http://infed.org/mobi/jerome-bruner-and-the-process-of-education

Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press. Retrieved from http://www.simplypsychology.org/vygotsky.html

Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.). Alexandria, Virginia: ASCD.


Appendix 8: Peer Review Protocol for Professional Development Rubric LE

Peer Review Protocol for Professional Development Rubric Learning Experience (LE)

2-5 mins. (Start: End: )
Introduction: Facilitator, then each person speaks in turn.

15 mins. (Start: End: )
Presentation (Clarifying Questions): Presenter only speaks.

30 mins. (Start: End: )
Focus Group Reading Time with Discussion: Focus Group quietly reads (15 min.), then discusses (15 min.); presenter rests.

15 mins. (Start: End: )
Review: Reviewers speak; presenter silently takes notes.


Focus Groups:

1) Implementation of LE: How effective is this LE in guiding teachers to create valid, reliable, and student-friendly rubrics that will improve student achievement? Sections: Procedure, Appendix 2 Handouts-Exemplar, PowerPoint, Materials

2) Relevant Research: How do educational theories justify the time it takes to create valid and reliable rubrics? Sections: Appendix 7 Review of Literature

3) Turnkey Capabilities to Various Audiences (i.e., instructional levels and demographics): How well could a TLQP member use this LE as a professional development workshop or independent study? Sections: Congruency Table, Procedure, Appendix 2-Handouts, PowerPoint, Lesson Sketch


5 mins. (Start: End: )
Presenter Response: Presenter speaks; reviewers silently take notes.

10 mins. (Start: End: )
Conversation: Full Group interactive conversation.

2 mins. (Start: End: )
Reflection: Full Group silently completes the criteria reporting form and passes it to the Recorder.

Directions: For each criterion, please note specific evidence from the student work and the learning experience. Evidence noted here provides the basis for warm and cool feedback to the Presenter. You may want to divide each section into two parts to track your warm and cool ideas.

(1) Implementation of LE
Focus Question: How effective is this LE in guiding teachers to create valid, reliable, and student-friendly rubrics that will improve student achievement?
Sections: Procedure, PowerPoint, Materials

Warm:

Cool:


Roles: Facilitator, Presenter, Timer, Recorder, Reviewers

Warm feedback is a comment which identifies a strength in the Learning Experience related to its goals and/or the Criteria for Review; for example, "I see that the research supports the need for reliable and valid rubrics."

Cool feedback may be a comment or question that points to a gap in the Learning Experience in relation to its stated goals and/or the Criteria for Review; for example, “Could you tell me more about how you would modify the turnkey training for an audience of mixed grade levels?”

NOTE: Reviewers should maintain a distinct separation between warm and cool comments. For example, the type of mixed feedback that should be avoided is, "I like your procedure, but I am having difficulty seeing the connection to the INTASC Standards within the congruency table."


(2) Relevant Research
Focus Question: How do educational theories justify the time it takes to create valid and reliable rubrics?
Sections: Appendix 7 with Review of Literature

Warm:

Cool:


(3) Turnkey Capabilities to Various Audiences (i.e., instructional levels and demographics)

Focus Question: How well could a TLQP member use this LE as a professional development workshop or independent study?
Sections: Congruency Table, PowerPoint, Lesson Sketch

Warm:

Cool:
