
Statewide Practitioners’ Group Meeting
January 7, 2015

Copyright © 2015

To enhance educator effectiveness and student learning.

For the benefit of all stakeholders, including students, educators, parents, and the community.

By developing an integrated and coherent human capital system that aligns with the district mission and includes the following key features for all educators: regular, specific measurement and feedback; on-going professional development; and fair and equitable recognition and reward.

So that schools can better attract and retain high-performing educators and benefit from a workforce of teachers and administrators who are aligned in purpose, teamed in their efforts, and motivated to succeed in delivering high-quality instruction to all students.

MSFE Vision

2

SPG Meeting: 8:30am-10:30am
LEPG Training: 10:30am-2pm

SPG Agenda:

• General updates – Scott/Patricia

• SLO support – Paul Stautinger

• Performance gap reduction - Mary/Ken

• Online student growth assessments for the specialist positions - Mark Nixon

Agenda

3

AIR: Judy Ennis and Mark Nixon

Welcome and Introductions

4

District questions / requests / breaking news

• Teaching and Learning 2015 Annual Performance Report (APR)
• Measurement Inc. – Project Evaluation
• TEPG Calibration System
• MSFE Leadership 360
• MSFE Student Perception Survey
• TEPG/LEPG training – Ch. 180 requirement
• TEPG/LEPG submittal process

Before Getting Started

5

6

General Updates

Merger of the two subgroups – “Educator Preparation and Employment”

Educator Preparation and school climate

Finalizing membership

Action Planning

HCMS

7

Progress To-Date:

1. Math Coaching Project

2. Surveyed teachers, principals and district administrators

• Teachers were asked:
– To reflect on their preparation program (their level of preparation on a series of topics, and one additional area of focus for their prep program)
– To identify one area in which they feel they could have come into the classroom better prepared

• Principals were asked:
– To reflect on how prepared teachers were when they started teaching (their level of preparation on a series of topics, and one area in which new teachers could come into the classroom better prepared)
– To identify one area they wish their principal preparation program had focused on more

HCMS-Educator Preparation and Employment

8

• District administrators were asked:
– To reflect on how prepared principals were when they started (their level of preparation on a series of topics)
– To identify one area in which they feel their principals could come into the administrative team better prepared

HCMS-Educator Preparation and Employment

9

10

Progress To-Date:

Parameters of the work defined and goals developed

Three goals:
1. To provide a comprehensive library for school districts to use in assessing and improving school climate and working conditions
2. To build upon the student and teaching survey instruments within MSFE
3. To collect in-person qualitative feedback from MSFE districts on key aspects of school climate

HCMS – School Climate

Continue work on the action plans

Finalize membership

Full HCMS group meeting – March 3, 2015

HCMS – Next steps

11

Performance Evaluation and Professional Growth
TEPG / LEPG

12

On-Site Coaching Support and Virtual Instructional Leadership Development

MSFE Teacher Evaluation Calibration System (TECS)

TEPG

13

LEPG rubric revisions completed in November

Training session 1: November
• Refined rubric, introduction to goal setting

Training session 2: postponed until today
• Leader observation, artifact review

Revised 360 Survey
• Anticipated in early January – aligned to revised rubric

LEPG Companion Guide
• Anticipated in April 2015

Teachscape at-elbow support

LEPG

14

2014/15 Calendar

15

All meetings at Cross Office, Rm 103:

• Wed, October 1 – TIF 3 Statewide Practitioners’ Group: 8:30-11am; Executive Advisory Council: 11am-noon; TIF 4 Statewide Practitioners’ Group: noon-3pm
• Wed, November 5 – 9am-2pm: LEPG Training for Superintendents/Principals, Mod #1
• Wed, December 3 – 9am-2pm: LEPG Training for Superintendents/Principals, Mod #2 (rescheduled due to weather)
• Wed, January 7 – 8:30-10:30am: TIF 3 and TIF 4 Statewide Practitioners’ Group; 10:30am-2pm: LEPG Training for Superintendents/Principals, Mod #2
• Wed, February 4 – 8:30-9:30am: TIF 3 and TIF 4 Statewide Practitioners’ Group; 9:30am-2pm: LEPG Training for Superintendents/Principals, Mod #3
• Wed, March 4 – TIF 3 Statewide Practitioners’ Group: 8:30-11am; Executive Advisory Council: 11am-noon; TIF 4 Statewide Practitioners’ Group: noon-3pm
• Wed, April 1 – 9am-2pm: LEPG Training for Superintendents, Mod #4
• Wed, May 6 – TIF 3 Statewide Practitioners’ Group: 8:30-11am; Executive Advisory Council: 11am-noon; TIF 4 Statewide Practitioners’ Group: noon-3pm

SLO Support
Paul Stautinger

SLO Support

17

Final SLO Approvals
Mid-Year Check-in and Revisions
SLO Review/Audit

• Results used for:
– Oversight of SLO approval, for comparability between approvers, schools, and districts
– Refinement of training
– Refinement of SLO processes and guidelines

• TIF 3
– Review in progress (75-100% depending on district)
– Results delivered to superintendents in February
– Measures clarity and rigor in Interval of Instruction, Standards, Assessment, and Growth Target

• TIF 4
– 30% district review
– Training in planning

Performance Gap Reduction
Mary Paine and Ken Coville


Student Learning and Growth: Approaches to Measuring Teacher Effectiveness


Introduction

This slide presentation introduces emergent thinking on a method of rating teachers on the student learning and growth component of a performance evaluation and professional growth (PEPG) system. The "Performance-Gap-Reduction" (PGR) method offers a distinctive approach both to targeting and measuring student growth and to rating teacher impact on that growth. This resource supports districts in understanding both the PGR method and the more commonly used Percent-Met method. The Maine DOE welcomes input and feedback from districts that decide to use either of the methods described in this presentation.

20


Method of Scoring Student Learning and Growth Measures to Determine Teacher Rating

The following slides compare two different methods of measuring student growth and determining a teacher's impact on that growth:

• The Percent-Met Method
• The Performance-Gap-Reduction Method


Percent-Met Method Rating Scale*

Percentage of Students Who Met Their Growth Targets → Teacher Impact:
• 85–100%: High
• 71–84%: Moderate
• 41–70%: Low
• 0–40%: Negligible

Total of the % of all growth targets met ÷ number of SLOs = Average % of students who met the growth target

Impact on Student Learning and Growth Rating

*This impact scale is used in the Maine DOE Teacher Performance Evaluation and Professional Growth Model, which also uses an SLO frame. The scale's design represents the widely used method of measuring student growth and rating teacher impact on that growth. In some uses of this method, the rating categories are numeric (e.g., 85–100% = 3.51–4.00 points).

The rating is based on the number of students who meet a growth target, which is typically set by the teacher.

22


Steps in the Percent-Met Method

Step 1: Pre-assess; score
Step 2: Teacher sets a growth target for the cohort, using one of multiple approaches (see slide 34)
Step 3: Post-assess; score
Step 4: Determine how many students met the growth target set for the cohort
Step 5: Determine the teacher's impact rating on the % Met Impact Scale

23
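To make the arithmetic concrete, here is a minimal sketch of the Percent-Met computation in Python. The data are Teacher 1's scores from the comparison on the following slides; the uniform growth target of 6 points and all function names are our illustrative assumptions, not part of the Maine DOE model.

```python
# Minimal sketch of the Percent-Met method (illustrative names and data).
def percent_met(pre_scores, post_scores, growth_target):
    """Return the percentage of students whose gain met the growth target."""
    met = sum(1 for pre, post in zip(pre_scores, post_scores)
              if post - pre >= growth_target)
    return 100.0 * met / len(pre_scores)

def percent_met_rating(pct):
    """Map a percent-met value onto the impact scale shown above."""
    if pct >= 85:
        return "High"
    if pct >= 71:
        return "Moderate"
    if pct >= 41:
        return "Low"
    return "Negligible"

# Hypothetical cohort (Teacher 1 in the slides that follow), uniform target of 6.
# With multiple SLOs, the slide's formula averages the per-SLO percentages.
pre = [150, 170, 175, 180, 190, 195]
post = [157, 176, 163, 187, 186, 203]
pct = percent_met(pre, post, growth_target=6)
print(f"{pct:.0f}% met -> {percent_met_rating(pct)}")  # 67% met -> Low
```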


A Closer Look at the Percent-Met Method

The following slides illustrate possible outcomes of the Percent-Met method.


Comparing Two Cohorts

Student | Teacher 1 Pre/Post | Met? | Growth | Teacher 2 Pre/Post | Met? | Growth
A | 150/157 | y | 7 | 150/162 | y | 12
B | 170/176 | y | 6 | 170/189 | y | 19
C | 175/163 | n | -12 | 175/180 | n | 5
D | 180/187 | y | 7 | 180/194 | y | 14
E | 190/186 | n | -4 | 190/193 | n | 3
F | 195/203 | y | 8 | 195/213 | y | 18
% Met Growth Target | 4 of 6 (66%) | | | 4 of 6 (66%) | |

Two like teachers. Illustration based on use of individual growth targets (GTs) converted to a mean GT of 6.

Same number of students meet the growth target.

25


Percent-Met Rating Scale

Percentage of Students Who Met Their Growth Targets → Teacher Impact:
• 85–100%: High
• 71–84%: Moderate
• 41–70%: Low
• 0–40%: Negligible

Total of the % of all growth targets met ÷ number of SLOs = Average % of students who met the growth target

Impact on Student Learning and Growth Rating

Teacher 1 and Teacher 2

Same rating on Percent-Met Scale

26


Actual Growth

Student | Teacher 1 Pre/Post | Met? | Growth | Teacher 2 Pre/Post | Met? | Growth
A | 150/157 | y | 7 | 150/162 | y | 12
B | 170/176 | y | 6 | 170/189 | y | 19
C | 175/163 | n | -12 | 175/180 | n | 5
D | 180/187 | y | 7 | 180/194 | y | 14
E | 190/186 | n | -4 | 190/193 | n | 3
F | 195/203 | y | 8 | 195/213 | y | 18
Totals | 4 met | | 12 | 4 met | | 71
% Met Growth Target | 4 of 6 (66%) | | | 4 of 6 (66%) | |
Mean Growth | 12 ÷ 6 = 2.00 | | | 71 ÷ 6 = 11.83 | |

A different amount of actual growth occurs.

27
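As a quick, self-contained check of the comparison above, the following sketch reproduces both results from the slide's data: an identical percent-met outcome alongside very different mean growth (variable names are ours).

```python
# Both cohorts meet the same number of targets (mean GT of 6), yet their
# mean growth differs sharply. Pre/post pairs are taken from the slide.
teacher1 = [(150, 157), (170, 176), (175, 163), (180, 187), (190, 186), (195, 203)]
teacher2 = [(150, 162), (170, 189), (175, 180), (180, 194), (190, 193), (195, 213)]

for name, cohort in [("Teacher 1", teacher1), ("Teacher 2", teacher2)]:
    gains = [post - pre for pre, post in cohort]
    met = sum(g >= 6 for g in gains)
    print(f"{name}: met {met} of {len(gains)}, "
          f"mean growth {sum(gains) / len(gains):.2f}")
# Teacher 1: met 4 of 6, mean growth 2.00
# Teacher 2: met 4 of 6, mean growth 11.83
```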


And the results can be even more dramatic

Student | Teacher 1 Pre/Post | Met? | Growth | Teacher 2 Pre/Post | Met? | Growth
A | 150/157 | y | 7 | 150/164 | y | 14
B | 170/176 | y | 6 | 170/189 | y | 19
C | 175/163 | n | -12 | 175/180 | n | 5
D | 180/187 | y | 7 | 180/194 | y | 14
E | 190/196 | y | 6 | 190/195 | n | 5
F | 195/203 | y | 8 | 195/213 | y | 18
Totals | 5 met | | 22 | 4 met | | 75
% Met Growth Target | 5 of 6 (83%) | | | 4 of 6 (66%) | |
Mean Growth | 22 ÷ 6 = 3.66 | | | 75 ÷ 6 = 12.50 | |

Again, a different amount of actual growth occurs.

28


Percent-Met Rating Scale

Percentage of Students Who Met Their Growth Targets → Teacher Impact:
• 85–100%: High
• 71–84%: Moderate
• 41–70%: Low
• 0–40%: Negligible

Total of the % of all growth targets met ÷ number of SLOs = Average % of students who met the growth target

Impact on Student Learning and Growth Rating

Teacher 1 vs. Teacher 2

Teacher 1 is rated as having greater growth impact than Teacher 2, even though Teacher 2's instructional cohort shows more than three times the mean growth of Teacher 1's cohort.

29


Summary of A Closer Look at the Percent-Met Method

The Percent-Met method of arriving at a teacher's Student Learning and Growth rating uses a binary (yes or no) target that does not account for all of the growth attained (or not attained) by students in a cohort.

When all other factors are equal, the Percent-Met method cannot distinguish between two teachers whose cohorts show significantly different actual growth.

When all other factors are equal, the Percent-Met method can result in teachers whose instructional cohorts show lower actual growth being rated higher than teachers whose cohorts show higher actual growth.

30


Performance-Gap-Reduction (PGR) Method


Steps in the PGR Method

Step 1: Pre-assess; score (NOTE: The PGR method does not require teachers to set a growth target for a cohort.)
Step 2: Calculate the mean performance gap among students
Step 3: Post-assess; score
Step 4: Calculate the mean growth among students
Step 5: Calculate the % mean performance gap reduction
Step 6: Determine the teacher's impact rating on the PGR Impact Scale

32
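For reference, here is a compact sketch of Steps 2, 4, and 5 in Python, using the ten-student example worked through on the following slides. The variable names are ours; the arithmetic matches the slides.

```python
# PGR sketch: mean performance gap, mean growth, and percent gap reduction.
MAX_SCORE = 250  # total possible points on the sample assessment
pre  = [95, 86, 222, 37, 103, 214, 230, 78, 87, 200]
post = [194, 167, 236, 135, 171, 231, 240, 162, 193, 229]

gaps = [MAX_SCORE - p for p in pre]         # Step 2: per-student gaps
mean_gap = sum(gaps) / len(gaps)            # 1,148 / 10 = 114.8
gains = [b - a for a, b in zip(pre, post)]  # Step 4: per-student growth
mean_growth = sum(gains) / len(gains)       # 606 / 10 = 60.6
pgr = 100.0 * mean_growth / mean_gap        # Step 5: about 53%
print(f"mean gap {mean_gap}, mean growth {mean_growth}, PGR {pgr:.0f}%")
```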


Step 1: Pre-Assessment

Student | Max Score Possible | Pre-Assessment Score
A | 250 | 95
B | 250 | 86
C | 250 | 222
D | 250 | 37
E | 250 | 103
F | 250 | 214
G | 250 | 230
H | 250 | 78
I | 250 | 87
J | 250 | 200

Assessment: The comprehensive assessment in our sample has a total of 250 possible points.

33


Step 2: Calculate Mean Performance Gap

Student | Max Score Possible | Pre-Assessment Score | Performance Gap
A | 250 | 95 | 155
B | 250 | 86 | 164
C | 250 | 222 | 28
D | 250 | 37 | 213
E | 250 | 103 | 147
F | 250 | 214 | 36
G | 250 | 230 | 20
H | 250 | 78 | 172
I | 250 | 87 | 163
J | 250 | 200 | 50

Mean Performance Gap: 1,148 ÷ 10 = 114.8

34


Step 3: Post-assess

Student | Max Score Possible | Pre-Assessment Score | Performance Gap | Post-Assessment Score
A | 250 | 95 | 155 | 194
B | 250 | 86 | 164 | 167
C | 250 | 222 | 28 | 236
D | 250 | 37 | 213 | 135
E | 250 | 103 | 147 | 171
F | 250 | 214 | 36 | 231
G | 250 | 230 | 20 | 240
H | 250 | 78 | 172 | 162
I | 250 | 87 | 163 | 193
J | 250 | 200 | 50 | 229

Mean Performance Gap: 1,148 ÷ 10 = 114.8

35


Step 4: Calculate Mean Growth

Student | Max Score Possible | Pre-Assessment Score | Performance Gap | Post-Assessment Score | Growth Gain
A | 250 | 95 | 155 | 194 | 99
B | 250 | 86 | 164 | 167 | 81
C | 250 | 222 | 28 | 236 | 14
D | 250 | 37 | 213 | 135 | 98
E | 250 | 103 | 147 | 171 | 68
F | 250 | 214 | 36 | 231 | 17
G | 250 | 230 | 20 | 240 | 10
H | 250 | 78 | 172 | 162 | 84
I | 250 | 87 | 163 | 193 | 106
J | 250 | 200 | 50 | 229 | 29

Mean Performance Gap: 1,148 ÷ 10 = 114.8
Mean Growth: 606 ÷ 10 = 60.6

36


Step 5: Calculate Percent Performance Gap Reduction (PGR)

Student | Max Score Possible | Pre-Assessment Score | Performance Gap | Post-Assessment Score | Growth Gain
A | 250 | 95 | 155 | 194 | 99
B | 250 | 86 | 164 | 167 | 81
C | 250 | 222 | 28 | 236 | 14
D | 250 | 37 | 213 | 135 | 98
E | 250 | 103 | 147 | 171 | 68
F | 250 | 214 | 36 | 231 | 17
G | 250 | 230 | 20 | 240 | 10
H | 250 | 78 | 172 | 162 | 84
I | 250 | 87 | 163 | 193 | 106
J | 250 | 200 | 50 | 229 | 29

Mean Performance Gap: 1,148 ÷ 10 = 114.8
Mean Growth: 606 ÷ 10 = 60.6
% Performance Gap Reduction: 60.6 ÷ 114.8 ≈ 53%

37


Step 6: Determine Rating on PGR Impact Scale

PGR Impact Scale:
• High: Mean growth index reduces mean performance gap by at least 75%
• Moderate: Mean growth index reduces mean performance gap by at least 50%
• Low: Mean growth index reduces mean performance gap by at least 25%
• Negligible: Mean growth index reduces mean performance gap by less than 25%

Multiple measures of Student Learning and Growth may be combined through equal or weighted values, but collective measures may not be weighted more than 25% of the total.

Impact on Student Learning and Growth Rating

38
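Applied mechanically, the scale above turns the worked example's 53% gap reduction into a Moderate rating. A minimal sketch (the function name is ours; the cut points are from the slide):

```python
def pgr_rating(pgr_percent):
    """Map a percent performance-gap reduction onto the PGR Impact Scale."""
    if pgr_percent >= 75:
        return "High"
    if pgr_percent >= 50:
        return "Moderate"
    if pgr_percent >= 25:
        return "Low"
    return "Negligible"

print(pgr_rating(53))  # Moderate -- the worked example's result
```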


Summary of the PGR Scale Analysis

A Performance Gap Reduction scale:

• Uses all of the growth demonstrated by students in a cohort
• Eliminates the variability in quality and rigor of growth targets set by individual teachers
• Makes room for a greater focus, in training programs, on the quality of content standards, instruction, and assessments
• Preserves data on individual students by using growth gains to arrive at the performance gap reduction
• Provides for equity and comparability in establishing teacher impact ratings for instructional cohorts with low, high, or widely varying pre-assessment scores

39


Frequently Asked Questions about the PGR Scale


FAQ 1

Question: We are intuitively uncomfortable with eliminating student growth targets. Can we use the PGR Rating scale along with student growth targets?

Answer: The PGR method does not eliminate student growth targets. Rather, it sets a continuum of growth ranging from zero growth for zero students to 100% of students achieving the maximum attainable growth. Within that continuum, teachers should base their instruction on the identified needs of students and on articulated learning goals for improvement. This goal-oriented focus of instruction is clearly called for in the standards of every instructional practice framework approved by the Maine DOE for PEPG systems, and it is integral to the SLO process (for districts that choose to use SLOs).

41


FAQ 5

Question: Can the PGR scale be used with the NWEA?

Answer: The NWEA Conditional Growth Index Calculator provides a mean growth result for a cohort, expressed as a mean z-score called the Conditional Growth Index (CGI). NWEA CGI scores can easily be converted to a rating on the PGR Scale.

A video explaining the calculation of the CGI score can be viewed here:

https://nwea.adobeconnect.com/_a203290506/cgicalculator/

A modified PGR impact scale with CGI Scores is shown on the next slide.

42


PGR Impact Scale With NWEA CGI Results

• High: Mean growth index reduces mean performance gap by at least 75%; NWEA mean Conditional Growth Index of at least 0.5 (69th growth percentile)
• Moderate: Mean growth index reduces mean performance gap by at least 50%; NWEA mean Conditional Growth Index of at least 0.0 (50th growth percentile)
• Low: Mean growth index reduces mean performance gap by at least 25%; NWEA mean Conditional Growth Index of at least -0.5 (31st growth percentile)
• Negligible: Mean growth index reduces mean performance gap by less than 25%; NWEA mean Conditional Growth Index of at least -1.0 (16th growth percentile)

Multiple measures of Student Learning and Growth may be combined through equal or weighted values, but collective measures may not be weighted more than 25% of the total.

Impact on Student Learning and Growth Rating

43
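A sketch of the CGI-to-rating conversion implied by this slide. The cut points are as listed; we read the bands as "highest applicable," so everything below the Low band is treated as Negligible here, which is our interpretation.

```python
def cgi_to_pgr_rating(cgi):
    """Convert an NWEA mean Conditional Growth Index (a z-score) into a PGR
    Impact Scale rating, using the cut points listed on the slide."""
    if cgi >= 0.5:        # 69th growth percentile or above
        return "High"
    if cgi >= 0.0:        # 50th growth percentile or above
        return "Moderate"
    if cgi >= -0.5:       # 31st growth percentile or above
        return "Low"
    return "Negligible"   # our reading: everything below the Low band

print(cgi_to_pgr_rating(0.2))  # Moderate
```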


FAQ 8

Question: What are the implications of the PGR method for the student learning objective (SLO) process?

Answer: The PGR method provides a uniquely stable standardization of growth targets across teachers and content areas. This allows for greater attention to the selection and approval of the content standards, the assessments, and the instructional plan articulated in the SLO.

44


Contributors

Maine Department of Education—Mary Paine, Educator Effectiveness Coordinator; Anita Bernhardt, Director, Standards and Instructional Supports
RSU 74—Ken Coville, Superintendent
Maine Schools for Excellence—Scott Harrison, TIF 3 and TIF 4 Project Director; Sue Williams, TIF 3 Professional Development Coordinator; Jane Blais, TIF 4 Professional Development Coordinator; Deb Lajoie, TIF 3 and TIF 4 Project Coordinator

A special thanks to the following for contributing technical expertise.

BST Educational Consulting—Paul Stautinger, Consultant
Community Training and Assistance Center—Scott Reynolds, Senior Associate, National School Reform
American Institutes for Research—Mariann Lemke, Managing Researcher

45

Online Student Growth Assessments for Specialist Positions
Mark Nixon

Leader Evaluation and Professional Growth, Session 2
Evidence, Feedback, and Growth

49

Welcome and Introductions

50

By the end of the day, you should:

Be more familiar and comfortable with the LEPG Rubric.

Be able to identify a high-quality, aligned artifact of practice.

Understand how to conduct and score a leader instructional feedback observation.

Be able to use item sets to support the 360-degree survey.

Outcomes

Session 1. Expectations and Goal Setting (November 2014)

Session 2. Evidence, Feedback, and Growth (today)

Session 3. Reflection, Rating, and Planning (February 2015)

Session 4. Summative Scoring and Feedback (April 2015, superintendents only)

LEPG Sessions Overview

51

360-Degree Survey

Artifact Review

Lunch

Instructional Feedback Observation

Wrap-Up and Next Steps

Agenda

52

53

360-Degree Survey

Changes to the LEPG rubric prompted changes to the 360-degree survey to ensure better alignment. The following activities were completed between the end of November and today:

1. Reviewed alignment between the current survey and the revised LEPG.

2. Drafted approximately 200 items that align to the revised LEPG core propositions, indicators, and levels of performance.

3. Created a small item bank, which includes additional items.

This work is intended to provide MSFE superintendents with options for creating tailored 360-degree surveys.

360-Degree Survey

54

Due to the cumulative design of the LEPG rubric, we faced several item design challenges. After consideration, we opted for the following:

1. Item alignment. We chose to write items that represent performance levels and indicators. Doing so will allow superintendents to see the distribution of responses per performance level.

2. Scale. We opted for a single, 4-level Likert scale ranging from strongly disagree to strongly agree, in order to simplify teacher response.

3. Item design. We aimed to write low-inference, behaviorally focused items that are free of jargon that may not be accessible to all.

360-Degree Survey

55

360-Degree Survey

56

[Chart: average teacher/staff rating for Core Proposition 1, Indicator 2, by performance level (Ineffective, Developing, Effective, Distinguished), on a 0–5 scale.]

With this approach, superintendents and principals receive information on the average rating for each performance level, which describes the distribution of teacher/staff ratings across the performance continuum.
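As an illustration of that reporting idea, here is a minimal sketch: average the 4-level Likert responses for the items written at each performance level of one indicator. All data and names here are hypothetical.

```python
# Hypothetical 4-level Likert responses (1 = strongly disagree,
# 4 = strongly agree) to items aligned to each performance level of one
# indicator; the per-level averages describe how staff ratings distribute
# across the performance continuum.
responses = {
    "Ineffective":   [4, 4, 3, 4],
    "Developing":    [3, 4, 3, 3],
    "Effective":     [3, 3, 2, 3],
    "Distinguished": [2, 2, 1, 2],
}
for level, scores in responses.items():
    print(f"{level}: average {sum(scores) / len(scores):.2f}")
```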

360-Degree Survey

57

Next steps:

1. Review and comment. We welcome your review of and comment on the item set, and discussion of our design points.

2. Support. We can offer support to create a manageable survey that gathers enough information to make decisions about performance.

3. Content and construct validation. Should the districts move toward use of a common survey, we can assist MSFE in validating it.

58

Artifact Review

59

An artifact review:
• Is a systematic and objective review of artifacts against a scoring rubric for the purpose of determining a leader's performance level
• Provides evidence for standard indicators that may not be addressed by other measures
• Should include common and unique examples, with explanations

An artifact review is not:
• A portfolio of accomplishments or interviews
• A random collection of accomplishments

Artifact Review

60

What are artifacts?

Artifacts provide evidence of nonobservable aspects of educator practice, including planning and professional responsibilities.

Artifacts are naturally produced through a leader’s daily routines and show evidence of leader performance as it relates to the standard indicators within the LEPG Rubric.

High-density artifacts, which address multiple Core Propositions and Standard Indicators, are preferred.

Artifact Review

61

The LEPG Artifact Submission Form

Artifact Review

62

Artifact brainstorm:

Step 1: Each table is assigned one or two Core Propositions.
Step 2: Brainstorm possible artifacts aligned to the assigned Core Propositions:
• How common are the artifacts?
• Will all leaders produce the artifacts?
• How standardized are the artifacts?
• Will you see differences in leadership performance?
Step 3: Chart your artifact list (one chart per Core Proposition).
Step 4: Participate in a gallery walk.

Artifact Review

63

Lunch

Instructional Feedback Observation

64

Reflect on last year's leader observations:
• What went well?
• What tips or tricks might you suggest to your peers to make this process go smoothly?
• What were your biggest challenges?

Be prepared to share out.

Warm-Up Discussion

65

Supports quality implementation of the Teacher Evaluation and Professional Growth (TEPG) system

Done well, feedback motivates teachers to improve

Provides actionable feedback for teachers and leaders

Measuring Practice: Leader Observation

66

Observation is:
• Transparent
• Accurate
• Combined with other measures

Multiple observations result in a score.

Measuring Practice: Leader Observation

67

Open the Instructional Feedback Observation Protocol and Toolkit

Measuring Practice: Leader Observation

68

See Toolkit, page 3

Domain | What it measures | Evidence
1. Evidence use | Centers conversation on observation evidence; accurately aligns data to TEPG | Observation
2. Professional interactions | Focused attention; appropriate communication; depersonalized comments | Observation
3. Differentiated questions | Asks reflective questions | Observation
4. Leading conversations | Sets goals for the meeting; paces conversation; defines next steps with the teacher | Observation
5. Written feedback | Completes forms; connects to teacher framework | Document review

Measuring Practice: Leader Observation

69

[Diagram: the evidence-gathering cycle – Prepare (and schedule) → Collect → Analyze → Plan.]

Leader observations must be scheduled to correspond to teacher postobservation conferences.

Observation evidence can be collected by scripting or video.

To minimize the number of required meetings, observation results may be shared at the midcourse conference.

Measuring Practice: Leader Observation

70

What is inside the Protocol?

Take 15 minutes to read pages 2 through 10 of the Protocol (stop at 5. Rate and Plan).

Meet in small groups:

• What was interesting or surprising?

• What do you need more information about?

• What points are important to highlight?

Report out.

Measuring Practice: Leader Observation

71

What is inside the Toolkit?

Take 10 minutes to read pages 3 through 6 of the Toolkit. Note the performance progressions or look-fors.

Meet in small groups:• What was interesting or surprising?

• What do you need more information about?

• What points are important to highlight?

Report out

Measuring Practice: Leader Observation

72

If you are taking written notes:

Verbatim notes are best.

Focus on leader and teacher speech.

Make postobservation notes to help you recall important incidents or interactions.

Analyze soon after completing the notes.

Measuring Practice: Leader Observation

73

If you are video recording:

Set the camera up to view the teacher and leader.

Check the sound quality.

Review the video, and make notes.

Analyze soon after completing the notes.

Quality evidence means writing detailed notes or video recording.

Note: Professional development should include establishing interrater agreement by reviewing multiple videos.

Activity: You Are the Observer

Make a T-chart on your computer; use the form in the Toolkit, or use your preferred note-taking method.

Watch the video of the principal providing feedback to the teacher.

Rate performance according to the Observation Rubric (pp. 4–6 of Toolkit).

Discuss your overall impressions of the feedback session at your table:
• What does the principal do well?

• How could the principal improve?

Share out key findings with the larger group.

Measuring Practice: Leader Observation

74

http://tpep-wa.org/trainingpd/pre-and-post-observation-examples/

Measuring Practice: Leader Observation

75

Let’s try it again, now that we’ve practiced once together.

When combined with other evidence, observation results will assist supervisors in evaluating leader effectiveness on Core Proposition 4: Teaching and Learning, specifically:

Standard Indicator 4.3 Supporting Instructional Practice: The leader supports improvement of teacher practice through evidence-based, actionable feedback and access to quality professional development.

Scoring the Observation: LEPG Rubric Alignment

76

Step 1. Average the two final observation scores to create a single, final observation score, or treat the fall observation as formative and the spring observation as summative.

Step 2. Use Table 4. LEPG Scoring Alignment (see Protocol, p. 11), to select an initial, preliminary score for Standard Indicator 4.3.

Step 3. Analyze other applicable evidence (e.g., 360 survey or artifacts), and adjust preliminary score to reflect this evidence.

Scoring the Observation: Recommended Scoring Process

77
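A sketch of this process in code. Table 4's actual alignment bands are not reproduced here; the cut points below are placeholders of our own, and the Step 3 adjustment is left as a manual input.

```python
# Hypothetical sketch of the recommended scoring flow. The real alignment
# comes from Table 4 of the Protocol (p. 11); these bands are placeholders.
def indicator_4_3_score(fall_obs, spring_obs, adjustment=0):
    avg = (fall_obs + spring_obs) / 2   # Step 1: average the two observations
    if avg >= 3.5:                      # Step 2: placeholder alignment bands
        prelim = 4
    elif avg >= 2.5:
        prelim = 3
    elif avg >= 1.5:
        prelim = 2
    else:
        prelim = 1
    return prelim + adjustment          # Step 3: adjust for other evidence

print(indicator_4_3_score(3.0, 3.4))    # -> 3 under these placeholder bands
```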

What went well about this activity?

What could be improved?

Debrief

78

79

Wrap-Up and Next Steps

LEPG and TEPG are intimately connected and interdependent.

LEPG builds upon best practices in which you are already engaged.

Evaluation is a learning process: it will evolve over time.

We are here to support you; tell us how we can help.

Things to Remember

80

LEPG Implementation

What is your biggest challenge or concern about conducting the observations or compiling and reviewing artifacts?

What additional supports do you need?

LEPG Training

What worked well today?

What needs to be different for next time?

Reflection

81

Session 3, February 2015

Reflection, Rating, and Planning

Supports the midcourse and end-of-year conferences

Next Steps

82

