SPRING DATA REVIEW
MAY 20, 2014
TODAY'S PRESENTATION HAS BEEN ADAPTED FROM THE FOLLOWING RESOURCES. THANK YOU!
• MIBLSI
• HURON ISD
• INGHAM ISD
• FLORIDA RTI PROJECT
• IL SPIRE
• IES PRACTICE GUIDES (MATH/READING)
• RTI INNOVATION CONFERENCES
LEADERSHIP TEAMS… PURPOSE
DURING THE SPRING DATA REVIEW, LEADERSHIP TEAMS WILL HAVE AN OPPORTUNITY TO REVIEW THE STATUS OF LITERACY, MATHEMATICS AND/OR BEHAVIOR SYSTEMS
AND WILL MAKE ACTION PLANS TO CONTINUE TO IMPROVE STUDENT ACHIEVEMENT
MATERIALS YOU WILL NEED TODAY
• DATA (FALL/WINTER) REVIEW PROBLEM SOLVING GUIDE (ELECTRONIC OR HANDOUT)
• WORKED EXAMPLE OF DATA FALL REVIEW PROBLEM SOLVING GUIDE (ELECTRONIC OR HANDOUT)
• SIP INFORMATION, SWIS, DIBELS, AIMSWEB REPORTS & TIER 2-3 TRACKING FORMS
• LAST/THIS YEAR'S ASSESSMENT BINDER
• BOQ/SAS
• PET-R/SWEPT
• ACTION PLANS & COMMUNICATION PLAN
LEARNING TARGETS
WE CAN ARTICULATE…
THE PURPOSE AND VALUE OF A BUILDING/DISTRICT DATA REVIEW.
HOW STUDENT OUTCOMES DATA CAN INFORM THE BUILDING/DISTRICT ABOUT STUDENT PERFORMANCE.
HOW PROCESS DATA CAN INFORM THE BUILDING/DISTRICT ABOUT STUDENT PERFORMANCE.
HOW STUDENT OUTCOME DATA AND PROCESS DATA CAN BE USED TOGETHER TO INFORM BUILDING/DISTRICT GOALS.
THE PURPOSE AND STEPS IN A PROBLEM SOLVING PROCESS (GATHER DATA, IDENTIFY AND ANALYZE PROBLEM(S), DEVELOP AN ACTION PLAN, EVALUATE THE PLAN).
AGENDA
WELCOME
DATA OVERVIEW
SCHOOL IMPROVEMENT CONNECTIONS
TEAM TIME
LUNCH
TEAM TIME
REJOIN AT 2:45 P.M.
EXIT SLIP & EVALUATION
TEAM ROLES
FACILITATOR
GUIDE THE DISCUSSION
KEEP THE TEAM FOCUSED
ENSURE THE TASKS ON THE EXIT SLIP ARE COMPLETED
TIME KEEPER
MAKE SURE THE TEAM IS MOVING THROUGH THE PROCESS EFFICIENTLY
HAVE TEAM GATHER AT 2:45 P.M.
ASSIST MANAGER
MAKE ANY MODIFICATIONS TO SIP/DIP
ACTION PLAN RECORDER
FILL OUT PROBLEM SOLVING GUIDE, AT A GLANCE OVERVIEWS
RECORD ANY “TO DO’S” GENERATED BY THE TEAM
SEND COMPLETED ACTION PLAN TO TEAM MEMBERS
EXIT CHECKLIST: SPRING 2014 BUILDING DATA REVIEW
Please have your time-keeper check off each task when completed using the table below. Turn this form in to the MTSS Facilitator at the end of the day, verifying that your completed Data Review Guide was sent to: [email protected]
Date:________________ Building:_______________________ District:____________________________

Check when complete:
• Problem Solving Guide: Literacy
• Problem Solving Guide: Mathematics
• Data Review Problem Solving Guide (Large Packet)
• Special Education Tracking Form (Coaches)
• Action Plan (Appendix C)
• Communication Plan (Appendix D)
• Building Summary Report for District Data Review (Appendix E). This item is critical because it will inform future district-level Data Reviews.
• Completed Data Review Guide e-mailed to Rebecca Buxton
DATA SETS & PURPOSE: SRESD

Trend & Outcome Data (Building or District level):
• Grade Level (DN, AW, IE)
• Sub-group (DN, AW, IE)
• Explore (IE) or ACT
• MEAP (IE or MISchoolData.org)
• AIMSweb/DIBELS Next Benchmark
• AIMSweb ROI by Measure for Reading & Math at Building Level (P, IE)
• Summary of Effectiveness (DN, P)
• SWIS/Behavior, SRSS
• Suspensions/Expulsions/Attendance (SWIS or PowerSchool: P)
• Local Assessments (P, IE)

Process Data:
• PET-R (P) / SWEPT-R (P)
• SAS
• BAT
• BoQ
• PET-M (P); PET-M Middle School (P)
• MTSS-SA (P)
• PBIS/Reading/Math TICs
• SET
• ASSIST

Key: AW = AIMSweb; DN = DIBELS Next; IE = Illuminate Ed; P = Team Provided
SCHOOLWIDE OVERVIEW: ACADEMICS
Where to find the academic data!

Process Data (Reading/Math/Behavior Systems):
• PET-R (≥80%)
• PET-M (≥80%)
• BAT (≥70%)
• MTSS-SA
• SAS (≥70%)
• BoQ (>70%)

Student Outcome Data: Reading (goal ≥80%):
• K: Composite; PA (ISF, PSF); AP (LNF, NWF); SOE W(1, 2, 3) S(1, 2, 3)
• 1: Composite; PA (PSF); AP (NWF); FL (R-CBM WC, R-CBM Accuracy); SOE W(1, 2, 3) S(1, 2, 3)
• 2: Composite; FL (R-CBM WC, R-CBM Accuracy); SOE W(1, 2, 3) S(1, 2, 3)
• 3: Composite; FL (R-CBM WC, R-CBM Accuracy); DAZE/MAZE; MEAP trending up last 3 years in a row; SOE W(1, 2, 3) S(1, 2, 3)
• 4: Composite; FL (R-CBM WC, R-CBM Accuracy); DAZE/MAZE; MEAP trending up; SOE W(1, 2, 3) S(1, 2, 3)
• 5: Composite; FL (R-CBM WC, R-CBM Accuracy); DAZE/MAZE; MEAP trending up; SOE W(1, 2, 3) S(1, 2, 3)
• 6: Composite; FL (R-CBM WC, R-CBM Accuracy); DAZE/MAZE; MEAP trending up; SOE W(1, 2, 3) S(1, 2, 3)
• Subgroup Performance

Student Outcome Data: Math (goal ≥80%):
• K: OCM, NIM, QDM, MNM
• 1: OCM, NIM, QDM, MNM
• 2: M-CAP, M-COMP
• 3: M-CAP, M-COMP
• 4: M-CAP, M-COMP; MEAP trending up
• 5: M-CAP, M-COMP
• Subgroup Performance
SCHOOLWIDE OVERVIEW: BEHAVIOR
PROCESS DATA - BEHAVIOR
Benchmarks of Quality (BoQ)
• Completed annually by school leadership teams
• Tier 1 SWPBIS implementation fidelity check
• 53 benchmarks across 10 critical elements of implementation.
• Identifies areas of strength and need; informs problem analysis and action planning.
• 70% Implementation Goal
Self-Assessment Survey (SAS)
• Completed annually by building staff
• Fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual students
• Seven key elements of the Implementation Subsystems
• Informs of areas of strength and need, including communication between leadership team and staff
• 70% Implementation Goal
pbisapps.org
SCHOOLWIDE OVERVIEW- BEHAVIOR
EARLY WARNING SIGNS
EARLY WARNING SIGNS (EWS)
• ROUTINELY AVAILABLE DATA; AVAILABLE EARLY IN THE SCHOOL YEAR
• BETTER PREDICTOR THAN BACKGROUND CHARACTERISTICS
• CUT POINTS SELECTED TO BALANCE YIELD AND ACCURACY.
• HELPS TARGET INTERVENTIONS
• INFORMS OF PATTERNS AND TRENDS
EARLY WARNING SIGNS (EWS)
ATTENDANCE: MISSING MORE THAN 10% OF INSTRUCTIONAL TIME
BEHAVIOR: SUSPENSIONS (ISS OR OSS); MINOR OR MAJOR ODRS
• ISS OR OSS: 6 HOURS OF ACADEMIC INSTRUCTION LOST PER DAY
• ODR: 20 MINUTES OF ACADEMIC INSTRUCTION LOST FOR STUDENT PER REFERRAL
COURSE PERFORMANCE: COURSE FAILURES, GRADE POINT AVERAGE; CREDIT ACCRUAL
• COMBINATIONS OF ACADEMIC INDICATORS CAN REDUCE GRADUATION LIKELIHOOD TO 55%
EWS OUTCOME DATA: BUILDING LEVEL
ATTENDANCE: > 90% OF STUDENTS MISSING NO MORE THAN 10% OF INSTRUCTIONAL TIME
• STATE OF OHIO RETROSPECTIVE ANALYSIS OF TOP/BOTTOM 10% ACADEMIC OUTCOMES
• BALANCES YIELD VS. ACCURACY
BEHAVIOR: > 80% WITH 0 SUSPENSIONS (ISS OR OSS)
• "HIGH QUALITY INSTRUCTION" RESEARCH
• MTSS TARGETED INTERVENTION
COURSE PERFORMANCE: ACT-EXPLORE DATA
• COURSE FAILURES (MTSS MODEL OF 80% CORRECTED FOR ACCURACY TO 85-90%)
• CREDIT ACCRUAL IS BUILDING-SPECIFIC
• COMBINATIONS OF ACADEMIC INDICATORS CAN REDUCE GRADUATION LIKELIHOOD TO 55%
Schoolwide Overview - Behavior: Worked Example
EXAMPLE PROCESS DATA SNAPSHOTS: ACADEMICS & BEHAVIOR
WHAT ARE THEY GOOD FOR?
WE CAN USE PET/SWEPT TO ANSWER THE FOLLOWING QUESTIONS:
WE HAVE LESS THAN 80% OF OUR STUDENTS AT BENCHMARK. WHAT MIGHT BE HAPPENING IN OUR CORE INSTRUCTION THAT MIGHT BE CONTRIBUTING TO THAT? WHAT CAN WE DO TO IMPROVE?
RESEARCH INDICATES SPECIFIC AREAS THAT WILL IMPROVE OUTCOMES. THESE INCLUDE GOALS/OBJECTIVES/PRIORITIES, ASSESSMENT, INSTRUCTIONAL PRACTICES AND MATERIALS, INSTRUCTIONAL TIME, DIFFERENTIATED INSTRUCTION/GROUPING, ADMINISTRATION/ORGANIZATION/COMMUNICATION, PROFESSIONAL DEVELOPMENT.
DOES OUR MTSS SYSTEM HAVE THE MOST EFFECTIVE STEPS IN PLACE?
BARRIERS?
THE PET/SWEPT/BOQ/SAS ARE GOOD GUIDES FOR DESIGNING AND IMPROVING A BUILDING MTSS SYSTEM. HOWEVER, THEY CAN ALSO PRESENT MANY PERCEIVED BARRIERS TO WORK AROUND.
REMEMBER, THINK OUTSIDE OF THE BOX! WHAT CAN WE DO DIFFERENTLY?
DATA CONNECTIONS ACTIVITY
PICK A PIECE OF CANDY FROM THE BOWL ON YOUR TABLE.
TAKE ONE CANDY WRAPPER ACTIVITY SHEET FROM YOUR TABLE AND FIND ANOTHER INDIVIDUAL (NOT FROM YOUR DISTRICT) WHO HAS THE SAME PIECE OF CANDY.
IDEAS TO SHARE WITH ONE ANOTHER…AND MAKE NOTES
HOW ARE THE DATA SETS REVIEWED SO FAR USED IN EACH DISTRICT?
WHAT OTHER TYPES OF TREND OR OUTCOME DATA ARE USED IN YOUR DISTRICT?
WHAT SUB-GROUPS DOES YOUR DISTRICT LOOK AT?
HOW OFTEN DOES YOUR LEADERSHIP TEAM REVIEW DATA?
PET-M SNAPSHOTS
MTSS: BUILDING SELF-ASSESSMENT
SCALE: NOT STARTED (N) / IN PROGRESS (I) / ACHIEVED (A) / MAINTAINING (M)
What Does BSA Data Tell you?
PROCESS DATA SNAPSHOTS: BEHAVIOR
BENCHMARKS OF QUALITY (BOQ)
• TIER 1 SWPBIS IMPLEMENTATION FIDELITY CHECK
• 53 BENCHMARKS ACROSS 10 CRITICAL ELEMENTS
• IDENTIFIES AREAS OF STRENGTH AND NEED TO INFORM ACTION PLANS
• COMPLETED ANNUALLY BY SCHOOL LEADERSHIP TEAMS
Self-Assessment Survey (SAS)
• Completed annually by building staff
• Fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual students
• Seven key elements of the Implementation Subsystems
• Informs of areas of strength and need, including communication between leadership team and staff
PROCESS DATA SNAPSHOTS: PBIS BENCHMARKS OF QUALITY (BOQ)
PROCESS DATA SNAPSHOTS: PBIS SELF-ASSESSMENT SURVEY (SAS)
While summary data from the SAS provides a general sense of a building’s PBIS systems, more focused analysis can inform a team of the most vital and influential next steps.
(Diagram: a PBIS subsystem with low implementation status and high staff priority becomes the target for implementation supports.)
PROCESS DATA SNAPSHOTS: PBIS SELF-ASSESSMENT SURVEY (SAS)
PROBLEM SOLVING GUIDE
PROBLEM SOLVING GUIDE: STEP 1
Determine your (first) problem to be addressed today based on what you've derived from:
• Previous SIP
• Overviews of Academics and Behavior (At a Glances)
• Outcome Data
• Process Data and Process Data Snapshots
PROBLEM SOLVING GUIDE: STEP 2
Complete a Problem Analysis: Hypothesize what may be contributing to the problem
Again, your data and the Snapshots can inform this discussion.
PROBLEM SOLVING GUIDE: WORKED EXAMPLE
PROBLEM SOLVING GUIDE: STEPS 3 & 4
Summary information from Spring/Winter/Fall Data Reviews (Math, Reading, Behavior, EWS) should be used to:
Inform (create/revise) the plan's measurable objectives, strategies & activities
Monitor and evaluate SI strategies & activities
District/School Improvement Plans
Goals | Objectives | Strategies | Activities | Resources

Student Goal Statement:
"All students will increase proficiency in mathematics/reading/writing, etc."

Measurable Objective Statement(s): what will happen, with which students, by when
• At least 1 must address state assessment
• At least 1 should address critical objectives across one or more tiers of support

Strategies: what staff will do to attain the goal & objectives
• What teachers will implement
• Research-based practices
• Clear connection to Consolidated Application/Title I budget

Activities:
• Activity Description
• Activity Type
• Planned/Actual Staff
• Planned/Actual Timeline
• Include actions to monitor & evaluate strategy implementation
• Clear connection to Consolidated Application/Title I budget

Resources:
• Funding Source
• Planned/Actual Amount
Improvement Plans + Problem Solving Guide = Student Achievement
Problem Solving Guide: Action Plan
Problem Solving Guide: Study/Problem Analysis
Problem Statement & Cause of Problem Statement
REMEMBER…
THE BUILDING LEADERSHIP TEAM DOES NOT HAVE TO SOLVE EVERY PROBLEM, BUT IT DOES NEED TO STUDY BUILDING DATA TO DETERMINE THE SCHOOL-WIDE NEEDS IT WILL ADDRESS, IDENTIFY GRADE-LEVEL NEEDS, AND ENSURE THAT THE APPROPRIATE INDIVIDUAL(S) TO ADDRESS THESE NEEDS ARE IDENTIFIED (E.G., WHICH GRADE-LEVEL TEAMS NEED TO ADDRESS THE IDENTIFIED NEEDS).
TEAM TIME: COMPLETE SCHOOL-WIDE OVERVIEW SHEETS
(BEHAVIOR/ACADEMIC) AND/OR PROBLEM SOLVING GUIDE
• COMPLETE OVERVIEW SHEETS
• REVIEW/UPDATE PREVIOUS ACTION PLAN
• IDENTIFY BUILDING CELEBRATIONS AND OPPORTUNITIES: SHARE OUT
• PRIORITIZE “PROBLEMS” FOR TODAY’S PROCESS
You do!
TEAM TIME: COMPLETE PROBLEM SOLVING PROCESS
• CHOOSE A PROBLEM; COMPLETE THE PROBLEM SOLVING PROCESS AND CREATE AN ACTION PLAN.
• MOVE ON TO SECOND (AND THIRD) PROBLEM, IF ABLE.
• COMMUNICATION PLAN
• LUNCH AT 12:00
• RECONVENE AT 2:45
• EXIT SLIP
You do!
SESSION EVALUATION
RATE YOUR KNOWLEDGE/SKILLS/COMPETENCE FOR THE FOLLOWING ITEMS UPON COMPLETION OF TODAY'S SPRING DATA REVIEW.