Achieving consistency of teachers’ judgements and measuring progress
against national standards:
Curriculum knowledge deployed within a psychometric framework
Michael Johnston
29 August 2012
National Standards design
• Validity of teacher judgements is emphasised:
  – Wide range of evidence
  – Over-reliance on formal testing discouraged
  – No national testing
• Reliability will be a challenge:
  – Overall judgement must be formed from diverse evidence on multiple criteria
Achieving reliability
• Important to teacher acceptance of the standards:
  – Common understanding of the standards
  – Confidence that they can make judgements with reasonable national consistency
• Exemplars and moderation help but are not sufficient:
  – A given student does not usually resemble the exemplar
  – Social moderation is not designed to produce consistency
The Progress and Consistency Tool (PaCT)
• Designed to support teachers to make reliable judgements without sacrificing validity.
  – A diverse evidence base and professional judgement remain at the forefront; the framework gives structure to the professional judgement.
• Will provide a mechanism to measure progress.
Design of PaCT
• Each sequence of standards (reading, writing, mathematics) is:
  – Conceptualised as a developmental continuum, underpinned by a quantitative scale
  – Decomposed into a set of aspects, each comprising a series of (sequential) item stages
• Each item comprises annotated illustrations.
• Collectively, the aspects comprise a rubric (a structural sketch follows below).
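To make this structure concrete, here is a minimal sketch of how a rubric, its aspects, item stages and annotated illustrations might be represented. It is illustrative only; the class and field names are assumptions, not part of the PaCT design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Illustration:
    """An annotated example: what the child does or says, plus why it fits."""
    student_response: str   # work sample or interview transcript
    annotation: str         # interpretation of the response and why it fits this stage

@dataclass
class ItemStage:
    """One (sequential) stage within an aspect, described by illustrations."""
    description: str
    illustrations: List[Illustration] = field(default_factory=list)

@dataclass
class Aspect:
    """A salient strand of cognitive development or curriculum progression."""
    name: str
    stages: List[ItemStage] = field(default_factory=list)  # ordered from less to more expert

@dataclass
class Rubric:
    """Collectively, the aspects for one sequence of standards."""
    learning_area: str      # "reading", "writing" or "mathematics"
    aspects: List[Aspect] = field(default_factory=list)
```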
The form of the rubrics
[Diagram: the layout of a rubric, with labels pointing out an item and an illustration]
Guidelines and principles for developing progressions
1. Aspects should represent salient stages of cognitive development and/or curriculum progression.
2. Aspects should cover the range of relevant cognitive functions important to the standard.
3. The number of aspects for each standard should be manageable for teachers.
4. Observational evidence for each item (stage) of each aspect must be readily available to, and recognisable by, teachers.
Guidelines and principles for developing progressions (ctd.)
5. Aspects should be associated with cognitive development and learning rather than with year levels.
6. The aspects defined for a given standard need not all entail the same numbers of stages.
7. The extent of each stage on each progression in relation to the overall continuum will be determined empirically, not defined by the standards writers.
Aspects for the mathematics rubric (Gill Thomas)
Strand: Number and algebra
• Additive thinking
• Multiplicative thinking
• Patterns and relationships
• Using symbols and expressions

Strand: Measurement and geometry
• Measurement sense
• Geometrical thinking

Strand: Statistics
• Statistical investigations
• Statistical literacy and probability
Aspects for the reading rubric (Sue Douglas)
1. Making sense of text
2. Applying vocabulary knowledge (with Writing)
3. Using reading to organise for learning
4. Locating and using information and ideas in informational texts (continuous and non-continuous/in print)
5. Locating and using information and ideas in informational texts (continuous and non-continuous/online)
6. Interpreting and responding to ideas, information and experiences in literary texts (in print and online)
7. Identifying and reflecting on the way writers use ideas and language to influence their readers.
Aspects for the writing rubric (Sue Douglas)
1. Encoding
2. Sentence structure (different types of sentences & sentence beginnings; grammar; punctuation)
3. Applying vocabulary knowledge (with Reading)
4. Using writing to think and organise for learning
5. Using writing (in print) to communicate knowledge and understanding
6. Using writing (online) to communicate knowledge and understanding
7. Creating texts for literary purposes
8. Creating texts to influence others
The components of an illustration
• Student response: what the child does or says (work sample, interview transcript)
• Annotation: the interpretation of the response and why it ‘fits’
[Screen mock-up prepared for TAG, 14 August 2012, offering the judgement options "More expert", "Less expert", "About the same" and "Unsure"]
Implementing the framework
• Defining the rubrics; describing the aspects and items
• Developing the illustrations
• Collection and analysis of sample data
  – Determine quality of rubrics, items and illustrations
  – Calibrate the measurement scale
• Standard setting
Developing the framework: analysis of sample data
• Collect judgements on each aspect for a sample of students
• Sample should be representative of year levels, geographic regions, school deciles and demographics
• The scale for each sequence of standards will be constructed from these data using item-response analysis.
Utility of item response analysis
• Creates a scale with equal-interval properties
  – Ideal for measuring progress
• Locates each stage of each progression on this overall scale (illustrated in the sketch below).
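As a rough illustration of what item-response analysis contributes here, the sketch below uses the partial credit model (a Rasch-family model often applied to rubric-style data) to show how each stage of an aspect occupies its own region of the equal-interval logit scale. The step-difficulty values are invented for illustration; in PaCT they would be estimated from the sample judgements.

```python
import numpy as np

def pcm_category_probs(theta, step_difficulties):
    """Partial credit model: probability of each stage (category) of one aspect
    for a student located at `theta` logits on the continuum."""
    # Cumulative sums of (theta - delta_j); stage 0 has cumulative sum 0 by convention.
    cumulative = np.concatenate(([0.0], np.cumsum(theta - np.asarray(step_difficulties))))
    numerators = np.exp(cumulative - cumulative.max())  # subtract max for numerical stability
    return numerators / numerators.sum()

# Hypothetical step difficulties (logits) for one aspect with four stages.
steps = [-1.5, 0.2, 1.8]

# Report the region of the scale on which each stage is the most probable judgement.
grid = np.linspace(-4, 4, 801)
modal_stage = np.array([np.argmax(pcm_category_probs(t, steps)) for t in grid])
for stage in range(len(steps) + 1):
    covered = grid[modal_stage == stage]
    if covered.size:
        print(f"Stage {stage}: modal from {covered.min():+.2f} to {covered.max():+.2f} logits")
```

Because the resulting scale is equal-interval, differences between these locations can be compared directly, which is what makes progress measurement possible.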
Developing the framework: standard setting
• When the overall continuum for each sequence of standards has been constructed, a standard-setting exercise will be required to define overall year-level boundaries.
• Standard setting will require holistic judgements by expert judges, made using the same evidence base as that supporting the scale-construction exercise.
• Important to be clear that boundaries are stochastic, not deterministic (see the sketch below).
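One way to read "stochastic, not deterministic" is that a student near a boundary has a probability, rather than a certainty, of being at or above the standard. A minimal sketch, assuming a normal model for measurement error and wholly invented cut-score and standard-error values:

```python
from math import erf, sqrt

def prob_at_or_above_standard(student_logit, cut_logit, standard_error):
    """Probability that a student's true location is at or above a year-level
    boundary, treating the estimated location as normally distributed around
    the true location with the given standard error of measurement."""
    z = (student_logit - cut_logit) / standard_error
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Hypothetical: a student estimated 0.3 logits above a cut score, with a
# standard error of 0.5 logits, is probably (not certainly) at the standard.
print(round(prob_at_or_above_standard(2.3, 2.0, 0.5), 2))  # about 0.73
```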
Rubric design: Standard setting
[Diagram: a rubric grid with ticks marking the judged stage on each aspect and a starred cell on the scale]
Test information
• Teachers can also enter results for a selection of tests; e.g.,
  – PAT
  – GLOSS
  – asTTle
• It is envisaged that test information will serve as a self-moderation mechanism (illustrated below).
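The slides do not specify how self-moderation would operate; as one possible reading, the tool could flag cases where the location implied by test results diverges markedly from the observation-based judgement, prompting the teacher to revisit the evidence. The function name and tolerance below are hypothetical.

```python
def self_moderation_check(observation_logit, test_logit, tolerance=1.0):
    """Illustrative self-moderation check: compare the observation-based
    judgement with the location implied by test results (both assumed to be
    expressed on the same logit scale) and flag large disagreements."""
    gap = test_logit - observation_logit
    if abs(gap) <= tolerance:
        return "consistent"
    return "review: test evidence higher" if gap > 0 else "review: test evidence lower"

print(self_moderation_check(1.2, 1.5))  # consistent
print(self_moderation_check(1.2, 2.6))  # review: test evidence higher
```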
Bands showing probable location of a student
[Chart: the continuum shown vertically with the Year 1 to Year 4 standards marked; 68% and 95% confidence bands show the probable location of the student implied by observational evidence alongside the location implied by test evidence]
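The bands on such a chart can be derived from a location estimate and its standard error. A minimal sketch under a normal measurement-error model, with invented numbers:

```python
def confidence_band(location_logit, standard_error, coverage=0.95):
    """Band showing the probable location of a student on the continuum,
    assuming normally distributed measurement error."""
    z = {0.68: 1.0, 0.95: 1.96}[coverage]  # multipliers for the two bands on the slide
    return (location_logit - z * standard_error, location_logit + z * standard_error)

# Hypothetical: observational judgements locate a student at 1.4 logits (SE 0.4).
print(confidence_band(1.4, 0.4, coverage=0.68))  # narrower band
print(confidence_band(1.4, 0.4, coverage=0.95))  # wider band
```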
Measuring and reporting progress
[Chart: performance (logits, approximately -2 to 5) plotted against year and schooling year from 2011 (Year 2) to 2015 (Year 5), with the Year 2 to Year 5 standards marked on the scale]
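Because the scale is equal-interval, progress can be reported as the difference in logits between successive years, alongside the gap to the relevant year-level standard. The sketch below uses invented locations and standards purely to illustrate the arithmetic.

```python
# Invented year-level standards and annual locations (logits) for one student.
standards = {"Year 2": -1.0, "Year 3": 0.0, "Year 4": 1.0, "Year 5": 2.0}
history = [("2011", "Year 2", -0.6), ("2012", "Year 3", 0.3),
           ("2013", "Year 4", 0.9), ("2014", "Year 5", 1.8)]

previous = None
for year, schooling_year, logit in history:
    gap = logit - standards[schooling_year]
    status = "above" if gap >= 0 else "below"
    progress = "" if previous is None else f", progress {logit - previous:+.1f} logits"
    print(f"{year} ({schooling_year}): {logit:+.1f} logits, "
          f"{status} the standard by {abs(gap):.1f}{progress}")
    previous = logit
```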
Problematic issues
• Making the tool mandatory
• Housing and usage of data