Performance Indicators in Primary Schools Baseline Assessment, 2010

Changes and Directions

This report is designed to help schools understand the changes made to the PIPS-BLA tool in 2010 by examining data gathered from students and teachers this year.

It will also outline the directions that will be taken in 2011 to ensure this PIPS-BLA tool continues to provide a valid and reliable assessment to aid classroom and school planning.

Table of Contents

Introduction
The new assessment items
    Early Reading and Mathematics
    Phonological Awareness Scale
        Repeating words
        Splitting words
        Making words
        Hearing sounds
Methods and participants
Results
    Quantitative study
        The Phonological Awareness Scale overall
            Figure 1.1: Ability of 2008 PA scale to assess all students
            Figure 1.2: Ability of 2010 PA scale to assess all students
        Discrepancies amongst items
            Figure 1.3: A well-fitting item
    Qualitative study
        Location of survey respondents
            Figure 2.1: Percentage of survey respondents by state
        Rating the issues faced by PIPS users in 2010
            Figure 2.2: Volume and clarity of the new voice
            Figure 2.3: Removal of the rhyming section
            Figure 2.4: Unchanging pictures on the Phonics screens
            Figure 2.5: Length of Phonics assessment
            Figure 2.6: Program glitch
Future directions

Introduction

Obtaining quality baseline data at the beginning of the first school year is a valuable way to inform planning and action in the classroom. Administering the same assessment at the end of the year to produce a value-added score provides essential information for the Year 1 teacher, as well as for monitoring school trends and driving school planning and reflection. The Performance Indicators in Primary Schools Baseline Assessment (PIPS-BLA) has been running in Australia since 2001 and is now administered in over 750 schools across the country.
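
As a rough illustration only (this report does not specify the PIPS scoring model, so the formula, names and numbers below are hypothetical), a value-added score can be thought of as the difference between a student's observed end-of-year result and the result predicted from their baseline:

    # Hypothetical sketch only: PIPS' actual value-added model is not described
    # in this report. "expected_gain" stands in for whatever progress the model
    # would predict from the baseline score.
    def value_added(baseline: float, end_of_year: float, expected_gain: float) -> float:
        predicted = baseline + expected_gain   # assumed simple additive prediction
        return end_of_year - predicted         # positive = more progress than predicted

    # Example with invented scores on an arbitrary scale
    print(value_added(baseline=42.0, end_of_year=61.0, expected_gain=15.0))  # 4.0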

In 2009, the PIPS Australia team, in conjunction with Charles Darwin University, undertook a project to expand the PIPS scale, adding new items to the Maths and Reading sections and replacing the Phonological Awareness scale with a more extensive measure. The result was the 2010 PIPS program. This report examines the changes made in 2010 and looks at the directions to be taken in 2011.

The new assessment items

Background research, consultation with experts at the Centre for Evaluation and Monitoring (CEM) in the United Kingdom, and a two-phase trial with 388 public school students led to the items included in the PIPS-BLA.

Early Reading and Mathematics

These are new Picture Vocabulary and Mathematics sections, designed to present vocabulary and maths problems demonstrated to be easy for younger students and for those with emerging literacy and numeracy skills.

Phonological Awareness Scale

The Phonological Awareness (PA) Scale in the PIPS-BLA operates on a continuum of item difficulty, like the rest of the assessment, blending from very easy items through to very hard items. In the case of the PA scale, this means progressing through increasingly smaller units of speech. The first set of items (Repeating Words) is very easy, the next set (Splitting Words) begins at the difficulty level reached by the previous set, and so on, so that the final set of items (Hearing Sounds) begins with very difficult items. Each section is prefaced with instructions and sample items. To give students the best chance of understanding the instructions, the sample items are at CVC (consonant-vowel-consonant) level, even where the section itself begins with a more complex set of sounds. In 2010, the stopping rules were relaxed and every student saw a large number of items, so that as much data as possible could be gathered on the new items. In 2011, stopping rules will ensure that students see only items suited to their ability level. The scale consisted of four sections in 2010.

Repeating words

Students hear and repeat pseudo-words, or nonsense words. The final 8 words were selected from an initial set of 40. Because the pseudo-words are words students have never heard before, students must listen to and retain the sounds before reproducing them. If the items were known words, students would be able to retrieve the sounds from their vocabulary, decreasing the validity of the assessment.

Splitting words

From hearing and processing whole words, students now move on to working with word segments. In this section students hear a two-syllable word and are asked to remove one syllable and say what is left. This exercise uses real words.

Making words

Students hear a series of individual sounds and blend them together to produce a word. These items proved quite difficult but effective in face-to-face trials.

Hearing sounds

Students hear a pseudo-word and identify the last individual phoneme that they can hear. This is an exceptionally difficult set of items. Pseudo-words are once again used to ensure that the student attends to the word, without retrieving sounds from their existing vocabulary.

Methods and participants

The Phonological Awareness items were trialled in two rounds by the PIPS Australia team at The University of Western Australia, using students from three Western Australian metropolitan public primary schools. A total of 214 students, aged from 4 years 4 months to 7 years 3 months, took part in the Round 1 trials. The assessments took place at the end of the academic year, so students in Kindergarten as well as Pre-Primary and Year 1 were included in the trial. This ensured that a wide range of students were assessed on these items, allowing us to identify which items worked best with the target age group. In this first sample, 50.7% were male and 49.3% were female.

The test items were refined for Round 2, with the best items from Round 1 selected for further trialling. The sample group was drawn from eight Western Australian primary schools, and the second round of assessment took place at the beginning of the academic year, at a similar time to the administration of the actual PIPS-BLA. This time, 174 students aged between 4 years 5 months and 5 years 11 months, drawn from Pre-Primary and Year 1, were involved in the testing; 49.4% were male and 50.6% were female. The final data analysis was completed using all 388 students from both rounds of data collection.

The Literacy and Numeracy items were trialled by the PIPS Australia team at The University of Western Australia in conjunction with Charles Darwin University. Of the 329-student sample group, 92% were drawn from public schools in the Northern Territory and 8% from Western Australian public schools. The Northern Territory sample consisted of 303 students, 51.9% male and 48.1% female; 31.6% of this group identified as Indigenous Australians. The Northern Territory sample ranged in age from 3 years 9 months to 8 years 1 month, with students drawn from Preschool, Transition and Year 1 classes. The Western Australian sample consisted of 26 students, 53.8% male and 46.2% female, ranging in age from 4 years 5 months to 6 years 1 month; all came from metropolitan schools.

This series of trials culminated in the inclusion of the new items in PIPS 2010. Data were then gathered from the 26 484 students who completed PIPS this year. This group ranged in age from 4 years 0 months to 7 years 6 months; 51% were male and 49% female. The data gathered by this large-scale trial, and the conclusions and alterations made as a result, are reported here.

A separate study was conducted in May 2010 to gather feedback from teachers about the new items included in 2010. The qualitative data are also reported here and have informed decisions about content and structure for 2011. The survey gathered anonymous data from 221 PIPS users.

Results

Quantitative study

The Phonological Awareness Scale overall

The new Phonics items aimed to provide a wide scale ranging from very easy to very hard, so that all students could be assessed regardless of their level of development. The decision to expand the Phonological Awareness (PA) scale was made after an investigation by the PIPS team in 2008.

Figure 1.1, below, shows the distribution of items in terms of difficulty, and of students in terms of ability, prior to the inclusion of the new items. Students are represented by the pink bars and items by the blue. The spread of student ability far surpasses the spread of test items: on the original PA scale, a significant number of students were positioned well above the hardest item, and a number also fell below the easiest item. Items on this scale were located between -2.0 and approximately 1.5 logits, while students ranged from -3.0 to approximately 3.75 logits. It was concluded that the scale required extension through the addition of both easier and harder items. Gaps between the blue bars can also be seen, indicating that students whose ability falls within these gaps would be better served by a more comprehensive PA scale.

Figure 1.1: Ability of 2008 PA scale to assess all students
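
The kind of range comparison shown in Figure 1.1 can be sketched in code; the ability and difficulty values below are invented for illustration only, since the underlying 2008 student and item estimates are not reproduced in this report.

    # Illustrative only: invented logit values, not the actual 2008 PIPS estimates.
    item_difficulties = [-2.0, -1.4, -0.8, -0.1, 0.6, 1.1, 1.5]        # logits
    student_abilities = [-3.0, -2.2, -0.5, 0.3, 1.2, 2.4, 3.1, 3.75]   # logits

    easiest, hardest = min(item_difficulties), max(item_difficulties)
    above = [a for a in student_abilities if a > hardest]  # not enough hard items for these students
    below = [a for a in student_abilities if a < easiest]  # not enough easy items for these students

    print(f"Items span {easiest} to {hardest} logits")
    print(f"{len(above)} students above the hardest item, {len(below)} below the easiest")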

The research described in the Methods section resulted in the inclusion of 32 new Phonological Awareness items in the 2010 PA scale. All 32 items were trialled with all PIPS students in 2010, and the resulting analysis produced the graph shown in Figure 1.2 below. The blue bars (items) now cover a much larger range, indicating that the assessment has been extended as the PIPS team intended. The items range from -3.0 to approximately 3.0 logits, and there are fewer gaps between them. Most students can be assessed by this new range of items: only 15 students sit above the hardest item in the new PA scale. However, 120 students located at the left-hand side of the graph have not been provided with easy enough items. Students in this group may have Special Educational Needs such as hearing difficulties, or may speak English as an Additional Language; the sound quality of the new Phonics items may also play a role in students sitting below the scale. In 2011, several easier Repeating Words items will be included to cater for these students.

Figure 1.2: Ability of 2010 PA scale to assess all students

Discrepancies amongst items

In the analysis of the new PA scale items, it was noted that one set of items did not operate as well as the others. The Making Words items did not fit as expected when their item characteristics were examined. Although these items tested well in face-to-face trials, they appear not to have translated well to the computer-based format. This may be because the visual element of watching a mouth form the sounds of a word is removed in the computer-based assessment, or because students find these items difficult to hear in a classroom setting. The graphs that follow provide a visual explanation of the unexpected fit of these items.

Figure 1.3 shows an Item Characteristic Curve for a well-fitting item. The Item Characteristic Curve plots four points against a line: the line shows the expected score for students at each ability location along the horizontal axis. For an item to be described as having ‘good fit’, the points should fall on or close to the line.
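
The expected curve in an Item Characteristic Curve of this kind is typically derived from a Rasch-style model, in which the probability of a correct response depends on the difference between student ability and item difficulty. The sketch below assumes that standard formulation, with invented observed values, since the report does not state the exact model used.

    import math

    # Assumed Rasch-model expected score; the report does not state the exact model.
    def expected_score(ability: float, difficulty: float) -> float:
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    # Compare invented observed proportions correct at four ability locations
    # with the model curve, as in an Item Characteristic Curve plot.
    locations = [-2.0, -0.5, 1.0, 2.5]    # class-interval means, in logits
    observed = [0.15, 0.42, 0.70, 0.93]   # invented observed proportions
    difficulty = 0.2                      # invented item difficulty

    for theta, obs in zip(locations, observed):
        exp_p = expected_score(theta, difficulty)
        print(f"ability {theta:+.2f}: expected {exp_p:.2f}, observed {obs:.2f}, residual {obs - exp_p:+.2f}")
    # Small residuals mean the points fall close to the line, i.e. ‘good fit’.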

In Figure 1.3, the points follow the curve of the line. This item has a ‘good fit’ and will continue to be included in the PIPS assessment. In contrast, when Item Characteristic Curves were generated for the Making Words items, the points fell well away from the curve, showing that these items did not function as intended. As a result, the Making Words items will be omitted from the PA scale in 2011.

Figure 1.3: A well-fitting item

Qualitative study

Location of survey respondents

The survey was opened to respondents across Australia, and the results reported here reflect the first three months that it has been active. The survey will remain active until Friday August 27th and can be accessed at www.education.uwa.edu.au/pips. At the time the responses were gathered for this report, a total of 221 responses had been received. The bulk of these came from Western Australia (n=94) and the Australian Capital Territory (n=48). Queensland had 25 respondents and New South Wales 19, Tasmania and Victoria each had 17, and South Australia had only 1.

Figure 2.1: Percentage of survey respondents by state
(Western Australia 42%, Australian Capital Territory 22%, Queensland 11%, New South Wales 9%, Tasmania 8%, Victoria 8%, South Australia 0%)

Rating the issues faced by PIPS users in 2010

Survey respondents answered a series of questions about previously reported issues in PIPS 2010, identifying whether each issue had posed a problem for them and then rating the level of difficulty it had caused.

Figure 2.2: Volume and clarity of the new voice
(Was volume and clarity a problem? No 7%, Somewhat 21%, Yes definitely 72%. How much of a problem did volume and clarity pose? Not problematic 10%, Somewhat problematic 20%, Problematic 20%, Very problematic 50%.)

Volume and clarity problems were experienced by almost all PIPS users, with 93% noting that this issue was a problem for them. The issue was identified as Problematic or Very Problematic by 70% of PIPS users and is, of course, of the utmost importance for PIPS 2011. The program will be revoiced with a single speaker.

Figure 2.3: Removal of the rhyming section
(Was the removal of rhyme a problem? No 20%, Somewhat 24%, Yes definitely 56%. How much of a problem did removal of rhyme pose? Not problematic 23%, Somewhat problematic 26%, Problematic 26%, Very problematic 25%.)

The removal of the rhyme section was identified as a problem by 80% of PIPS users. 51% of users reported that this was Problematic or Very Problematic. In 2011, the PIPS assessment will include an optional rhyme section that teachers may access as they choose.

Figure 2.4: Unchanging pictures on the Phonics screens
(Were the unchanging pictures a problem? No 62%, Somewhat 22%, Yes definitely 16%. How much of a problem did the unchanging pictures pose? Not problematic 63%, Somewhat problematic 21%, Problematic 8%, Very problematic 8%.)

This issue was raised in particular by teachers of children with Special Educational Needs, such as Autism Spectrum Conditions, because changing pictures provide a visual cue that the task type has changed. As it mainly affects this group, the issue was experienced as a problem by only 38% of PIPS users, and 16% reported that it was Problematic or Very Problematic. PIPS is an assessment tool that aims to cater for all students, so the 2011 program will include a change of visual cue to coincide with changing task types, ensuring that no student group is disadvantaged.

Figure 2.5: Length of Phonics assessment
(Was the length of Phonics a problem? No 38%, Somewhat 30%, Yes definitely 32%. How much of a problem did the length of Phonics pose? Not problematic 39%, Somewhat problematic 29%, Problematic 16%, Very problematic 16%.)

The length of the Phonics assessment was experienced as problematic by 62% of PIPS users, and 32% said it was Problematic or Very Problematic. The Phonics assessment was much longer in 2010 in order to gather good data on the new items, and this longer form was always intended to run for one year only. In 2011, the Phonics assessment will have fewer items overall and will have stopping rules in place consistent with the remainder of the PIPS program.

Figure 2.6: Program glitch
(Was the Program glitch a problem? No 45%, Somewhat 21%, Yes definitely 34%. How much of a problem did the Program glitch pose? Not problematic 45%, Somewhat problematic 19%, Problematic 10%, Very problematic 26%.)

The PIPS team acknowledges the error that resulted in the Program glitch in 2010; this is a problem that cannot be allowed to recur. The glitch clearly caused an issue for all PIPS users, but the PIPS team aimed to minimise the negative impact by providing a prompt solution and supporting schools as they implemented it. The glitch was experienced as a problem by 55% of PIPS users, and 36% reported that it was Problematic or Very Problematic. We are pleased that the support offered by the PIPS team and schools’ IT staff meant that 45% of PIPS users did not experience additional problems in recovering from the Program glitch. The PIPS program is always quality tested before CD duplication begins, but more rigorous tests will be completed for all future versions of PIPS to ensure this issue is not repeated.

Future directions

In 2011, there will be a number of changes to the PIPS program, some made as part of the continuing improvement of the assessment tool and others in response to feedback from PIPS users this year.

More rigorous quality assurance

The checklist used to review the PIPS program prior to CD duplication has been revised, and more rigorous quality assurance measures have been put in place. This means that incidents such as the Program glitch will be caught before the program is released to PIPS schools in the future.

Revoicing the program

The program will be entirely revoiced for 2011. A single voice will be used throughout, and the PIPS team will work closely with programmers and a sound engineer to ensure appropriate levels of volume and clarity.

Optional rhyming section

The rhyming section will be reintroduced in 2011 as an optional assessment. Analysis and tests of fit in the 2008 investigation, and the trial of new rhyming items during the expansion of the PA scale, found that rhyming did not fit in a meaningful way with the other PA items. Rhyming is still an important skill for teachers to gather information on, and PIPS will provide an optional assessment that allows them to do so.

Shorter Phonological Awareness scale

The trial period for the new PA items meant that most of the 32 items were seen by every student. This will change in 2011. Fewer items will be included in the scale overall, with the removal of the Making Words section, and stopping rules consistent with the remainder of the PIPS program will be put in place. This means that if a student gives an incorrect response three times in a row, or four times within a section of the Phonics assessment, they will not proceed to any harder items.
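
A minimal sketch of this stopping rule is shown below, assuming responses arrive in presentation order and are simply marked correct or incorrect; the function and data structures are hypothetical, not the PIPS implementation.

    # Hypothetical sketch of the 2011 stopping rule described above, not the
    # actual PIPS code: stop after three consecutive incorrect responses, or
    # after four incorrect responses within one section of the Phonics assessment.
    def administer(items, get_response):
        """items: list of (section, item_id); get_response(section, item_id) -> bool."""
        consecutive_wrong = 0
        wrong_by_section = {}
        results = []
        for section, item_id in items:
            correct = get_response(section, item_id)
            results.append((section, item_id, correct))
            if correct:
                consecutive_wrong = 0
            else:
                consecutive_wrong += 1
                wrong_by_section[section] = wrong_by_section.get(section, 0) + 1
                if consecutive_wrong >= 3 or wrong_by_section[section] >= 4:
                    break  # the student does not proceed to any harder items
        return results

    # Example: a student who misses three items in a row stops early.
    demo_items = [("Repeating words", i) for i in range(1, 9)]
    responses = iter([True, False, False, False, True, True, True, True])
    print(administer(demo_items, lambda s, i: next(responses)))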

Compatibility with Windows 7

The new Windows 7 operating system is being used by some schools. Because it was released after the PIPS 2010 program was finalised, the program has not been compatible with Windows 7 in 2010. In 2011, PIPS will be compatible with Windows 7, as well as remaining compatible with earlier operating systems.

IDEAS+ Introduction Package

As a special offer to schools in 2011 only, the PIPS team will provide an introductory package for IDEAS+. Schools will receive individual reports for every student they assess, a copy of the IDEAS+ program preloaded with their data, and a quick-start guide so they can begin working with IDEAS+ to create customised groups and reports.

