Meta-analysis of programmes and projects supported by the Zenex Foundation between 2006 and 2011

Contents

Zenex Foundation Meta-evaluation methodology
    Purpose of the meta-evaluation
    Data sources
    Analytic framework
    Organisation of the report
Supporting educational improvement through teacher development
    Introduction
    Models of teacher support and development implemented by the Zenex Foundation
    Overview of the alignment between projects and the Foundation’s 2005-2015 strategy
    Examination of the outcomes realised by teacher development
    Discussion of possible factors that could explain extent of project impact
    Lessons learnt: project design and implementation
Supporting educational improvement through providing direct assistance to learners
    Introduction
    Models of student assistance implemented by the Zenex Foundation
    Overview of the alignment between projects and the Foundation’s 2005-2015 strategy
    Examination of the impact of the various programmes
    Discussion of possible factors that could explain extent of project impact
    Lessons learnt: project design and implementation
Supporting educational improvement through the development of innovative materials
    Introduction
    Description of the projects
    Lessons learnt with respect to the introduction of innovative teaching materials
Supporting educational improvement through the development of schools of excellence
    Introduction
    Assessment of alignment with the Foundation’s strategic objectives
    Focussed whole school development: Zenex Foundation’s School Development Project
    Project management and delivery
    Assessment of outcomes
    Lessons learnt
Introduction
Identification of the development challenge: the relationship between language skills and achievement in Mathematics and Science
Needs analysis as a component of project planning and design
Identification and selection of beneficiaries
Partnership as a model for effective project delivery
Programme evaluation that supports organisational and sectoral learning


Zenex Foundation Meta-evaluation methodology

Purpose of the meta-evaluation

The meta-evaluation of the first five years of the Zenex Foundation’s 10-year strategy provides an opportunity to reflect on the Foundation’s body of work and its achievements between 2005 and 2010. It is also a time to reflect on the extent to which the various projects that it has supported have contributed to the achievement of the organisation’s strategic goals, and to consider what lessons can be learnt about efforts to improve educational quality. However, this meta-evaluation (or meta-analysis) of the Foundation’s work is not only about looking back; it is also a chance to look forward and to think about whether strategies and approaches need to change or evolve in order to promote greater organisational effectiveness.

In order to fulfil these objectives, the meta-evaluation seeks to provide a concise overview of the Foundation’s work during the first five years of its 2005-2015 organisational strategy and to examine the extent to which projects have been aligned with, and supported the attainment of, the organisation’s strategic objectives. In addition to describing the work that the Foundation has undertaken, the meta-evaluation focuses on the outcomes achieved by the projects initiated during this period and also attempts to identify and document lessons learnt about project design, implementation and evaluation. Where appropriate, each section concludes with a number of suggestions to improve programme delivery.

Data sources

Project evaluation reports were the primary data source used in conducting the meta-analysis which follows. At the outset, it should be noted that the meta-evaluation does not cover the Foundation’s full project portfolio: the review of activities and lessons learnt is limited to those projects that have been the subject of an external evaluation. As the meta-evaluation seeks to analyse and document the extent to which project outcomes have been achieved, only those projects that had participated in at least two evaluation cycles were selected for analysis. A total of 21 evaluation reports from 17 projects were reviewed.

Over the last five years the Zenex Foundation has moved to adopt a programmatic approach to organising its work, clustering projects according to the different strategic objectives to which they contribute. The same logic was applied when grouping the project evaluations and in writing up the meta-evaluation. In some cases, however, it has been necessary to discuss a project under more than one heading (for example, projects that supported the development of innovative materials and then trained teachers in the use of these materials are discussed in the sections dealing with both teacher development and innovation).


Table 1: Projects included in the meta-evaluation

Programme focus: Developing Schools for Mathematics, Science and Language excellence
    Projects reviewed: LEAP; School Development Programme

Programme focus: Learner Development Programme
    Projects reviewed: ISASA Mathematics & English Programme; Inkhanyezi; Reunert College (*)

Programme focus: Teacher Development Programme
    Projects reviewed: MIET Educator Support – North West Province; Dinaledi project; Maths Centre – Limpopo; ELET – Limpopo; RUMEP – Eastern Cape; Maths Centre – Eastern Cape and KZN

Programme focus: Research and Innovation
    Projects reviewed: Systemic Evaluation – Grade 3; Grade R research project; Research on School Leadership and qualifications for school principals; Primary Mathematics Research Project; READ mother-tongue literacy programme; Concept Literacy Project

These reports were supplemented by a review of reports made to the Foundation’s Executive Committee (Exco) and the Board of Trustees. These reports provided additional background detail on the activities of the Foundation – particularly the range of partnerships that the Foundation entered into in order to implement its various activities – and also drew attention to shifts in operational focus and implementation strategies that took place during the period under review.

Discussions were also held with a number of staff members in order to both clarify information on programme design, where this was not clear from the evaluation reports, and to gather information on specific themes which were not directly addressed by the various evaluations.

Analytic framework

This report purposefully adopts the same analytic framework that was used when conducting the meta-analysis of the first 10 years of the Zenex Foundation’s work. Both studies have used the project logic model as the primary tool for describing and analysing the development assumptions that underpin the different projects initiated and funded by the Foundation. The same conceptual framework was used in order to facilitate comparisons between the findings of the first meta-evaluation and the current review. This model is also used by the Zenex Foundation when designing project evaluations and when articulating the theory of change underpinning each of its programmes.

Figure 1 (below) outlines the different components of the logic model.[1]

[1] Adapted from the logic model presented in Roberts, J. and Schollar, E. (2006) The evaluation of the development projects of the Zenex Foundation between 1995 and 2005 – A meta-study.


Figure 1: Project logic model

    Definition of the development problem
    Strategy: project approach to address the development problem
    Project inputs: services (e.g. training offered); material assistance (learning support materials etc.)
    Beneficiary: target population (schools, teachers etc.)
    Delivery strategy: delivery methods; strategic partnerships
    Process outcomes: changes in practice, behaviours, access to materials etc. that are expected to lead to impact on performance
    Impact: change in the “negative” conditions identified in the definition of the development problem (e.g. change in learner performance)

The logic model, together with project cycle management theory, also provides a useful tool for identifying factors which could affect the extent to which a project successfully achieves its stated outcomes, and the different points in a project’s lifecycle at which these may occur. Evaluators often refer to these factors as the intervening variables that influence outcomes. Table 2 provides a summary of these factors, which are considered when examining possible reasons why projects do not deliver the intended results.

Table 2: Identification of potential threats to project impact

Phase: Conceptualisation
    Framing of the development problem:
        Inappropriate analysis of the development problem.
    Identification of the beneficiary population:
        Selection of the beneficiary population flawed. Factors internal to the population prevent or inhibit change. Lack of readiness for the project. Mismatch between strategies and population.
    Intervention design – determining the strategies that will be used to address the development problem:
        Poorly designed intervention – the proposed inputs are not likely to solve the development problem. Underlying educational theory flawed. Focus of strategies is inappropriate.

Phase: Implementation
    Implementation of the programme:
        Project duration. Intensity of exposure to interventions. Capacity of the implementing organisation. Intervention does not proceed as planned. Delivery falls short of targets. Poor quality materials and services. Negative attitudes towards service providers. Project participation (allied to dosage) is insufficient to effect change.

Phase: Nature of change
    Process outcomes (changes that should lead to or contribute to the ultimate goal of improving performance):
        Changes in attitudes, skills, knowledge and behaviours do not take place as expected. Changes take place, but do not have an effect on learner performance (desired/expected changes are not sufficient, appropriate or logically linked to changes in learner performance).
    Impact on learner performance:
        No logical linkages between the process changes which are sought and changes in learner performance.

Phase: Evaluation
    Evaluation of impact:
        Poor quality evaluation design contributes to over- or under-estimation of impact. Data quality and research design do not allow for clear determination of impact. Measures of impact unsuitable.

Taking into account the fact that the Foundation adopted a new strategic framework in 2006, the meta-evaluation also considers the degree of alignment between the Foundation’s strategic objectives and the design and implementation of its project portfolio. The following criteria, based on the Foundation’s strategy as outlined in the document “Educating for impact in mathematics, science and language: A 10 year review”, are applied to assess the degree of strategic alignment.

Table 3: Criteria for assessing strategic alignment of projects and the Zenex Foundation strategy

Strategic dimension: Extent of partnership with the Department of Education
    1. No visible partnership / not discussed in evaluation
    2. Partnership limited to information sharing on the project
    3. Joint project design, planning and/or project management
    4. High level of interdependent partnership: needs identified by the Department of Education shaped the project design; there are linkages between the project and other initiatives in the district/province

Strategic dimension: Extent to which the project has a systemic focus and supports the creation of an enabling environment for teaching and learning
    1. No systemic linkages evident in project design
    2. Project aims to create an effective organisational climate for teaching and learning at the level of the school
    3. The project contributes to creating effective systems and capacity beyond the level of the school (i.e. at district or provincial level)
    4. Project supports policy development/implementation

Strategic dimension: Extent to which the project supports capacity development amongst stakeholders
    1. No evidence of capacity building amongst stakeholders
    2. Evidence of capacity building amongst stakeholders

Programme-specific indicator: Extent to which the project aims to build teacher capacity through a range of strategies
    0. Not relevant to project focus
    1. Only one approach to capacity development is adopted
    2. Project allows for a variety of strategies to be used to enhance teacher capacity (workshop-based training / mentoring / school-based support)

Programme-specific indicator: Extent to which the project seeks to raise the number of disadvantaged learners obtaining high-quality Grade 12 passes in mathematics, science and English
    0. Not directly relevant to project focus
    1. Part of underlying project rationale, but not measured
    2. Intended outcome of project – not achieved
    3. Intended outcome of project – partially achieved
    4. Intended outcome of project – clear contribution to goal
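The rubric above amounts to a small scoring structure: each dimension has its own ordinal scale, and every project receives one rating per dimension. As an illustrative sketch only (the dimension keys below are abbreviations invented here, not names used by the Foundation), the rubric could be represented and checked like this:

```python
# Sketch: the Table 3 alignment rubric as a scoring structure.
# Dimension names are illustrative abbreviations, not official labels.
rubric = {
    "partnership_with_DoE": range(1, 5),   # scale 1-4
    "systemic_focus": range(1, 5),         # scale 1-4
    "stakeholder_capacity": range(1, 3),   # scale 1-2
    "multiple_strategies": range(0, 3),    # scale 0-2
    "raises_grade12_passes": range(0, 5),  # scale 0-4
}

def validate(scores: dict) -> bool:
    """Check that a project's ratings fall within each dimension's scale."""
    return all(scores[dim] in rubric[dim] for dim in rubric)

# A project rated 1, 1, 1, 2, 1 across the five dimensions is valid:
print(validate({"partnership_with_DoE": 1, "systemic_focus": 1,
                "stakeholder_capacity": 1, "multiple_strategies": 2,
                "raises_grade12_passes": 1}))  # True
```

Note that the evaluators occasionally assigned intermediate ratings (e.g. 3.5 in Table 4), so a strict integer check like this would flag those rows; the sketch assumes whole-number ratings.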

Organisation of the report

The report is divided into two parts. Section One reviews each of the Zenex Foundation’s programmes, discussing the implementation models used and synthesising evaluators’ findings about the impact of the various projects. Section Two considers a number of over-arching themes related to programme management, including:

    challenges associated with the determination and selection of the beneficiary group;
    the creation of strategic partnerships to support project implementation; and
    the role of evaluations in strengthening programme delivery and organisational learning.


Section One: Review by programme area

Supporting educational improvement through teacher development

Strategic Objective: To increase the number of professionally qualified teachers in mathematics, science and language

Introduction

Figure 2: Summary of the Kirkpatrick model
    1. Increase in knowledge; attitudinal change
    2. Introduction of changes in practices
    3. Changes in operational and organisational efficiency

The following model for the transfer of training has implicitly informed the design, and evaluation, of many of the teacher development projects that the Foundation has supported: Teachers begin by acquiring new knowledge that deepens their understanding of the subject which they are teaching, so that they are better able to explain concepts to learners. At the same time they also acquire knowledge about different strategies that can be used to support student learning. It is then assumed that teachers will begin to apply the knowledge that they have acquired and adapt their teaching practices to incorporate new methods and strategies based on the training that they have received. In turn, this supports more effective learning by students. This approach to understanding the relationship between training and personal or organisational effectiveness was first outlined by Kirkpatrick in 1959 (and has remained remarkably stable in spite of numerous subsequent elaborations by the originator) and has been one of the most commonly used models for evaluating the impact of training.

A total of 8 teacher development projects are reviewed in this section. At the outset it should be acknowledged that a number of these projects were designed and approved prior to the adoption of the Foundation’s 2005-2015 strategy. However, it will become clear that many were guided by the principles which were subsequently incorporated into the strategy – particularly the need to combine training sessions with on-site support, mentoring and coaching.

Models of teacher support and development implemented by the Zenex Foundation

The majority of teacher development programmes implemented over the last five years have sought to provide teachers with the opportunity to acquire a three-year professional qualification, in line with government requirements that teachers should hold either a three-year Bachelor of Education degree or an Advanced Diploma in Education (equivalent to, and replacing, the Post-Graduate Certificate in Education and the Higher Diploma in Education).

i. Partially accredited training with classroom-based support

Only one of the projects reviewed adopted this model. In this project, several Foundation Phase teachers from each participating school were offered the opportunity to obtain partial credit towards an Advanced Certificate in Education (ACE) accredited by UNISA. In addition to this workshop-based training, NGO service providers also provided teachers with in-school support and coaching.

ii. Accredited training leading to a full qualification with additional training and classroom-based support

These projects offered under- or unqualified teachers the opportunity to obtain a professional qualification in education, accredited by a Higher Education Institution. Formal academic training was often coupled with workshop-based training offered by the same NGO service providers responsible for the delivery of the accredited programme. These sessions were intended to give teachers additional opportunities to discuss coursework, to examine ways in which the theory of the academic programmes could be applied in their schools, and to raise issues which were not addressed in the formal programmes. Most NGOs also offered teachers some level of classroom-based support, during which they could provide feedback on the application of new techniques and strategies drawn from the coursework.

In most projects that adopted this model, only one teacher per school was selected to participate in the programme - with the intention that teachers from nearby schools would group together to form “communities of practice” through which they could support each other and also share information about the introduction of new teaching methods and strategies obtained through the project with other teachers in their schools. The schools (and teachers) selected to participate in the project were usually concentrated in a particular area or district, making it easier for teachers to interact outside of formal programme activities.

iii. Subject-specific training offered to groups of mathematics and science teachers from the same school, with an additional focus on curriculum management

The Zenex Foundation’s School Development Project[2] offers subject-specific training to groups of teachers of literacy/English, mathematics and science from selected primary and secondary schools. In this project teacher training is not accredited and is offered by a range of service providers with specialist expertise in a particular subject area. This approach differs from the two other models described in this section in that the project aims to support all mathematics, science and language teachers working in a particular phase in a school, rather than focussing on a single teacher. The project also takes into account the context in which the teachers are working and provides additional support to school managers in order to enable them to enhance curriculum management practices in their schools.

[2] The School Development Project is discussed both in relation to teacher training and development and the Foundation’s objective of developing schools of excellence.

Overview of the alignment between projects and the Foundation’s 2005-2015 strategy

Table 4 below summarises the degree of alignment between the design and implementation of the projects reviewed and the articulation of the Foundation’s strategy.[3]

Table 4: Strategic alignment of teacher development projects

Project name                   Partnership    Systemic    Builds stakeholder    Builds capacity with    Raises Grade 12
                               with DoE       focus       capacity              multiple strategies     passes
MIET NW                        3.5            1           1                     2                       0
Mindset                        2              1           2                     2                       1
ELET Limpopo                   1              1           1                     2                       1
Maths Centre Limpopo           1              1           1                     2                       1
Maths Centre & READ KZN, EC    1              1           1                     2                       0
RUMEP                          1              1           1                     2                       2.5

Note: Projects appear roughly in chronological order.

The Foundation set itself a target of training 360 FET teachers and 700 Foundation Phase teachers between 2005 and 2010. Over the last five years, over 500 teachers have participated in training provided through Zenex Foundation projects. Table 5 summarises the number of teachers who received training through the projects which formed part of the meta-analysis.[4] The data show that a far higher number of teachers completed non-accredited training than accredited training.

Table 5: Number of teachers who completed training programmes supported by the Zenex Foundation

                              Accredited training           Non-accredited training
Project name                  Foundation Phase    FET       Foundation Phase    FET
MIET (NW)                     72                  –         –                   –
Maths Centre Limpopo          –                   16        –                   –
ELET Limpopo                  –                   14        –                   –
RUMEP                         –                   28        –                   –
Maths Centre KZN              40                  –         –                   –
Dinaledi                      –                   –         –                   243
READ mother-tongue literacy   –                   –         70                  –
PRMP                          –                   –         40                  –
TOTAL                         112                 58        110                 243

[3] Detailed discussions of the ratings given for each project are provided in an appendix to this report.
[4] The totals reflect only the numbers of teachers who completed the projects which were evaluated. They therefore do not reflect the full spectrum of the Foundation’s work, and these figures are not used to measure progress towards the achievement of strategic targets.
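As a quick arithmetic check, the column totals in Table 5 can be recomputed from the per-project figures. This is an editorial sketch, not part of the original report; the assignment of each project to a column follows the table as reconstructed above.

```python
# Cross-check of the Table 5 totals.
# Columns: (accredited FP, accredited FET, non-accredited FP, non-accredited FET)
trained = {
    "MIET (NW)":                   (72, 0, 0, 0),
    "Maths Centre Limpopo":        (0, 16, 0, 0),
    "ELET Limpopo":                (0, 14, 0, 0),
    "RUMEP":                       (0, 28, 0, 0),
    "Maths Centre KZN":            (40, 0, 0, 0),
    "Dinaledi":                    (0, 0, 0, 243),
    "READ mother-tongue literacy": (0, 0, 70, 0),
    "PRMP":                        (0, 0, 40, 0),
}

# Sum each column across all projects.
totals = [sum(col) for col in zip(*trained.values())]
print(totals)  # [112, 58, 110, 243] - matches the TOTAL row

accredited = totals[0] + totals[1]
non_accredited = totals[2] + totals[3]
print(accredited, non_accredited)  # 170 353 - far more non-accredited completions
```

The 170 accredited completions also match the graduate total reported in Table 6.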

Examination of the outcomes realised by teacher development

Using the model outlined at the start of this chapter for measuring the different ways in which training can affect personal and organisational performance, the evaluations show that the various interventions were successful in assisting teachers to acquire new knowledge, but enjoyed only a moderate level of success in influencing teachers to adopt new practices in their classrooms. Unfortunately, most evaluations failed to show that the interventions had any significant impact on learner performance levels.

Acquisition of new knowledge

One of the most successful features of the accredited training programmes that were offered was the high completion and pass rates obtained by participating teachers. Table 6 summarises the number of teachers who initially registered for a qualification and the number who obtained the qualification.

Table 6: Completion rates for accredited teacher training initiatives

Project                Started      Sat for        Graduated   % pass   Notes
                       programme    examination
MIET (NW)              75           75             72          96%
Maths Centre KZN/EC    60           58             40          66%      10 teachers in the Eastern Cape and 2 KZN teachers had 2 or more modules outstanding and could not graduate.
Maths Centre Limpopo   19           19             16          84%
ELET Limpopo           20           19             14          70%
RUMEP                  38           Not available  28          73%
TOTAL                  212          –              170         80%
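The pass rates in Table 6 follow directly from the started/graduated columns. A short editorial sketch recomputing them (the report appears to truncate rather than round some percentages, e.g. 66.7% is shown as 66%):

```python
# Recompute Table 6 pass rates as graduated / started.
data = {
    # project: (started, graduated)
    "MIET (NW)":            (75, 72),
    "Maths Centre KZN/EC":  (60, 40),
    "Maths Centre Limpopo": (19, 16),
    "ELET Limpopo":         (20, 14),
    "RUMEP":                (38, 28),
}

# Per-project rates, rounded to the nearest percent. Note the report's
# table truncates: it shows 66% (not 67%) for KZN/EC and 73% (not 74%) for RUMEP.
rates = {name: round(100 * g / s) for name, (s, g) in data.items()}
print(rates["MIET (NW)"])  # 96

started_total = sum(s for s, _ in data.values())
graduated_total = sum(g for _, g in data.values())
print(started_total, graduated_total)  # 212 170
print(round(100 * graduated_total / started_total))  # 80
```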

Three evaluations compared the completion rates and academic results of Zenex Foundation-supported teachers with those of others who had registered independently for the same programmes. The evaluations found that the additional support was successful in motivating teachers to submit assignments on time and to sit for the examinations; however, it did not appear to have had an effect on the scores obtained in the academic examinations.

Changes in teaching practice


Almost all evaluations noted some degree of change in teaching practice, which was attributed to participation in the project. The most commonly reported changes included:

- Improvements in curriculum planning, with more teachers drafting year-long teaching programmes that covered all the material prescribed in the National Curriculum Statements for the grade. There was also an increase in the number of teachers preparing individual lesson plans;

- An increased pace of lesson delivery, raising the likelihood of greater curriculum coverage during the scholastic year;

- More teachers teaching material set at an appropriate grade level;

- Higher levels of interaction and engagement with learners during lessons, as marked by changes in the frequency with which questions were asked of students and the use of tasks that encouraged discussion and interaction between students;

- More written work completed by students in both mathematics and language classes. This is relevant because completing written work gives students the opportunity to practise skills and thereby gain greater confidence in the application of mathematical algorithms; and

- Teachers making use of materials supplied through the various projects.

However, a number of teaching practices appeared to be more entrenched and less amenable to change, including:

Assessment practices. A number of evaluations found that there was very little relationship between the marks obtained by students on external assessments (such as the tests used by the evaluators) and the marks obtained in internal assessments. This, coupled with the fact that learners are routinely promoted after passing internal assessments while it is clear that they have not mastered the content prescribed for that grade, suggest that many teachers are continuing to set assessment tasks that are not sufficiently demanding in that they do not assess the curriculum standards for the grade, nor do they include a range of question types – particularly questions that aim to assess learners’ abilities to extract information from a given context or that require the application of content knowledge to real world situations.

Accommodating different ability levels within the same classroom. The results of national assessment surveys and student performance data from the various evaluations show that within each class some students perform at the desired grade level while many others may be one, two or three grade levels below this. The evaluations have shown that teachers do not vary lesson delivery or the tasks set according to students’ performance levels or needs. This means that the most able students are often bored because the lesson pace is too slow for them, while those who need the greatest support do not receive it. Some evaluators suggest that the slow pace of many lessons results from teachers seeking to teach at the pace most suitable for low-performing students.

Effective and appropriate use of materials. Teachers were often described as being able to manipulate the materials supplied by projects correctly; however, the way in which the materials were used in class suggested that not all teachers had a good understanding of how to use them to explain particular concepts. This point is probably best illustrated with the following example: in one lesson a teacher provided each group of students with 15 counters and asked them to use the counters to calculate 12 + 7 – an impossible task, since the sum (19) exceeds the number of counters supplied.

The findings of the different evaluations suggest that, although there is some evidence of changes in teacher practice occurring, the amount of change was generally not as great as would be expected given the duration and intensity of the projects. In some cases this was attributed to the participating teachers being “fairly competent” at the start of the project, which in turn raises questions about the criteria used to select participants. In another project, the greatest change in teachers’ behaviour was recorded in the middle of the intervention, suggesting that teachers had modified their practices in order to generate evidence for their academic assessments. Changes in teaching practice were therefore aimed at meeting course requirements and did not appear to be motivated by a genuine desire to implement new practices that would support student learning.

Impact on learner performance

Most evaluations of the teacher development interventions supported by the Zenex Foundation were not able to demonstrate a significant impact on learner achievement levels, particularly when learners’ scores in project schools are compared with the scores of learners in non-participating schools5. Table 7 provides a summary of the extent to which student achievement scores changed over the duration of the various projects.

Table 7: Changes in achievement scores

Project name                      Test description      Project schools   Comparator schools   Difference
                                                        (% points)        (% points)           (% points)
MIET – North West                 Numeracy G3           +8.2              +7.0                  1.2
                                  Numeracy G3           +10.0             +4.4                  5.6
Maths Centre/READ –               Literacy G3 (EC)       9.7              10.9                 -1.2
Eastern Cape and KwaZulu-Natal    Numeracy G3 (EC)      10.5              10.3                  0.2
                                  Literacy G3 (KZN)     -3.5              -7.4                  3.9
                                  Numeracy G3 (KZN)     -3.0             -12.7                  9.7
Maths Centre – Limpopo            Math G10               7.55             10.82                -3.27
ELET – Limpopo                    English G10            2.27             -0.63                 2.9
RUMEP – Eastern Cape              Math basic skills      4.2               0.5                  3.7
                                  Math G10 content      -1.1              -1.1                  0
Primary Maths Research Project    Grade 4 Math          12.34              2.95                 9.39
                                  Grade 6 Math          13.59              3.0                 10.59
Average

Note: Shaded cells indicate cases where the rate of change in the project schools was noticeably greater than in the comparator schools and exceeds the margin within which this difference could be attributed to sampling error6.

5 The term “comparator group” is used in place of “control group”. The evaluations did not take the form of randomised controlled trials or true experiments, and so the use of “control” has been avoided. The schools in the comparator groups were not selected through random allocation of schools to project or control groups, nor is there any definitive proof that the schools that did not participate in the project were equivalent to the project schools.
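The flagging rule described in the note can be applied mechanically. The sketch below (values transcribed from Table 7, with the 2-percentage-point threshold taken from footnote 6; project labels abbreviated for brevity) computes each project/comparator difference and marks those that exceed the threshold. It is an illustration of the rule only; the individual evaluations applied their own significance judgments.

```python
# Values transcribed from Table 7: (label, change in project schools,
# change in comparator schools), all in percentage points.
results = [
    ("MIET NW, Numeracy G3 (a)", 8.2, 7.0),
    ("MIET NW, Numeracy G3 (b)", 10.0, 4.4),
    ("Maths Centre/READ EC, Literacy G3", 9.7, 10.9),
    ("Maths Centre/READ EC, Numeracy G3", 10.5, 10.3),
    ("Maths Centre/READ KZN, Literacy G3", -3.5, -7.4),
    ("Maths Centre/READ KZN, Numeracy G3", -3.0, -12.7),
    ("Maths Centre Limpopo, Math G10", 7.55, 10.82),
    ("ELET Limpopo, English G10", 2.27, -0.63),
    ("RUMEP EC, Math basic skills", 4.2, 0.5),
    ("RUMEP EC, Math G10 content", -1.1, -1.1),
    ("PRMP, Grade 4 Math", 12.34, 2.95),
    ("PRMP, Grade 6 Math", 13.59, 3.0),
]

# Footnote 6: a 2 percentage point difference is used as a proxy for
# significance in the absence of raw data.
THRESHOLD = 2.0

for label, project, comparator in results:
    diff = round(project - comparator, 2)
    flag = "exceeds threshold" if diff >= THRESHOLD else "within margin"
    print(f"{label}: {diff:+.2f} pp ({flag})")
```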

In a number of the projects there is a noticeable improvement in students’ scores between the baseline and final assessments. In many of these schools, students were performing extremely poorly at the start of the project, which means that, in spite of the measured improvements, many students were still unable to meet the levels required for grade-level mastery. Unfortunately, when these gains are compared with the performance of students in schools (and classes) that did not benefit from being taught by a teacher who had taken part in the project, the comparison students often showed similar, and sometimes greater, gains in achievement scores. In only 6 of the 12 cases listed above was there a positive, statistically significant difference between the two groups.

These results raise an obvious question: what might explain the failure of the projects to demonstrate an impact on student achievement? While it is not possible to provide a definitive answer, the data presented in the evaluation reports point to a number of possible explanations.

Discussion of possible factors that could explain extent of project impact

Using the framework outlined in Section 1 of the report, a number of possible reasons are explored that could explain the degree to which the projects achieved their stated objectives. As noted earlier, the projects showed a high level of success in ensuring the delivery of training to teachers and in ensuring that teachers acquired professional teaching qualifications, however they appeared to be less successful in creating lasting changes in teaching practice and in raising levels of learner achievement.

Conceptualisation of the project and analysis of the development problem

The Zenex Foundation strategy document indicates that the Foundation has taken note of research on the factors most commonly thought to undermine educational quality in South Africa: the shortage of suitably qualified teachers, inadequate levels of subject content knowledge amongst teachers, and insufficient time devoted to teaching coupled with the slow pace of lessons, which in turn leads to poor curriculum coverage. Other factors, including forms of assessment and forms of teacher-student interaction that do not foster conceptual development, are also identified as possible causes of poor performance.

The project descriptions contained in the evaluations suggest that the training components of all projects sought both to enhance teachers’ knowledge of the subject matter that they teach and to expose teachers to a range of teaching methods, or strategies for introducing subject content, that were intended to raise levels of student attainment.

6 In the absence of raw data, a 2 percentage point increase in scores has been used as a measure of significance. The same measure was used in the 2006 meta-evaluation.

The Foundation has favoured the provision of accredited training on the assumption that the acquisition of a professional qualification would provide teachers with an incentive to attend training sessions and to demonstrate a degree of mastery of the training content. The evaluation findings provide somewhat mixed results on the extent to which accreditation serves as an incentive: completion rates for accredited programmes are high, suggesting that at a personal level there is an incentive to complete the training. The fact that several teachers applied for (and received) promotions on completion of the programme supports the idea that many participants associated completion with personal benefits such as career progression and increased earning potential.

However, there appears to be an absence of institutional incentives to apply the training in a way that fundamentally changes or enhances teaching practice.

The evaluation findings do not present a strong case for the delivery of accredited training over non-accredited training – particularly taking into consideration that several of the programmes which demonstrated student impact were not accredited. Accredited programmes were in general far longer (usually about 3 years) than non-accredited programmes and were associated with a high unit cost (using 2009 figures, approximately R75 000 per participant to complete a B.Ed degree)7. It is therefore recommended that the Foundation explore training options that are likely to provide a greater return on investment.

Identification and selection of the beneficiary population

The selection of beneficiaries is discussed in more detail in the final section of the report; however, it is worth noting that in a number of teacher development programmes, participating schools and teachers were selected by the Department of Education. This meant that participation was not voluntary, and it was reported that in some cases the worst-performing schools in the district were selected as intervention sites – often contradicting the selection criteria laid down by the Foundation.

Quality of the training programmes offered

In general, evaluations found that training programmes were of a high quality. Programmes accredited by Higher Education Institutions were based on the qualification standards set by professional and institutional authorities and were therefore subject to a degree of internal quality control; service providers also had to maintain a particular standard of delivery in order to retain the accreditation offered by higher education institutions. As only one of the evaluations (of the School Development Programme) included a review of the quality of the training offered, it is difficult to provide a definitive assessment of the quality of workshop-based training, particularly the training carried out as an adjunct to the accredited programmes. It is also not possible to comment on the content covered by this training, as the issue is not addressed in the evaluation reports.

7 Data obtained from Zenex Foundation report to the Executive Committee. Unfortunately no comparable data is available for non-accredited programmes.

Degree of programme exposure by teachers


High completion and graduation rates from the accredited programmes indicate that most teachers completed the courses as planned. However, very few programmes report on teacher attendance at supplementary or adjunct workshops, so it is not possible to determine the extent to which these workshops successfully deepened subject content knowledge or supported site-based application of theoretical concepts. Similarly, very few of the evaluation reports examine the nature of classroom-based support provided to teachers.

Data on project dosage in the School Development Programme shows that service providers often did not manage to meet targets set for workshop delivery or school-based support. If this is a common problem, other projects may also have failed to meet delivery targets – even when programme implementation was not affected by strike action.

Internal reports by the Foundation to the Board of Trustees noted that the B.Ed degree only covers about 40% of the content contained in the FET curriculum. There is a growing body of evidence that indicates that many teachers have an inadequate understanding of the content that they are teaching. As the B.Ed does not (and probably cannot) devote sufficient time to developing FET-level subject content knowledge, what is the most effective way of deepening teachers’ content knowledge?

Alignment between programme design and implementation

Internal monitoring by the Foundation revealed that school-based support did not take the form of mentoring and was often limited to classroom observation or monitoring of teachers.

Opportunity for the programme effects to manifest

The 2006 meta-evaluation noted that it was difficult to determine how much time was needed for the effects of teacher training, particularly on learner performance, to manifest. That said, it should be noted that the PRMP demonstrated that statistically significant gains in student achievement in Grades 4 and 6 could be obtained through a 14-week programme.

One of the factors which could have influenced the degree to which projects resulted in measurable impact was the amount of time over which learners were exposed to teaching by project teachers. A number of evaluations noted that over the course of the project, a fairly large proportion of participating teachers were no longer working as classroom-based teachers, or were no longer teaching FET students, or were teaching other subjects or had moved to teach at another school. It was also reported that a number of project teachers had left the teaching profession or retired. This meant that it was very uncommon for students to remain with the same teacher for more than one year, particularly in the FET phase. Internal management practices governing teacher allocation, coupled with a moderately mobile teacher population, significantly limit the potential impact of these projects.

Lessons learnt: project design and implementation

Cumulative learning deficits exhibited by learners should be taken into account when designing programmes for FET teachers.


Based on the gains in learner performance, it would appear that Foundation phase projects were more successful in changing achievement levels than projects directed at FET-level teachers and students. This could be a function of the cumulative deficits in understanding and in foundational competencies that students have accumulated by the time they reach the FET phase. Where gains were recorded amongst FET students, these tended to be in basic skills rather than in the content or skills contained in the Grade 10 curriculum – lending further support to the need to shore up basic skills before expecting to see learner impact in the FET phase.

Seek to retain teachers in the phase and subject areas in which they have been trained

There is a greater likelihood of projects having a positive impact on achievement levels if teachers are allocated to teach the subjects and grade levels in which they have received specialist training. Under ideal circumstances, it should be possible to negotiate this with school managers and district officials by convincing them of the potential benefits. However, staff turnover and the fact that the Foundation has no way of enforcing compliance make this difficult. The pressure to ensure that teachers continue to apply the knowledge acquired through projects in classrooms also presents the Foundation with the challenge of managing the tension between teachers’ desire for promotion and their continuing as classroom-based teachers.

Projects that combined teacher training with a structured instructional programme demonstrated greater impact.

In spite of the relatively short period of intervention of both the PRMP and READ mother-tongue literacy programme, both programmes demonstrated significant impact on both teaching practice and student achievement. Both programmes are based on a very structured instructional programme with well-developed materials.

Providing training to one teacher per school appears less effective than training several teachers per school

Training individual teachers in schools appears to be less effective than when several teachers from the same school receive training. By training several teachers, the likelihood of students in a particular phase being taught by one of the project teachers increases. Collegial support may also encourage the introduction of more lasting changes in teaching practice.

Changes in teaching practice occur when linked to assessment requirements in accredited courses, but these are not always sustained.

Assessments of changes in teaching practice should take into account the effects of compliance with course requirements (which often require that teachers produce a portfolio of evidence of planning, assessment or classroom activities) and be timed so as to measure whether these changes are sustained after the period of assessment.


Supporting educational improvement through providing direct assistance to learners

Strategic Objective: Increase the number of learners with quality passes in mathematics and science at Bachelor’s level

Introduction

One of the primary objectives of the Zenex Foundation is to raise the levels of academic achievement in mathematics and science of South African students, particularly students from disadvantaged backgrounds, so that they are able to embark on further studies in mathematics-, science- and engineering-related fields and ultimately pursue careers in these fields. This objective is in turn informed by an analysis of the skills needed to stimulate economic growth and competitiveness, and by an acknowledgement of the current mismatch between the skills required for growth and the output of the schooling system: less than 10% of South African schools produce more than 50% of the total number of mathematics and science passes8, and fewer than 400 schools produce more than half of the mathematics passes above 50%.

Prior to 2005, the Zenex Foundation supported a number of programmes that provided direct support to students thought to have the potential to succeed in FET-level mathematics and science; at the same time, the Foundation also supported several independent schools that had been established to provide disadvantaged students with intensive support in order to raise achievement in selected subjects. The current student support programme has its roots in these initiatives, but has also sought to develop them further so that more comprehensive support and development opportunities are offered to students and the capacity of effective public schools is strengthened to better support disadvantaged students in obtaining high-quality passes in mathematics and science.

A total of four projects were reviewed in relation to the learner support and development programme; however, several of these are relatively new programmes and have not yet yielded a great deal of impact data, with Grade 12 graduation rates available for only two of the four projects. For this reason, this chapter focuses on describing the various models of implementation and examining emerging lessons related to programme implementation.

8 Simkins calculates that 6.6% of public schools produce 50% of the total number of passes in mathematics and science (p3). Simkins, C. (2010) Maths and science performance of South Africa’s public schools: some lessons from the past decade. Centre for Development and Enterprise.


Models of student assistance implemented by the Zenex Foundation

The following four programmes were reviewed:

LEAP school (Langa Education Assistance Programme)
ISASA Mathematics and English Project (Independent Schools Association of Southern Africa)
Inkhanyezi Project
Reunert College

Various models have been used to raise the Grade 12 achievement scores of learners, including:

Institutional support. Institutions are created with a dedicated focus on providing intensive instruction in mathematics, science (and language) in order to help students overcome learning backlogs, succeed in FET-level mathematics and science, and thereby enter higher education programmes in these fields. In these schools the student population is usually fairly homogenous: all students are selected on the basis of their potential to perform well in mathematics and science, and share a similar socio-economic profile.

Bursary programmes. Students are selected and placed in an environment that is likely to provide better opportunities to learn, with additional/compensatory academic and psycho-social support provided.

Projects have adopted one of two sub-models:

o Bursary programme coupled with institutional support: Support to an institution to enable the acceptance of selected students through the provision of bursaries and other forms of academic/ cultural/ social support facilitated by a grant to the school.

o Bursary programme: Students are placed at high performing or specialist schools.

“Second chance” programmes. Students who have sat the Grade 12 examinations, but who have not obtained grades that allow them access to higher education programmes in mathematics, science and engineering fields, are given an opportunity to take the examinations a second time after receiving intensive tuition coupled with workplace preparation.

Each of these models is discussed in more depth below.


Direct institutional support (LEAP)

LEAP is an independent school located in Cape Town9 that is modelled on the charter schools of the United States. The school was established to offer students from impoverished backgrounds the opportunity to complete Grades 10 to 12 in an environment that is nurturing and also provides intensive tuition in mathematics, science and English. Students with the potential to succeed in Mathematics and Science in the FET phase, and subsequently to enrol in Higher Education programmes, are selected for admission on the basis of their performance in selection tests. The school initially accepted students into Grade 10, but later introduced a Grade 9 class that serves as a bridging programme into the intensive FET programme.

Although the school provides the standard FET curriculum for Grades 10 to 12, the school day is extended and students receive tuition or have the opportunity to complete homework in a supervised environment until 17h00 – almost doubling the amount of contact time.

In addition to all students having to take Mathematics and Physical Science as Grade 12 subjects, all students are expected to take English as a First Language. This is done in order to raise levels of English proficiency and to expose students to English registers that may not form part of the Additional Language syllabus. The LEAP model also places a high premium on Life Orientation: all students participate in a 40-minute Life Orientation session each morning, and students have on-site access to qualified health professionals and counsellors. During holidays, students can also access extra tuition through participation in winter schools or maths/science camps.

The Foundation began its association with LEAP before 2005, when the Foundation’s learner support and development programme provided direct support to a number of pilot schools established to provide high-quality education to students from disadvantaged backgrounds. Over the period under review, the support to these schools has changed, with the Foundation now supporting a number of students who have been selected for participation in the ISASA project and placed at the school. The evaluation of the LEAP project examines the overall effectiveness of the school and includes a tracer study of former graduates; it does not specifically comment on the performance of students placed at the school through the ISASA programme (described below).

Bursary-based10 programmes (ISASA Mathematics and English programme and the Inkhanyezi project)

Both programmes seek to provide disadvantaged students who display a level of aptitude for mathematics and science with the opportunity to attend effective schools with an established tradition of high-quality mathematics, science and language instruction. The core principle underlying both programmes is that students are more likely to obtain passes in mathematics and science that provide access to higher education programmes if they attend high-functioning schools.

9 The original LEAP school was established to assist students drawn from the Langa area in Cape Town. LEAP has subsequently established a number of satellite schools in both the Western Cape and Gauteng. However, the evaluation report was only concerned with the operation of the original school.

10 The term bursary is used as it is commonly understood and as it is used within the Foundation’s literature.

The main features of each programme are summarised below.

Table 8: Features of bursary programmes supported by the Zenex Foundation

Student selection criteria
ISASA project: Students write a selection examination which assesses mathematical and science knowledge.
Inkhanyezi Project: Students must demonstrate academic ability and must meet socio-economic criteria.

Grade levels
ISASA project: Grades 10 to 12.
Inkhanyezi Project: Two intake points: Grade 8 and Grade 10.

Period of support
ISASA project: Three years. Support may be withdrawn for poor performance.
Inkhanyezi Project: Five years for the Grade 8 intake and three years for the Grade 10 intake. Support may be withdrawn for poor performance.

Nature of schools
ISASA project: Independent schools (9 schools).
Inkhanyezi Project: Public schools (11 schools).

Nature of student support (financial)
ISASA project: Tuition fees; textbooks; boarding costs (in selected cases).
Inkhanyezi Project: Tuition fees (for Grade 8 students).

Student support (academic and psycho-social)
ISASA project: Additional tuition; mentoring to assist with adjustment to the new school environment.
Inkhanyezi Project: Extra lessons/additional tuition; mentoring, both academic and psychosocial.

Capacity development of participating schools
ISASA project: None.
Inkhanyezi Project: Schools are provided with an annual grant which is used to provide student support and cover the fees of the selected students. Schools can use the balance of the grant to employ additional staff to teach mathematics or science, support professional development or improve the school’s educational infrastructure. Each school must submit a development plan and budget that outlines how the grant will be utilized.

In both projects, once students are admitted to one of the participating schools, they receive additional academic support, usually in the form of extra lessons in the afternoons, as well as personalised mentoring to help them settle into their new school and cope with the academic demands placed on them.

The number of students participating in each programme is summarised below.


Table 9: Number of participating students in each programme11

           ISASA                          Inkhanyezi
Year    Gr 10   Gr 11   Gr 12       Gr 8    Gr 9    Gr 10   Gr 11
2007     97
2008     63      63
2009     69      68      68          110             110
2010     70      69      70                  110             110

Note: Highlighted cells track the progress of a cohort of students through the programme.
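As a worked example of reading the cohort data, and assuming (per footnote 12) that the tracked cohort is the first ISASA intake – 97 students entering Grade 10 in 2007, of whom 68 reached Grade 12 in 2009 – the cohort retention rate can be computed as follows. The cell placements are a reconstruction from the extracted table, so the figures should be treated as illustrative.

```python
# Assumed cohort figures from Table 9: first ISASA intake.
grade10_2007 = 97   # students entering Grade 10 in 2007
grade12_2009 = 68   # same cohort reaching Grade 12 in 2009

retention = grade12_2009 / grade10_2007 * 100
print(f"Cohort retention: {retention:.1f}%")  # roughly 70%
```

A Grade 12 class of 68 in 2009 is also consistent with the 67 Grade 12 passes reported for the 2009 ISASA cohort later in this chapter.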

“Second chance” institutions

Reunert College is a private institution which provides young people with the opportunity to improve their Grade 12 scores in mathematics and physical science while also equipping them with skills that prepare them for workplace entry. Students must meet attendance requirements in order to maintain their enrolment; once they have passed their first assignments and met the attendance requirements, they receive a small stipend as an incentive to continue with the programme.

In the discussion that follows, this model is not examined in any further detail, as the Zenex Foundation has only recently started to support students attending the institution and the initial evaluation report does not provide any outcome data.

Overview of the alignment between projects and the Foundation’s 2005-2015 strategy

Table 10: Indicators of strategic alignment

Project name             Evidence of         Systemic    Builds          Builds capacity   Raises
                         partnership with    focus of    stakeholder     with multiple     G12
                         Dept. of Education  project     capacity        strategies        passes
LEAP                     1                   1           1               0                 3
ISASA Maths & English    1                   1           1               0                 1
Inkhanyezi               2.5                 2           2               2                 -
Reunert College          1                   1           1               0                 -

11 Numbers of students participating in the ISASA project were provided by the Zenex Foundation. Numbers of students participating in the Inkhanyezi project obtained from the evaluation report.


Examination of the impact of the various programmes

Grade 12 achievement rates

As the primary objective of the programme is to increase the number of students who obtain a Grade 12 pass in mathematics and science that provides access to higher education programmes, Grade 12 results have been used as the main impact indicator. Data was derived from the evaluation report on LEAP and from information held by the Foundation on Grade 12 pass rates (and the quality of passes obtained) for students participating in the ISASA project. No Grade 12 data is available for the Inkhanyezi programme, as it is a relatively recent initiative and has not yet produced its first cohort of Grade 12 candidates.

The final evaluation of the LEAP programme presents Senior Certificate data from 2007. In spite of the increased tuition time and the school’s focus on mathematics and science excellence, only 12 of the 33 candidates (36%) in the 2007 Senior Certificate examinations obtained a Higher Grade pass in Mathematics. While this is generally higher than in many of the schools in the communities from which students are drawn, the quality of the passes was not particularly good, with these students obtaining a mean score of 37.8%. Only 46% of candidates obtained a Higher Grade pass in Science in the same year, and only 16% of candidates obtained a score of over 50%.

The results from the ISASA project are a little more encouraging, with 80% of the 2010 candidates scoring 60% or more in the Grade 12 mathematics and science examinations.

Table 11: Pass rate and quality of passes obtained by students in the ISASA programme12

Year    Number of       Number of students obtaining more     Percentage of students who obtained
        Gr 12 passes    than 60% for mathematics and science  a “quality” pass in mathematics and science
2009    67              34                                    50.7%
2010    61              49                                    80.3%
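The quality-pass percentages follow directly from the reported counts. A quick arithmetic check, using the figures from the table above:

```python
# (Gr 12 passes, quality passes) per year, as reported for the ISASA
# programme; a "quality" pass is more than 60% in mathematics and science.
cohorts = {2009: (67, 34), 2010: (61, 49)}

for year, (passes, quality) in sorted(cohorts.items()):
    print(f"{year}: {quality / passes:.1%} quality passes")
```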

Supporting entry into higher education programmes

A key objective of the LEAP project is to increase the number of historically disadvantaged students gaining access to Higher Education programmes in the mathematics, science and engineering fields. The evaluators administered a set of placement tests developed by the Alternative Admissions Research Programme of the University of Cape Town which seek to determine which students could perform adequately if admitted to a mainstream academic programme and which students would benefit from admission to a bridging programme, extending their course time by one year. Using these tests the evaluators found that only 15% of Grade 12 students (5 students) met the criteria for admission into a mainstream programme, while an additional 18% were eligible for admission to a bridging programme. The assessment found that the remaining 67% of students were not performing at a level that would offer them access to a Higher Education programme.
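The reported shares imply a Grade 12 cohort of roughly 33 students (5 students ÷ 0.15 ≈ 33). A quick consistency check, with the bridging and non-placed counts back-calculated from the reported percentages (only the mainstream count of 5 appears in the evaluation; the other two counts are this sketch’s inference):

```python
# Back-check the LEAP placement percentages. Only the mainstream count (5)
# is given in the evaluation; the other counts are inferred for illustration.

def band_shares(counts: dict) -> dict:
    """Express placement-band counts as percentages of the cohort."""
    total = sum(counts.values())
    return {band: round(100 * n / total, 1) for band, n in counts.items()}

counts = {"mainstream": 5, "bridging": 6, "not placed": 22}  # cohort of 33
shares = band_shares(counts)
# shares == {"mainstream": 15.2, "bridging": 18.2, "not placed": 66.7},
# consistent with the reported 15% / 18% / 67%
```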

12 Achievement scores are only provided for the cohort of students who had attended the ISASA schools for Grades 10 to 12. The first Grade 10 intake of students to the programme occurred in 2007 and these students sat for the Grade 12 examination in 2009.


Discussion of possible factors that could explain extent of project impact

Beneficiary identification

In spite of students needing to meet selection criteria for entry into the various projects, it is clear that each year there were a number of students who found themselves unable to cope with the academic demands placed on them and either withdrew from the project or had financial support withdrawn after continued under-performance. This draws attention to the challenges of accurately predicting future performance and testing for potential (discussed below). There would appear to be an implicit assumption that students who have managed to succeed, despite the odds, in under-resourced and less functional schools will raise their performance levels, given adequate teaching and learning opportunities. This unfortunately does not always hold true, as a range of other factors may affect future performance, including students’ ability to cope with increased academic demands, the social and emotional pressures of adapting to a new school, and the additional stress that arises when students feel that they must maintain a certain level of performance.

Beneficiary receptiveness to the programme

All of the students interviewed as part of the evaluations indicated that they felt a sense of honour in having been chosen to receive additional support and expressed appreciation for the opportunities which this afforded them. However, in this case beneficiary receptiveness refers to the effects of entering high-performing schools when the conceptual foundations in mathematics and science may not have been sufficiently secure to enable students to benefit fully from the high-quality teaching to which they are exposed. Tests of critical reasoning skills showed that the Zenex Foundation-supported students did not perform as well as their peers. Annual results obtained by students in the LEAP schools showed a general decline in mean scores as students progressed from Grade 9¹³ to Grade 11. This could be attributed to students struggling to master increasingly complex concepts without having sufficiently mastered foundational ones.

Neither the ISASA nor the Inkhanyezi report provides a great deal of detail on the number of hours of additional instruction that students receive or the focus of these lessons. Where information is available, it would appear that these lessons often focus on providing additional opportunities to cover grade-specific work without necessarily addressing gaps in foundational knowledge. Although some of the winter school activities and maths camps which are offered may include foundational skills, these often take place mid-year, after half of the academic year has passed.

Intervening variables affect achievement of outcomes

13 Although the LEAP model was designed to provide intensive instruction for students from Grades 10 to 12, the school later introduced a Grade 9 class which served as a foundation year for the FET programme.


A number of factors, other than the quality of teaching and learning, can influence students’ academic performance, including their ability to adapt to their new schools. Students make mention of increased workloads and greater expectations of class participation in the schools in which they have been placed. Most of the students selected to participate in the programme were amongst the highest performers in their former schools and several mention having to come to terms with the fact that they are now “average” or “below average” performers in their new schools.

Lessons learnt: project design and implementation

Challenges associated with “testing for potential”
Several evaluation reports discuss the challenges associated with assessing students’ potential on entry to the programme and the fact that these tests do not always provide a sufficiently strong indication of which students will succeed in FET-level mathematics and science and perform well enough to obtain a Bachelor’s level pass.

The evaluations distinguish between tests which assess the extent to which students have mastered both Grade 9 (or Grade 7) content and the foundational skills required to successfully understand the material presented in Grade 10 (or Grade 8) and tests which assess scientific and mathematical aptitude. The evaluation of the ISASA project has recently included an assessment of scientific and mathematical reasoning, which it argues should form part of the student selection process. Other writers note that tests designed to measure potential should assess more than academic proficiency and reasoning and also include psychometric elements which can be used to assess commitment to learning, levels of application and interest in pursuing a career in a scientific (or computational) field.

Students’ reading age is a strong predictor of academic success
The evaluation of the LEAP programme found that students who entered the project with a reading age below 13 years struggled to meet the academic demands placed on them.

Set clear expectations for service delivery from schools
Where much of the success of the project relies on inputs provided by the individual schools, it is important that all participating schools understand what is expected from them in terms of providing mentorship to students supported by the Foundation. Similarly, the Foundation’s requirements for academic support to students should be clearly outlined in the initial Memoranda of Understanding entered into with participating schools.

Development logic is not followed through to its logical conclusion
Obtaining a good quality pass in mathematics and/or science is not an end in itself – the development rationale for supporting the programme is that this will lead to students following career paths that will provide them with a path out of poverty and contribute to the pool of skilled professionals in South Africa. Only the LEAP evaluation includes a small tracer study of recent graduates.


Supporting educational improvement through the development of innovative materials

Strategic objective: Influence education policy and continuous improvement of classroom practice

Introduction

One of the ways in which the Zenex Foundation has sought to support improvements in the quality of education is by investing in the development of innovative teaching and learning support materials. More than half of the projects reviewed in this section were pilot programmes and the findings from the various studies have been used to refine and further develop the materials.

Due to the diversity and the innovative nature of the materials which were developed, a short description is provided of each project before considering some of the emerging lessons that can be learnt from the various projects. Where possible, data is provided on the impact that the materials had on student learning. There is some overlap between the data provided here and in the section which deals with teacher development, as many of these projects had a dual focus on materials development and training teachers in the utilisation of the materials.

Description of the projects

In total five projects were reviewed that centred on the development of teaching and learning support materials that sought to encourage innovative classroom practices and raise levels of student achievement.

Table 11: Materials development projects

Project name                                | Subject area(s)                          | Phase or grade | Materials
READ mother-tongue literacy                 | Literacy                                 | Foundation     | Print-based graded student readers and teacher’s books
Primary Mathematics Research Project (PMRP) | Mathematics                              | Intermediate   | Print-based student worksheets
Mindset                                     | Science and Mathematics                  | FET            | Broadcast and ICT-based materials
Concept Literacy                            | Science, Mathematics, Biology, Geography | FET            | Print-based book providing African language explanations of common concepts
Dinaledi (??)                               |                                          |                |


READ mother-tongue literacy
The READ mother-tongue literacy project was designed to support the acquisition of initial literacy skills in African languages (specifically Xhosa and Zulu).

One of the key factors in developing reading proficiency amongst young learners is the extent to which they have the opportunity to read a wide variety of age-appropriate texts, as it is through frequent practice of reading and decoding skills that young children build their ability to recognise words and to make sense of unfamiliar words by sounding them out and guessing at their meaning from the context in which they appear. However, in many South African schools there are very few books in African languages which have been written for young children in the process of acquiring initial literacy skills. It is important to support the acquisition of mother-tongue literacy, as reading levels in the mother tongue (or home language) influence students’ ability to learn to read fluently in a second language. If the basic skills of learning to read are not mastered, students are likely to face greater difficulties in becoming fluent readers in a second language.

The READ mother-tongue literacy programme sought to develop a range of books which could be used to teach initial literacy, building on READ’s established methodology for promoting reading skills while also taking into account the linguistic (phonological, orthographical and syntactic) features of African languages. A total of 20 graded readers were prepared, along with a set of “big books” which the teacher could use when reading to the students and teaching reading skills; these were supplemented with learner workbooks and a teacher’s guide. Once the materials had been developed, a group of Grade 1 teachers were trained to use them.

The evaluation included an appraisal of the materials, which concluded that they were of high quality, were based on sound principles for teaching reading, and took into account features of African languages. The evaluators found that teachers had made use of the materials in their classrooms in accordance with the training provided and the guidance contained in the teacher guides. The use of the materials was also felt to have increased the amount of written work completed by learners and encouraged teachers to set homework.

Assessments of learner performance showed that students in project schools out-performed their peers in schools which had not participated in the project by 5 percentage points¹⁴. The following table illustrates the differences in achievement scores on a range of reading skills:

Table 12: Achievement scores by skill (mean scores, %)

Skill                                          | Project schools | Comparator group 1 (IEP project only) | Comparator group 2 (no external support)
Matching of similar shapes                     | 88.6            | 83.8                                  | 79.25
Matching letter shapes                         | 77.0            | 68.7                                  | 69.3
Matching pictures and words                    | 82.8            | 69.2                                  | 73.3
Matching word to body part                     | 62.15           | 50.67                                 | 65.5
Reading sentences for meaning                  | 83.21           | 78.83                                 | 77.7
Writing in response to visual and verbal clues | 28.9            | 12.42                                 | 20.4
All skills                                     | 67.15           | 56.9                                  | 62.9

14 The participating schools were also part of a large-scale district improvement programme called the IEP. As this could have an effect on the evaluation of outcomes, the results of schools using the READ materials were compared with other schools that were part of the IEP and schools that did not receive any external support.
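From the per-skill means in Table 12 one can compute the project schools’ advantage (or deficit) over each comparator group; a sketch using the no-external-support group (variable names are illustrative):

```python
# Per-skill difference between project schools and comparator group 2
# (no external support), in percentage points, from Table 12.
skills = {
    # skill: (project, comparator 1 (IEP only), comparator 2 (no support))
    "Matching of similar shapes":    (88.6, 83.8, 79.25),
    "Matching letter shapes":        (77.0, 68.7, 69.3),
    "Matching pictures and words":   (82.8, 69.2, 73.3),
    "Matching word to body part":    (62.15, 50.67, 65.5),
    "Reading sentences for meaning": (83.21, 78.83, 77.7),
    "Writing in response to clues":  (28.9, 12.42, 20.4),
}
advantage = {
    skill: round(project - comp2, 2)
    for skill, (project, comp1, comp2) in skills.items()
}
# Note the one negative entry: project schools trailed the no-support group
# on "Matching word to body part" (62.15 vs 65.5).
```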

Primary Mathematics Research Project
The first phase of the Primary Mathematics Research Project investigated the nature of the outcomes of mathematics education in primary schools and consisted of an analysis of 7028 learner test scripts which had been collected as part of a number of project evaluations that had taken place between 1998 and 2004. This study concluded that the fundamental cause of poor learner performance in mathematics could be attributed to a failure of learners to learn to calculate using established mathematical algorithms that require an understanding of the base 10 number system. Schollar’s research showed that students – up to and including those in the Intermediate Phase – relied on counting in units to solve addition and subtraction problems and on repeated addition or subtraction to solve multiplication and division problems. Without a sound understanding of the number system and of basic arithmetic algorithms, students fail to build a solid foundation for the acquisition of more advanced concepts, entrenching the knowledge deficits which have been shown to have a profound influence on mathematics performance in the senior grades.
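Schollar’s point about algorithmic calculation can be illustrated with a toy step-count model: counting in ones costs one step per unit of the answer, while the written base-10 algorithm costs roughly one single-digit operation per digit pair. The model below is this sketch’s own simplification, not anything taken from the report.

```python
# Toy cost model contrasting unit counting with the column (base-10) algorithm.

def unit_count_steps(a: int, b: int) -> int:
    """Counting in ones: one tally per unit of the product."""
    return a * b

def column_multiply_steps(a: int, b: int) -> int:
    """Rough cost of the written algorithm: one single-digit multiplication
    per digit pair, plus roughly one column addition per digit."""
    da, db = len(str(a)), len(str(b))
    return da * db + (da + db)

# 6 x 348: 2088 tallies by unit counting, but only ~7 algorithmic steps.
```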

Based on the findings of the research study, a set of materials was developed for classroom use which aimed to provide a good grounding in understanding numbers, operations and number relationships. Taking into account the fact that within each class there are students operating one, two and even three grade levels below what is expected, the materials were developed so that they could be used with students at different performance levels, assisting teachers to deliver a more differentiated teaching programme. The materials provided a structured framework for teachers to deliver a 14-week programme that combined teacher instruction with daily exercises for students, providing opportunities for practice and, where appropriate, memorisation. Attention was paid to the sequencing of the lessons so that they would provide a logical framework for students to understand concepts and move towards handling more complex concepts and processes. Teachers were also provided with a framework for assessing student performance and progress.

Because the materials are not grade-specific, they were field-tested with students in Grades 4 and 6. The evaluation found that students who had been exposed to the materials performed significantly better than students who had not used them, as shown below:


Table 13: Changes in student performance between initial and final assessments (change in score, percentage points)

Group            | Grade 4 | Grade 6
Project group    | +12.34  | +13.59
Comparator group | +2.95   | +3.0

The gains shown by the project group are substantial and statistically significant. The evaluation also considered the effects of project coverage on students’ scores: the full programme was designed to run for 14 weeks, but a number of teachers only utilised part of it, with some students completing only 3-4 weeks of the programme. Gains in learner performance were far higher where students had completed the entire programme.

Table 14: Effects of project exposure on achievement scores (change in score, percentage points)

Extent of project exposure         | Grade 4 | Grade 6
50% of programme completed         | +15.54  | +14.68
80% or more of programme completed | +18.39  | +18.60
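Tables 13 and 14 together quantify both the project effect and the dosage effect. A small sketch combining them (all figures are percentage points, copied from the tables; the short labels are this sketch’s own):

```python
# Gains between initial and final assessments (percentage points),
# copied from Tables 13 and 14.
gains = {
    "project group":    {"Grade 4": 12.34, "Grade 6": 13.59},
    "comparator group": {"Grade 4": 2.95,  "Grade 6": 3.0},
    "50% completed":    {"Grade 4": 15.54, "Grade 6": 14.68},
    "80%+ completed":   {"Grade 4": 18.39, "Grade 6": 18.60},
}
# Project advantage over the comparator, by grade:
extra_gain = {
    grade: round(gains["project group"][grade] - gains["comparator group"][grade], 2)
    for grade in ("Grade 4", "Grade 6")
}
# extra_gain == {"Grade 4": 9.39, "Grade 6": 10.59}
```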

Concept Literacy Project
The majority of African learners in South African schools are learning through the medium of a second language, particularly in Grades 10-12. Many of these learners are studying English (or to a lesser degree Afrikaans) as a second or additional language and do not demonstrate very high levels of competence in reading, writing or speaking. This in turn inhibits students’ abilities to read and interpret material presented in textbooks designed to support learning in various content subjects. Difficulties in learning in a second language are compounded when these subjects make use of specialised, technical terminology.

The Concept Literacy project was initiated in order to provide African language students with explanations of concepts in the FET curriculum in Mathematics, Physical Science, Biology and Geography in indigenous languages. In addition to presenting a translation of terminology, the materials also provided a short explanation of each concept in an African language in order to support learning. The materials were designed so that teachers would be able to use them to provide technically correct explanations of concepts in students’ home languages, as evidence exists that, although code switching occurs in classes, it does not always promote a deeper understanding of subject content: teachers tend to use African languages to encourage students, give instructions and make reference to common or everyday examples.

The materials underwent a rigorous process of translation and editing for linguistic and content accuracy before they were field tested. One of the key questions that the evaluation examined was the extent to which it would be necessary to offer teachers additional training in order to promote effective use of the materials. The evaluation found that where teachers received training, their use of the materials was more extensive and more effective.

Unfortunately, in spite of the availability of the materials and the training’s focus on effective use of code switching, this did not appear to have influenced classroom practice in any way.


In general, responses to the materials were positive and teachers indicated that they had used the materials to clarify their own understanding of key concepts when planning their lessons.

Mindset
Although the Mindset project does not fall within the Zenex Foundation’s materials development programme, the evaluation provides some useful insights into teachers’ responses to the introduction of new materials, particularly materials that are not print based.

The Zenex Foundation-supported Mindset project was launched soon after the initial Mindset pilot project had been conducted in 2003 – it is worth noting that the evaluation therefore reflects the early design of the project and not its current form. The Mindset project set out to raise the achievement levels of Grade 12 students in Mathematics and Physical Science by providing teachers with ICT-based resources and televised lessons. These lessons provided students with the opportunity to observe experiments performed in well-resourced laboratories and listen to explanations provided by experienced subject teachers.

Each school was provided with a satellite dish that was able to receive broadcasts, a television and access to internet-based materials that could be downloaded to provide students with additional information on concepts discussed in broadcast lessons and which also provided exercises based on the broadcast material. In addition, a series of teacher training workshops were conducted to demonstrate how the equipment and materials could be used and to raise levels of teacher subject content knowledge in FET mathematics and science. Unfortunately attendance at these workshops was low with most teachers attending fewer than half of the workshops offered.

The implementation of the project suffered a number of setbacks including the theft of equipment from schools and a protracted teacher strike which disrupted teaching programmes and which meant that teachers had to implement a number of “catch-up” initiatives in order to complete the syllabus, particularly the Grade 12 syllabus.

The evaluation showed that the extent to which the ICT-based materials were accessed by schools varied significantly between schools. It is unfortunate that the evaluation does not consider how each school made use of the materials which were downloaded or how much student work was completed as a result of accessing computer-based materials. The use of the materials (both the internet based and broadcast materials) is described as being ad-hoc. The exposure to “expert teaching” through the broadcast lessons and training provided did not appear to have any effect on teachers’ practice.

The extent to which the materials were used appears to have been affected by the inability of some schools to schedule access to computer laboratories or other equipment at appropriate times and in a way that allowed for integration of the materials into the normal teaching programme.

Assessments of student achievement levels before and after the project only showed a significant increase in Grade 10 Mathematics scores. Achievement levels in science remained low and declined between the baseline and final assessments, and may have been affected by the strike action and the resulting disruption to the teaching programme.

Strategic alignment of projects

Indicators of strategic alignment

Project name                | Evidence of partnership with Dept. of Education | Systemic focus of project | Builds stakeholder capacity | Builds capacity with multiple strategies | Raises Gr 12 passes
READ mother-tongue literacy | 2                                               | 1.5                       | 1.5                         | 2                                        | 0
Concept literacy            | 1                                               | 1                         | 1                           | 1.5                                      | 0
PMRP                        | 3                                               | 1                         | 1                           | 0                                        | 0
Mindset                     | 2                                               | 1                         | 2                           | 2                                        | 1
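One way to summarise the ratings above is an unweighted mean across the five indicators; the equal weighting is this sketch’s assumption, not a method used in the meta-evaluation.

```python
# Mean strategic-alignment rating per project, from the table above.
# Column order: partnership, systemic focus, stakeholder capacity,
# multiple strategies, raises Gr 12 passes.
ratings = {
    "READ mother-tongue literacy": [2, 1.5, 1.5, 2, 0],
    "Concept literacy":            [1, 1, 1, 1.5, 0],
    "PMRP":                        [3, 1, 1, 0, 0],
    "Mindset":                     [2, 1, 2, 2, 1],
}
mean_alignment = {name: round(sum(r) / len(r), 2) for name, r in ratings.items()}
# Mindset scores highest on this (assumed) equal-weight summary: 1.6
```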

Lessons learnt with respect to the introduction of innovative teaching materials

Materials which were informed by a specific theory of learning appeared to have a greater impact on student achievement.

The PMRP and READ materials were both shaped by a particular theory of learning that formed the basis for the materials.

Materials which provided a structured programme for teachers had a greater impact on student achievement than less structured programmes.

Training on materials utilisation raises usage levels, particularly where these materials are unfamiliar and use new technologies.

Utilisation of the materials is increased when there are clear linkages between the curriculum and the materials.

The quality of the materials is not necessarily a predictor of the extent to which they will be utilised by teachers.


Supporting educational improvement through the development of schools of excellence

Strategic outcome: Improved learner performance in mathematics, science and language using a pipeline approach across primary and high schools

Introduction

Over the last five years the Foundation has supported a number of projects which broadly fall under the umbrella of school development. These include support to the LEAP school in the Western Cape, a complementary language programme supporting the Department of Education’s Dinaledi programme, and the Foundation’s School Development Project.

This section of the meta-evaluation concentrates almost exclusively on the Foundation’s flagship school development programme which integrates aspects of teacher and learner development, the provision of materials and enhancing the organisational effectiveness of participating schools. This project deserves special attention as it also represents the Foundation’s most coherent effort to develop a project which adopts an holistic approach to educational development that is implemented in close partnership with the Department of Education.

Assessment of alignment with the Foundation’s strategic objectives

Table 15: Indicators of strategic alignment

Project name               | Evidence of partnership with Dept. of Education | Systemic focus of project | Builds stakeholder capacity | Builds capacity with multiple strategies | Raises Gr 12 passes
School Development Project | 4                                               | 3                         | 2                           | 2                                        | -

Focussed whole school development: Zenex Foundation’s School Development Project

Project design

The School Development Project (SDP) is the largest project in the Zenex Foundation’s current project portfolio and it arguably best represents the coalescence of the organisation’s different strands of work in order to effect coherent change in a group of schools. The project also stands apart from other projects in that it was designed to be implemented in close partnership with the district offices in which the schools were located and to develop a model through which district officials could effectively support schools. In this way, the project was intended to contribute directly to strengthening the capacity and performance of the education bureaucracy in selected provinces.

The project is being implemented in 49 schools across four provinces (Western Cape, Limpopo, KwaZulu-Natal and Gauteng), in one district per province. The selection of each district took place in partnership with the Department of Education in each province and was based on a number of factors, including district capacity to participate in the project and the availability of a cluster of schools in the district which met the Foundation’s intervention criteria. As a systemic intervention, the project set out to select primary schools which served as feeder schools to local high schools, as this would, if the project was successful, create a pipeline or pathway providing learners with access to quality education from the start to the end of their school careers. In the same way that the Dinaledi project was intended to create schools of excellence (or star schools) that would serve as beacons in their communities and demonstrate what could be achieved with dedication and initiative, it was hoped that these clusters of Zenex Foundation-supported schools would serve as examples of what could be achieved in South African schools.

The project includes a number of components, including:

o Teacher training for Foundation Phase teachers (including Grade R teachers) in numeracy and literacy teaching;
o Teacher training for Intermediate Phase teachers in mathematics and language teaching (usually English, as the dominant language of learning and teaching in most schools);
o Teacher training for FET teachers in secondary schools in Mathematics, Physical Science and English;
o Management support for school managers that focuses on improving the management and delivery of the curriculum;
o Provision of reading materials to schools that are graded and appropriate to students of different ages. Materials were also provided to support self-directed learning in schools that did not have sufficient teachers in FET mathematics and science;
o Learner support in the form of additional or supplementary instruction opportunities on weekends and during holidays.

The model used here could be characterised as a form of focussed whole school development, as it shares a number of features with earlier whole school improvement projects implemented in the mid-1990s and early 2000s. However, this model of school development is more focussed in its support of particular grades and subjects, and it seeks to support specific management activities.

Project management and delivery

Within each province, the project is managed by a provincial co-ordinator who is located at the district office and is responsible for co-ordinating activities amongst participating schools and supporting service providers in resolving any problems that they may be experiencing, particularly in relation to teachers’ participation in the programme. The provincial co-ordinators attend periodic project management meetings that are held between the Foundation, service providers and the Department of Education. The Foundation has allocated one of its project managers to be responsible for oversight and management of activities in a particular province. These arrangements represent a high level of investment in the project from both the funder and the Department of Education.

The project is implemented through a consortium of 24 different service providers who are each responsible for delivery of services in a particular area of specialisation.

Assessment of outcomes

The project is currently underway, with a mid-term evaluation having been conducted in 2010. The mid-term evaluation examines the extent to which the programme is being delivered as planned and the nature of initial changes in teacher practice as a result of project participation. It is therefore premature to consider the outcomes achieved by the project.

One of the key features of the mid-term evaluation is the close monitoring of the extent to which delivery targets (particularly the provision of training and classroom-based support) are met by service providers. The mid-term evaluation shows that the extent to which service providers are able to meet delivery targets differs by agency and by the location (province) in which they are operating. In general, the number of classroom-based support visits lags behind delivery targets. The end-of-programme evaluation should be able to link the degree of change measured in a school, or by learners taught by a group of teachers, with the degree to which they participated in project activities.

Lessons learnt

The majority of lessons learnt presented in this section are drawn from the Foundation’s reports to its board of trustees and interviews with the Foundation’s staff.

The design of the School Development project appears to have built on lessons that emerged from evaluations of earlier teacher development and student support initiatives.

The design of the School Development Project appears to have taken into consideration a number of lessons learnt from earlier projects, including:

o Working with a critical mass of teachers in each subject and phase;

o Project design should take into account student advancement throughout their school career, in order to ensure that solid learning foundations are laid down and then maintained and built upon through exposure to good-quality teaching in the Intermediate and Senior Phases, so that learners enter the FET phase with a good chance of being able to engage effectively with FET-level content and skills;

o Systemic approaches to educational development need to take into consideration the functionality and operation of the organisations to which development programmes are directed. Effective curriculum management supports the adoption of changes in teaching behaviour and ensures that managers understand the rationale for changes being introduced at classroom level; and

o There is a need to explicitly address the learning deficits that accumulate as a result of incomplete curriculum coverage and grade-inappropriate assessments.

Partnership between the Department of Education and NGOs should be defined by realistic expectations

While, in principle, partnership between the Department of Education and NGOs or funders should result in greater synergy between efforts to support and improve schools, successful partnerships are not always easy to initiate and sustain. This difficulty is not due to any ill will on either side; rather, it arises from the day-to-day realities and challenges constraining the effective operation of many district offices. Many district offices are understaffed, and even when they have a full complement of staff, each curriculum expert is expected to guide, monitor and support a large number of schools and teachers. Involvement in external projects is often simply an adjunct to officials' existing duties, limiting the amount of time that can be devoted to a project.

These and other challenges can seriously undermine the potential for hands-on partnership. External agencies therefore need to set realistic expectations for the nature of the support and participation that districts can offer.

Teacher training activities were narrowed to focus on areas in the curriculum that present learners with the greatest difficulty

In 2010 it was decided that training provision, particularly for the Foundation and Intermediate Phases, should focus on Learning Outcomes 1 and 4 in mathematics, as these were identified as the areas in which students experienced the greatest difficulties. Learning Outcome 1 (Numbers, operations and relationships) establishes the foundation for mathematical understanding and reasoning – it is therefore essential that learners fully master the content and skills associated with this outcome.

The current project model does not give sufficient recognition to the role of parents in supporting learning.

The current implementation model does not include any components that explicitly recognise the role of parents and home-based learning in supporting academic advancement.


Section 2: Cross cutting themes

Introduction

In the same way that the project logic model was used as the analytic framework for examining project design, implementation and outcomes in the first section of the report, the project cycle has been adopted as the over-arching framework that guides the discussion of cross-cutting themes that have emerged over the course of conducting the meta-analysis.

The project cycle describes the relationship between different phases of designing, implementing and evaluating a project. The process is represented as a cycle as it is assumed that phases of a project are interdependent and that evaluation findings will inform future project design and programming.

Figure 3: The project cycle

In analysing the work of the Zenex Foundation over the last five years, a number of thematic issues have emerged, each of which has been associated with a particular phase of the project cycle.

Project phase → Thematic focus
Identification of development challenge → Understanding the relationship between numeracy and literacy
Needs analysis → Inclusion of needs analysis in project planning
Formulation of response → Selection of project beneficiaries
Implementation → Partnership arrangements
Evaluation → Evaluation design and evaluation utilisation


(Figure 3 depicts these phases as a cycle: identification of the development challenge → needs analysis → formulation of response → implementation → evaluation.)


Identification of the development challenge: the relationship between language skills and achievement in Mathematics and Science.

In the development of the 2005-2015 strategy the Foundation maintained its focus on supporting the development of learners’ literacy and language skills as a key support to improving performance in Mathematics and Science. The central assumption underpinning this approach is that poor language skills will undermine the likelihood of achieving academic success in mathematics and physical science as students are required to understand and make use of specialised terminology. Students are also expected to be able to read and understand learning support materials, which are often presented in English. Language skills are also a key factor in determining whether students meet the entry requirements for higher education programmes in mathematics, science and engineering related fields.

Due to the historical legacy of South Africa's language policies, most African learners are first taught through the medium of their first or home language in Grades 1 to 3, while in Grade 4 many schools begin teaching content and computational subjects through the medium of English.15 There is often a large gap between school-based language policies as they exist on paper, which state that English is used as the medium of instruction from Grade 4 onwards, and actual classroom practice, where teachers rely heavily on code switching (moving between two or more languages) or teach through the home language while teaching and learning materials are available only in English.

The relationship between language and learning is summed up by Howie who states, “the most significant factor in learning science and mathematics isn’t whether learners are rich or poor. It’s whether they are fluent in English” (quoted in Fleisch, 2009, p99). Various national and cross-national achievement surveys consistently show that students who speak English “all the time at home” obtain higher mean scores than those who learn through the medium of English but rarely speak English at home16.

Learners are often more successful in acquiring literacy skills in a second language if initial literacy skills are well established in the home language.17 Cummins and Swain (1979) distinguish between two forms of second language competence: basic interpersonal communicative skills (BICS), which allow a person to hold conversations and interact with others – with the content of these discussions largely limited to everyday events, concrete experiences and feelings – and what they term cognitive academic language proficiency (CALP), which involves the use of more technical terminology and reference to abstract and theoretical concepts. They argue that one first has to acquire BICS before being able to use language for academic purposes.

15 Schools have the power to determine their own language policy; however, many have chosen either the "straight for English" model, where students begin learning through English in Grade 1, or the "sudden transfer" model, where there is a change from mother-tongue instruction in Grade 4.
16 There is obviously a degree of intersection between social class (or socio-economic status) and the use of English as the dominant home language, given the nature of South African society and the degree to which white English speakers had access to a degree of privilege and power that others did not. However, this is not a direct relationship, as a number of minority groups in South Africa spoke English as a home language (even if heritage languages were retained) while not enjoying the levels of social and political privilege accorded to white South Africans.
17 In schools where learners immediately learn to read and write in a second language, particularly where there is strong home-based support for speaking, listening and reading in the second language, this may hold less true.

Formal teaching to support the acquisition of initial literacy skills is concentrated in the Foundation Phase (Grades R to 3); however, the results of the 1999 Monitoring Learning Achievement Study, the Systemic Evaluations of Grade 3 students in 2003 and 2007, and the 2010 Annual National Assessment results for Grade 3 all show that the majority of South African students are not able to read fluently for meaning, and many have not mastered basic reading skills after three years of schooling. This means that many learners have not acquired sound decoding skills (matching symbols to sounds, or the "sounding out" of words) in their home languages before they move to Grade 4, where printed materials – particularly materials that support the learning of mathematics and science – are often only available in English.

The assessment surveys conducted by the Southern African Consortium for Monitoring Educational Quality (SACMEQ) show that most Grade 6 learners did not meet the requirements to be classified as proficient readers who are able to read a passage of connected text and extract meaning from it. These results do not bode well for students’ ability to engage with written text (be it in the form of textbooks or internet based materials) as they progress through their schooling careers.

Based on these arguments about the relationship – particularly the predictive relationship – between language proficiency and performance in computational subjects, the review of the various project evaluations sought to note trends in the relationship between the two subject areas. Many of the studies undertaken examined performance in only one learning area and therefore limited themselves to commenting on performance in this area only – or made only fleeting, and often anecdotal, reference to performance in other subject areas. However, a study undertaken in schools participating in the Dinaledi language support project and the School Development Project did not find any relationship between performance in English and performance in Mathematics or Science.

Although these findings contradict much of the evidence that has been presented by other researchers (Fleisch 2009, Howie 2003, Reddy 2006), it is worth considering possible explanations for the trends that were noted in these evaluations:

Measurement error – where scores rely on internal assessments designed and administered by schools, there may be a lack of equivalence between the mathematics and language scores. Mathematics has a far clearer hierarchical organisation of the curriculum than language, where the sequence in which concepts should be learnt is less clearly defined; and because the same concept spans many years of the curriculum, the level of difficulty at which questions on the same content are set can differ greatly.

Small sample sizes limit the power of the findings – in some cases the sample size in the studies was fairly small, undermining the generalisability of the findings.

In-school differences – Simkins (2010) shows that within the same school Grade 12 results often differ greatly from year to year; similarly the School Development Project evaluation shows that schools may be high performers in mathematics but amongst the worst performers in English.


The data showing a lack of relationship between performance in mathematics (and science) and language is not sufficiently compelling to suggest that the Foundation should review its policy of supporting language development alongside efforts to improve performance in mathematics and physical science.

Needs analysis as a component of project planning and design

Adopting the logic of project cycle management, after the development problem has been identified and defined, the next phase in designing an intervention is to conduct a needs analysis in the communities in which the intervention is to take place.

In most cases, the conditions in schools, and the problems that they experience, are fairly similar, and so the development of many educational projects omits this phase. A number of the Zenex Foundation project managers indicated that they felt the design and implementation of the School Development Project would have been strengthened if the teachers and principals of the participating schools had been involved in the initial stages of designing the project. The project managers also indicated that this process could have been used to give school principals and participating teachers the opportunity to indicate the level of commitment that they felt was reasonable and realistic, taking into consideration the demands of other district-led training which they are expected to attend and the influence of any other projects operating in the school during the same period.

In addition to assessing the specific needs of participating schools, there are a number of benefits which could be derived from pre-intervention interaction with schools that have met initial selection criteria, including:

 On-site assessment of the functionality of the school;
 Gathering information on performance in national and provincial assessment surveys;
 Direct observation of key indicators of the quality of teaching and learning (for example, the amount of written work completed by learners); and
 Gathering information on other projects which may also be operating in the school.

Identification and selection of beneficiaries

A period of project design (or of refining an existing project design) usually follows once the needs analysis has been conducted. One of the most important decisions which has to be taken at this point is the final identification of who will benefit from the project. The issue of how to select schools, teachers and learners arose at several points during the meta-analysis of the evaluations and in the course of discussions with Zenex Foundation staff.

Selection of the beneficiary group is a difficult process, often involving trade-offs between providing support where there is the greatest need or the highest levels of deprivation, and selecting sites where a programme is most likely to achieve its outcomes and result in tangible changes to the lives of learners – this issue was often hotly debated between the Foundation and Department of Education officials when selecting sites for intervention.

Identification of schools for school and teacher development programmes

There is a growing body of evidence to suggest that educational improvement programmes (particularly those operated by NGOs) have the greatest likelihood of success when implemented in reasonably functional schools that have committed themselves to improving performance and that have already started exhibiting initial gains based on the school community's own efforts. Schools which are extremely low performers often need external support from the Department of Education, which, as the employer, should be able to exert a degree of authority over schools and teachers to hold them accountable for performance and for failure to comply with project participation requirements.

The Foundation adopted a policy of working with schools which exhibited a degree of functionality, and at the start of the implementation of the 2005-2015 strategy it commissioned an NGO to undertake a study to identify schools which met a number of performance criteria, largely based on the quality of their Grade 12 results, particularly in mathematics, physical science and English. This analysis was used to identify areas in which a number of moderately well-performing schools serving disadvantaged communities were located. This information was then used to select the provinces and districts in which the Foundation would concentrate its activities. The absence of performance data from other points in the education system made it difficult to use these criteria to identify primary schools in which projects would be implemented. This difficulty is compounded by the fact that many districts have neither reliable independent assessments of the functionality of primary schools nor accurate information on the normal feeder patterns between local primary and secondary schools.

In spite of these criteria being in place, several evaluation reports indicated that the schools in which teacher development programmes were conducted were not sufficiently functional, and that the managerial and organisational culture did not create enabling conditions for teachers to introduce changes to how they were teaching. This raises the question: what additional criteria can be used to select schools in which there is a reasonable chance of interventions yielding results?

Based on evaluation findings, the Foundation may wish to take the following additional criteria into account:

 The presence of a permanently appointed principal;
 The degree of staff turnover (schools with high staff turnover are less likely to be able to sustain the benefits of having a group of teachers trained);
 The principal and staff members' willingness to agree that teachers who receive training will continue to teach the grade or phase in which they receive specialist training for the duration of the programme – and for a specified period thereafter;
 The presence of positive school-community relations, where parents show an interest in the school and in the academic performance of their children;
 Evidence of innovation and improvement initiated by the school;
 Minimum levels of infrastructure present in the school to support the intervention; and
 Current teaching practices – particularly levels of curriculum coverage and the amount of written work completed by learners.

A number of these criteria unfortunately cannot be assessed without visiting the schools, which has cost implications for the Foundation. However these costs should be offset against the possible gains and increased likelihood of the project achieving its objectives.

Teacher development – selection of teacher candidates

In spite of the success of various teacher projects in increasing the number of teachers with professional qualifications, several evaluations noted that the impact of the programme had been diluted by teachers retiring, leaving the profession, moving schools and obtaining promotions which meant that they were no longer engaged in classroom-based teaching.

Very few programme evaluations reported on the age profile of project participants; however, one evaluation which did so showed that most participants had more than 10 years' teaching experience. While it cannot be assumed that all of the teachers who participated in the various projects shared this profile, a number of issues arise when considering the age and experience profile of teachers selected as candidates for development programmes:

 Younger and less experienced teachers are less likely to apply for promotion during or immediately on completion of a training programme.
 Younger teachers may find it easier to adapt their teaching methods, as research has shown that teaching methods become fossilised as teaching becomes more and more routinised.
 Younger teachers may find it easier to adopt new instructional technologies.
 Less experienced teachers are more likely to spend longer engaged in classroom teaching after the completion of the project before being promoted to a managerial position or retiring.

In spite of there being some internal performance appraisal measures in place in schools, their implementation is widely regarded as flawed and as not providing a reliable measure of professional competence. In the absence of performance-based selection criteria, the Foundation may wish to conduct pre-selection interviews with participants in order to gauge their commitment to a training and development programme, and also to examine the extent to which the general organisational environment in the school is likely to support the introduction of new teaching and learning strategies.

Learner support – identifying learners with potential to succeed

All learner support initiatives include an element of pre-selection of students, usually through the administration of various tests which assess content knowledge in particular subjects and also aim to assess students' potential to succeed in higher grades. Recently one of the evaluators has argued that a critical reasoning test should be included in the initial selection tests (when a similar test was administered to project learners and their peers, it was found that project learners experienced greater difficulties with these tasks than their classmates).


As there are a number of factors which can influence students' potential to succeed, it may be worth including more qualitative dimensions in student assessments, such as psychometric measurements of student resilience and of the ability to adapt to a change in school environment and to the additional pressures that selection as a project learner could bring. In addition, students' expressed interest in mathematics and science – and in scientific investigation and reasoning – may be a further factor to consider. Students with a real aptitude for science and related subjects often show a keen interest in "finding things out" and in asking why natural phenomena occur.18 This interest is often expressed through holding discussions with others, reading, and conducting small-scale investigations at home.

Partnership as a model for effective project delivery

One of the features of the Zenex Foundation's 2005-2015 strategy is the commitment to entering into partnerships with government in order to support the development of the education system as a whole, on the grounds that the Foundation would be more likely to make a significant contribution to educational improvement if its programmes and approaches were aligned with government policies and priorities.

The idea of partnership in order to harness different strengths, leverage additional funding and achieve a greater impact than one organisation could achieve working alone is laudable – and an idea that very few would oppose or disagree with. However, when it comes to defining exactly what contributes to creating an effective partnership and what "partnership" means in practice, matters become more difficult and contested. Knowing what a good partnership would mean in theory and actually creating one can be likened to desiring a good marriage and the difficulty of actually building one and living it on a day-to-day basis!

Partnership, when understood in relation to the operation of the Foundation, takes different forms depending on the type of entity with which the Foundation is entering into a partnership. These different forms are illustrated below and are discussed in more detail in the sections that follow.

Figure 4: Range of partnership arrangements

18 Richard P. Feynman, the Nobel Prize-winning physicist, speaks passionately of his childhood interest in "finding things out": reading widely, fiddling with household appliances to see how they worked and generally expressing an interest in the world around him.


Partnership with Government

A central tenet of the Zenex Foundation’s strategy is that the initiatives that it supports should be aligned with government policies and priorities and that they should contribute to creating an effective environment for delivery of educational services to all.

Partnership to support policy formulation and implementation

Between 2006 and 2011 the Foundation initiated and supported a number of projects that directly or indirectly supported policy formulation and planning. In these cases, the nature of the partnership was very different from occasions on which the Foundation sought to implement projects with the Department of Education.

In 2006 the Foundation worked closely with the Department of Education to support the implementation of the Grade 3 systemic evaluation and the development of reliable and valid learner performance tests that could measure change in achievement between testing cycles. The Foundation decided to support this initiative because of its recognition of the importance of having accurate and useful information on student performance levels to inform policy and practice. The funding provided by the Zenex Foundation was used to pay expert service providers who worked closely with the Department of Education to develop performance tests and build internal capacity to analyse the data collected from students. During this project the Foundation played the role of a "silent partner" and was largely responsible for providing financial assistance to support State processes. Unfortunately, the Department of Education did not release the final report prepared at the end of the project for public discussion.

The Foundation also commissioned research on the qualifications structure for Grade R teachers, the results of which then fed into policy discussions on the conditions of employment and career advancement of Grade R teachers. Similarly the Foundation supported research into the creation of a qualification for school principals in order to support and inform policy discussions.

(Figure 4 summarises the range of partnership arrangements: with government – supporting policy development and implementation, and programme implementation; with service providers – project delivery and building sectoral capacity; and with other donors – co-funding of initiatives.)

Partnership for implementation

One of the operating principles of the Zenex Foundation is that there should be a clear agreement between the Foundation and the provincial Department of Education in each of the provinces in which it operates. After the adoption of the 2005-2015 strategy, the Foundation entered into memoranda of understanding (MoUs) with the Heads of the Department of Education in each of these provinces. The MoUs set out the principles which would guide co-operation and collaboration. The speed with which the MoUs were concluded varied significantly from one province to another, which in turn affected the speed with which projects could be initiated.

The extent to which the Foundation has been able to establish successful partnerships for programme implementation has varied from project to project and from province to province. This is probably best presented in the form of a table that also considers the different forms that partnership can take:

Partnership feature → Discussion

Partnership facilitates entry into schools and offers approval for activities → The MoUs entered into with provinces permit the Foundation to operate in these provinces. In most projects, the level of partnership has extended beyond this superficial level.

Joint selection of schools → In most projects, district officials were involved in the selection of schools in which projects would be implemented. This sometimes meant that schools which did not meet the Foundation's criteria for minimum levels of functionality and demonstrated levels of student performance were included.

Joint design of project interventions → In most cases the Foundation was responsible for initiating and designing projects. These were presented to provincial and district officials and endorsed, with very little re-design taking place at the request of the Department of Education.

Joint management of projects → In only one project, the School Development Project, has the Department of Education played an active role in managing and steering project implementation. A district official was appointed to serve as project manager and liaison between the participating schools, service providers and the district office – usually in addition to their normal duties.

One of the greatest challenges to the initiation and maintenance of successful partnerships – as acknowledged by both the Foundation and the Department of Education – is Departmental capacity, particularly at district level. Many district offices are understaffed and serve several hundred schools, often spread over vast distances in rural districts, stretching district capacity to its limits, particularly when the district lacks basic resources. At the same time, there is no common set of norms and standards governing staffing or equipment for district offices, and there has been wave after wave of restructuring and changes to the authority delegated to district directors. All of these factors make it difficult for NGOs and funders to form effective and sustainable partnerships with district offices.

In practice, this leads to partnership taking the form of information sharing on projects, with NGO involvement in schools becoming a means of alleviating some of the workload of districts, freeing officials to concentrate their efforts on other schools. This makes it more difficult for project-based interventions to contribute to building the capacity of districts or to pilot realistic alternatives for the delivery of district-based support, as NGO-led projects usually have the luxury of being able to intervene more intensively in a smaller group of schools.

Donor partnerships19

None of the evaluations directly consider funding arrangements; however, information provided in the Foundation’s reports to its Board of Trustees indicates that a number of relationships have been formed with other donors. Co-funding has been secured from a range of corporate donors for a number of learner support initiatives.

Partnerships with service providers

To be completed after further discussion with the Zenex Foundation.

Programme evaluation that supports organisational and sectoral learning

The final section of this evaluation synthesis and meta-evaluation examines the current state of the practice of evaluation and the role which it has played (and can play) in supporting the design and delivery of more effective interventions that seek to raise educational quality. The preceding chapters of this report have provided a review and synthesis of findings that were presented in a range of project evaluations; this chapter shifts slightly in focus and provides a critical review of the evaluations which were commissioned by the Zenex Foundation. Over the last two decades, the term meta-evaluation has been used to refer to a systematic assessment of evaluation systems and processes in order to support continuous improvement of evaluation practice20.

The purpose of evaluation in relation to the work of the Zenex Foundation

Programme evaluation can serve a number of functions. One of the most important is to support decision making about the future of a particular intervention and the design of future interventions, based on lessons learnt about programme design, implementation and the degree to which an approach has resulted in the achievement of expected or desired outcomes. Evaluation is frequently coupled with notions of accountability. This can be understood both in terms of accountability for the funds used to implement a programme and in terms of the degree to which funders and service providers should be accountable to the communities that they seek to serve: beneficiaries often invest heavily in participating in a programme, and providers are increasingly expected to demonstrate that this investment has resulted in tangible improvements in beneficiaries’ lives.

19 Unfortunately very little information is available on this issue. More information may need to be obtained through discussions with Foundation staff and management.
20 This definition is based on a synthesis of Lipsey’s work (2000 – quoted in ALNAP’s review of Humanitarian Action in 2003).

With specific reference to the operation of the Zenex Foundation, programme evaluations serve the following purposes:

Measurement of impact / outcomes: evaluations assess the extent to which project/ programme implementation has resulted in measurable benefits to the beneficiary group and more generally to the education sector;

Identification of lessons learnt: this informs the design and implementation of other projects that the Foundation initiates or supports;

Project redesign and redevelopment: evaluation data is also used to introduce changes to the implementation model being used in projects while they are underway; and

Contribution to the body of scholarly knowledge about educational practices and efforts to improve educational quality: evaluation reports are treated as a specialised form of research; they are used to inform the Foundation’s participation in policy debates and as a source of authoritative information on educational practice in South African schools.

Framework for assessing evaluations

In the interests of ensuring a systematic approach to assessing the design and execution of the various evaluations commissioned by the Zenex Foundation between 2006 and 2011, a number of performance standards have been proposed. These standards are based on evaluation standards that have been adopted by UN Agencies (including UNICEF) and professional evaluation associations, discussions on evaluation practice that were presented in the 2006 Zenex and ISASA meta-evaluations, and observations on the focus, structure and methodologies used in the evaluations on which the current report is based.

Standards for assessing the quality of programme evaluations

Evaluation design and data collection

The evaluation design is guided by a clearly articulated theory of change on which the intervention is premised;

The evaluation is structured around a set of indicators that reflect the underlying theory of change (project logic) and that are related to the inputs provided.

The evaluation design allows for the collection of data at several points during the implementation of the project;

o The baseline evaluation should take place prior to the start of the intervention or as close to the intervention start as possible.

o Mid-term and final assessment should be planned so as to ensure that sufficient time has elapsed for project effects to be measurable.

Where possible, the evaluation should attempt to establish the situation that would have existed, had the intervention not been implemented and compare changes in the intervention group and a comparator group.

The evaluation design should ensure that the evaluation will be able to address core questions, including the following:

o Was the project implemented as planned?
o What results were achieved by the project?
o To what extent is it likely that results achieved by the project will be sustained?

The evaluation design and data collection strategy should conform to established standards for research practice, including:

o Sample sizes are large enough to facilitate the drawing of sound conclusions;
o Sample sizes and populations remain consistent throughout the duration of the evaluation;
o Data collection methods promote the collection of reliable and valid data.

Reporting of evaluation results

The evaluation report should contain a clear (and comprehensive) overview of the intervention that is being evaluated. This should be written in a way that the relationships between the underlying change theory, the inputs provided by the project and the results measured are clearly articulated.

The methodology used to conduct the evaluation is explained – including the construction of samples, instrumentation used and coverage and quality of learner performance tests.

Utility - supporting the utilisation of evaluation findings

The evaluation plan should provide a clear understanding of the information needs of the client.

The evaluation should clearly indicate lessons learnt and best practices.

The evaluation should contain a clear set of recommendations that can be used to inform future decision making.

The standards listed above form the core of the analytic framework, but are not exhaustive. The analytic framework has been developed with reference to the nature of information available through the meta-evaluation; a framework that can be used by the Foundation for reviewing and assessing the quality of evaluations that it commissions has been appended to this report.

Application of evaluation standards to evaluations commissioned by the Zenex Foundation between 2006 and 2011

Note: As it is expected that most of the information contained within this chapter will be for internal usage by the Foundation, in the interests of clarity and learning, reference is made to project names. Should the Foundation wish to release this section of the report to an external audience, project names should be replaced with project codes.

Standard 1: Articulation of theory of change

Many of the evaluations reviewed did not explicitly articulate the theory of change on which either the intervention or the evaluation was founded. In many cases it was necessary to infer the programme logic on which the intervention was based and to then establish links between the intervention logic and the evaluation questions or the core themes that were addressed in the evaluation. In the case of teacher development programmes, the underlying logic was fairly clear and could be inferred from the use of the Kirkpatrick model in a number of evaluations. However, in other projects - more specifically the learner interventions - it was harder to establish the logical connections between the intervention model and the issues addressed by the evaluation.

Mouton et al’s review of the baseline evaluation of the School Development project presents a cogent argument in favour of designing the evaluation in such a way that it examines the development logic of the intervention and the chain of expected results, and of then using the evaluation to examine whether these linkages hold true, assuming that the project is delivered as planned. The mid-term evaluation of the School Development Project pays greater attention to the relationship between inputs and the measurement of expected results than many earlier evaluations did.

Standard 2: Presence of an indicator framework based on and reflecting the intervention’s theory of change

The presence of an indicator framework helps to make explicit the relationship between the inputs provided and the particular practices (e.g. specific teaching practices) that are assumed to contribute to the quality of education provided. The indicator framework also sets out the performance expectations that are being measured and identifies the behaviours or practices that participants/beneficiaries are expected to display as a result of project exposure. The validity of the indicator framework is determined by two things: its relationship to the theory of change/project logic and its relationship to the nature of inputs provided. An illustration of the relationship between indicators, inputs and change assumptions is provided below21.

Indicator | Input | Change assumption
Presence of curriculum planning documents that cover a full scholastic year and detailed plans outlining the content to be covered each academic term | Training on the development of annual and quarterly (term-based) curriculum planning documents | Student achievement will be improved if teachers complete the prescribed curriculum in a structured and timely manner
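A framework row of this kind can also be captured in a simple data structure, which makes it easier to check that every indicator is tied to an input and a change assumption. The sketch below is purely illustrative (the class and field names are invented, not drawn from any of the evaluations) and leaves baseline and target values empty until a baseline study has been completed:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    """One row of an indicator framework, linking what is measured
    to the project input and the change assumption behind it.
    (Illustrative structure only; not taken from the evaluations.)"""
    description: str        # the observable practice being measured
    related_input: str      # the project input assumed to produce it
    change_assumption: str  # why the input should change the practice
    baseline: Optional[float] = None  # filled in after the baseline study
    target: Optional[float] = None    # attached once baseline data exist

curriculum_planning = Indicator(
    description=("Curriculum planning documents cover a full scholastic "
                 "year, with detailed plans for each academic term"),
    related_input=("Training on the development of annual and quarterly "
                   "(term-based) curriculum planning documents"),
    change_assumption=("Student achievement will improve if teachers "
                       "complete the prescribed curriculum in a "
                       "structured and timely manner"),
)
print(curriculum_planning.target)  # None until a baseline exists
```

Keeping baseline and target optional mirrors the sequencing described in footnote 21: indicators are made relevant and measurable first, and targets are attached only once baseline data exist.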

Only one of the evaluations reviewed (MIET North West Teacher Development programme) made reference to the existence of an indicator framework that had been developed jointly between the various service providers. Unfortunately the evaluation report does not contain a copy of the framework that was developed. Several other evaluations provide an indication of the core performance areas that are measured by the evaluation in the form of evaluation questions, but these do not appear to be translated into indicator frameworks that set out performance expectations against which teacher practice or service delivery is measured.

Standard 3: Data is collected at various points during the project life cycle.

All of the evaluations reviewed adhered to this standard, with the majority of projects collecting data at two or more points during the course of an intervention.

21 Evaluation literature often refers to SMART indicators – in other words, indicators that are Specific, Measurable, Achievable/ Realistic and Time-bound/ Target-focused. In the indicator formulation given in the example, there are no time-limits in which the indicator must be achieved, nor are any targets given. This is deliberate as the first step in creating indicators is to ensure that they are relevant, logical and measurable. Targets can be attached after the completion of a baseline study. Using the baseline data it will be possible to determine the amount of change that can realistically be expected as a result of the intervention.

Evaluation design (teacher development projects) | No. of evaluations
Single measurement of performance, compared with the performance of a comparator group | 1
Two measurements of project and comparator group | 2
Two measurements of project and comparator group that also measured change during the course of a single year | 4

The table above focuses only on teacher development projects as other initiatives do not necessarily lend themselves to the use of these designs and other approaches may be more relevant in the assessment of materials development projects or learner support programmes. In the case of the evaluation of the learner support programmes, it is noted that the evaluation plan allows for the collection of data at different points in the schooling cycle of a cohort of learners.

In the meta-evaluation of mathematics, science and language support programmes conducted by Schollar and Roberts (2006), it was recommended that donors ensure that baseline evaluations are conducted prior to the start of interventions. This allows the evaluation to determine the conditions and practices that exist before the intervention begins, in order to (i) inform project design and the relative weighting given to particular inputs and (ii) provide a benchmark against which change is measured. Where it is not possible to conduct the baseline assessment prior to the start of the project, efforts should be made to use techniques that support the reconstruction of baseline data (e.g. reviews of documents and participant reflections coupled with direct observation, particularly where participants do not report having introduced any significant changes as a result of the intervention). In several of the reports, the relationship between the timing of the various evaluation phases and the project life cycle was not clear.

Standard 4: Evaluations should attempt to establish the counterfactual

Evaluation literature uses the phrase “establishing the counterfactual” to refer to the need to compare changes in the project group with the situation that would have existed had the project not been implemented. One of the most widely accepted methods for doing this is the introduction of control groups in quasi-experimental designs, or the use of randomised controlled trials, which are gaining popularity in the testing of social interventions, in part due to the World Bank’s support for these designs.

Most of the evaluations that were reviewed adopted a quasi-experimental design in which data was collected from project beneficiaries as well as from a group that did not receive the same inputs and support; in one study a more classic experimental design was employed, with schools being randomly assigned to project or control groups. The majority of designs could be classified as having used non-equivalent comparator groups, as the “control” schools were not necessarily similar in terms of size, class size, teacher experience and general school functionality. While the difficulties of ensuring that schools are equivalent or that student populations are sufficiently similar are acknowledged, in some cases the comparator group performed noticeably differently on achievement tests at the start of the project, which may have influenced the conclusions that could be drawn about project effect.
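The logic of comparing change in a project group against change in a comparator group reduces to a simple difference-in-differences calculation: the estimated effect is the project group's gain minus the comparator group's gain. The sketch below is illustrative only; the scores are invented and are not taken from any of the evaluations reviewed:

```python
def difference_in_differences(project_pre, project_post,
                              comparator_pre, comparator_post):
    """Mean change in the project group minus mean change in the
    comparator group: a simple estimate of project effect, under the
    assumption that both groups would otherwise have changed alike."""
    mean = lambda xs: sum(xs) / len(xs)
    return ((mean(project_post) - mean(project_pre))
            - (mean(comparator_post) - mean(comparator_pre)))

# Invented test scores, pre- and post-intervention:
effect = difference_in_differences(
    project_pre=[40, 45, 50], project_post=[55, 60, 65],
    comparator_pre=[42, 48, 51], comparator_post=[47, 53, 56],
)
print(effect)  # 10.0: the project group gained 15 points, the comparator 5
```

The calculation only identifies a project effect if the "parallel trends" assumption holds, which is exactly why non-equivalent comparator groups weaken the conclusions that can be drawn.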

The fact that so many evaluators made use of comparator groups marks a significant shift in evaluation approach from the studies reviewed in 2006, where only four out of 12 (33%) included comparator groups. The inclusion of a comparator group helps to determine what might have taken place in the participating schools had the intervention not occurred and had the schools continued to benefit only from the support provided by the State.

Although the inclusion of comparator groups in the evaluation designs is a very positive move, in many evaluations the comparator group is far smaller than the sample of project beneficiaries. This raises the likelihood of sampling errors occurring and of the results obtained being a product of chance or errors in sampling. In the studies reviewed as part of the ISASA meta-evaluation that made use of control/comparator groups, the ratio of project to control participants was 3.4:1; in the most recent set of evaluations the ratio had dropped to 1.87:1.
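The cost of a small comparator group can be made concrete: for a fixed total number of learners tested, the standard error of the difference between two group means is smallest when the groups are equally sized. The sketch below assumes a common standard deviation in both groups; the group sizes are invented and chosen only to roughly mirror a 3.7:1 split:

```python
import math

def se_of_difference(n_project, n_comparator, sd=1.0):
    """Standard error of the difference between two group means,
    assuming equal standard deviations in both groups."""
    return sd * math.sqrt(1 / n_project + 1 / n_comparator)

balanced = se_of_difference(350, 350)      # 700 learners, split evenly
unbalanced = se_of_difference(550, 150)    # 700 learners, roughly 3.7:1
print(round(unbalanced / balanced, 2))     # about 1.22
```

For the same total testing effort, the unbalanced design carries roughly a fifth more uncertainty in the estimated difference, which is why a shrinking project-to-control ratio is an improvement.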

Table X provides a summary of the relationship between the population size and the size of the evaluation and comparator group samples. The effects of sample size on the generalisability and reliability of the data are discussed later in this chapter.

Table: Evaluation sample sizes

Project name | Project population (schools or teachers) | Evaluation sample size | Comparator sample size | Learner testing (project sample) | Learner testing (comparator group) | Ratio of project to control (learners)

Foundation Phase projects
READ | 50 | 24 schools | 8 | 547 | 148 | 3.70
PMRP | 40 teachers | 40 schools/teachers | 40 | 1845 | 1579 | 1.17
MIET North West | 17 | 16 schools | 10 | 532 | 276 | 1.93
Maths Centre/READ (E. Cape & KZN) | 60 | 6 schools | 2 | 350 | 281 | 1.25

FET Phase projects
Maths Centre (Limpopo) | 19 | 19 schools | 10 | 370 / 400 | 212 / 180** | 1.75 / 2.22
ELET (Limpopo) | 19 (20 teachers) | 15 schools | 10 | 428 / 610** | 293 / 418** | 1.46 / 1.46
RUMEP | 36 schools | 12 schools / 12 | 6 | No detail | No detail | -

Standard 5: Evaluation design should address core evaluation questions

i. Was the project implemented as planned?

Two factors can have a profound impact on the extent to which project outcomes are achieved: whether a project is implemented as planned (in particular the quantity of inputs delivered, the quality of delivery and the relevance of the inputs to addressing the development problem) and the participants’ level of exposure to the intervention. However, very few evaluations overtly considered whether projects had been implemented as planned; the mid-term evaluation of the School Development Programme, which examined whether service delivery levels were in line with project plans, was a notable exception. The impact evaluation of the Mindset pilot project reported on levels of project exposure, but failed to link this data with information on levels of utilisation, teaching practice or changes in student achievement. Only the evaluation of the materials developed as part of the Primary Mathematics Research Project explicitly considered the extent to which project exposure influenced the rate of change observed in student achievement levels – in this case it was noted that in classrooms where teachers had completed more than 75% of the programme, results were statistically significantly better.

Some studies, particularly mid-term evaluations, sought to examine and comment on project implementation processes. These studies, for the most part, provided useful insights about the extent to which project planning and implementation could potentially influence the outcomes achieved.

However, none of the evaluations considered whether the inputs provided were relevant to the needs of the participants.

ii. What results were achieved by the intervention?

Most of the evaluations that were reviewed sought to examine whether the desired outcomes had come about as a result of the project being implemented. One of the most striking changes in evaluation approach was the extent to which learner achievement (or learner performance) was used as the core indicator of project impact. Only 8 out of 15 evaluations of teacher development programmes funded by the Foundation between 1995 and 2006 assessed changes in learner performance. Evaluators have clearly been guided by the Foundation’s wish to use learner performance as a key indicator of impact – issues related to the comparability of testing data and the construction of tests to measure change in performance are discussed in relation to Evaluation Standard 6.

iii. What is the likelihood that intervention effects will be sustained?

The issue of project sustainability, or the sustainability of effects, was rarely overtly addressed by the evaluators. An exception to this pattern was the evaluation of the RUMEP project a year after the completion of the programme, in order to ascertain which benefits/outcomes were still evident in the schools. Unfortunately, the evaluation design did not appear to be driven by a clear set of outcome indicators measured in a consistent fashion between the final (summative) evaluation and the sustainability assessment. In spite of this, the evaluation drew attention to a number of unanticipated outcomes that arose as a result of the project (including the career advancement of participants and the creation of professional communities amongst participants).

Standard 6: Conformity with accepted standards for sampling and data quality

Sampling

In order for evaluators to be able to make judgements on project impact and to realistically assess the success with which a project has been implemented, they need good quality data. This can be achieved through ensuring that appropriate sampling protocols are followed and that the instruments and data collection procedures yield reliable and valid data.

Another positive trend in the evaluation designs that were reviewed has been the increase in evaluation sample sizes, in order to collect data that allows for more robust statistical analysis and more conclusive pronouncements on project impact. Table X above shows the sample sizes in relation to project population. One of the challenges in determining the adequacy of sample sizes for many of the evaluations was the shifting “unit of analysis” in the designs – some project populations were described only at the level of the number of participating schools or teachers, yet most impact was measured at the level of the learner. When the sample sizes used in the studies were analysed in terms of their adequacy for statistical analysis, it was found that all project samples were sufficiently large, as were all but one of the control groups (even if they were not equivalent in size to the project groups)22.
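A rough check of sample-size adequacy can be made with the textbook normal-approximation formula for comparing two group means. The sketch below assumes a 5% two-sided significance level and 80% power; the effect sizes are illustrative choices (Cohen's conventional "medium" and "small" values), not estimates from the evaluations:

```python
import math

def n_per_group(effect_size, z_alpha=1.96, z_beta=0.84):
    """Approximate learners needed per group to detect a difference of
    effect_size standard deviations between two group means
    (z_alpha = 1.96: 5% two-sided significance; z_beta = 0.84: 80% power)."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

print(n_per_group(0.5))  # 63 learners per group for a "medium" effect
print(n_per_group(0.2))  # 392 per group for a small effect
```

The steep growth in required sample size as expected effects shrink is one reason learner-level samples of several hundred, as in most of the tabled evaluations, are defensible, while very small comparator groups are not.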

In addition to the sample being sufficiently large, it is important that the sample composition remains consistent. One of the weaknesses noted in the 2006 ISASA meta-evaluation was that the sample size did not remain consistent throughout the course of several evaluations. Unfortunately, this problem also affected a number of the evaluations reviewed as part of this study; in one case it appeared to be an issue of sample design and in other cases it was a function of the mobility of the project population (as discussed below):

Several studies that sought to administer tests to the same group of students within the course of a single academic year found this difficult, as the numbers and identities of students within each school changed quite noticeably between April and October of the same year.

Other project evaluations noted that the sample of participating teachers changed and shifted over the course of a three year project (even when the programme was linked to an accredited training programme). Where the programme was accredited it was more common for teachers to move between schools or to be moved to teach different subjects or grades, than to drop out of the programme itself. However, the fact that the teachers did not remain within the same schools or did not continue to teach within the same phase or subject had serious implications for the long-term impact of the programme.

In the evaluation of the READ/ Maths Centre Foundation Phase project it was noted that the evaluation sample changed between the baseline and final studies and a different control school was used in the final evaluation. The report does not provide justifications for this change.
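Tracking sample consistency of this kind amounts to matching learner identifiers across testing waves. A minimal sketch, using invented learner IDs:

```python
def retention(baseline_ids, followup_ids):
    """Fraction of learners tested at baseline who were also tested at
    follow-up; matching on a stable learner ID is what makes change
    scores comparable across evaluation waves."""
    baseline, followup = set(baseline_ids), set(followup_ids)
    matched = baseline & followup
    return len(matched) / len(baseline)

# Invented IDs for an April and an October test sitting:
april = ["L001", "L002", "L003", "L004", "L005"]
october = ["L002", "L003", "L005", "L006"]   # L006 is a new arrival
print(retention(april, october))  # 0.6: only 3 of 5 learners can be tracked
```

Reporting a retention figure of this kind, alongside results for the matched subsample, would let readers judge how much of any measured change could be an artefact of a shifting sample.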

22 In several cases it was necessary to estimate the size of the project population. This was done by multiplying the number of participating teachers by 40 in the Foundation Phase and by 35 in FET grades (based on national class size norms).

Data quality

Data quality is usually determined in terms of reliability and validity. In essence, reliability refers to the stability of findings over time and between the different researchers who may collect data, while validity refers to the appropriateness of the measure used to collect data and the internal logic between what is observed and the construct being measured23. It is worth briefly considering some factors that can affect data quality in learner performance testing and in the collection of qualitative data on attitudes and behaviours.
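Where item-level data is available, internal reliability can be summarised with Cronbach's alpha, one of the statistics the reviewed reports rarely provide. A minimal sketch, using invented right/wrong scores for four learners on a three-item test:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha: internal consistency of a test, computed from
    per-item score columns (one list of learner scores per item).
    Values closer to 1 indicate the items measure a common construct."""
    k = len(item_scores)
    learners = len(item_scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each learner's total score across all items:
    totals = [sum(item[i] for item in item_scores) for i in range(learners)]
    sum_item_var = sum(variance(item) for item in item_scores)
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Invented 1/0 (right/wrong) scores: three items, four learners.
items = [
    [1, 0, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 1, 0],
]
print(round(cronbach_alpha(items), 4))  # 0.5625
```

Reporting a figure like this (typically alongside a conventional threshold such as 0.7 for research use) would let readers judge whether test results provide a sound basis for decisions.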

Data quality and learner performance tests

Theoretical and practical issues related to the construction of student performance tests are complex and it would not be appropriate to attempt to address them in this review. Instead, it is useful to examine some of these issues as they relate specifically to the Zenex Foundation meta-evaluation and the role of tests in project evaluations. Ideally, learner performance testing should both inform the way in which projects are designed, drawing attention to issues that should be addressed by service providers, and provide a measure of the extent to which performance changes during the course of a particular intervention. Taking this into consideration, the following criteria are proposed for assessing the quality of learner performance testing:

Evaluators should be able to demonstrate that tests are reliable and valid - and that the results obtained from these tests provide a sound basis for making decisions.

The data generated by the tests should be able to be used for diagnostic purposes and to track changes in performance – at different levels – over time.

As noted earlier, one of the most striking shifts in evaluation practice over the last 10 years has been the use of performance tests to measure student achievement as a core outcome of development projects.

The table below provides a summary of the range of tests that were administered as part of the evaluations conducted between 2006 and 2011, the content that the tests covered and the duration of each test.

23 Reliability and validity have been reduced here to their core elements. More detail on specific aspects of reliability and validity can be found in most standard texts on research methods.

Table: Overview of the structure of tests used in project evaluations

Project | Subject & Grade level | Focus | No. of items | Test duration
READ mother-tongue literacy | Literacy, Grade 1 | LO 3: Reading and viewing; LO 4: Writing; LO 5: Using language, thinking, reasoning | 31 | 45 min
MIET | Numeracy, Grade 3 | LOs 1 and 2 (number system and basic operations, some measurement) | 84 | 90 min
MIET | Literacy, Grade 3 | Word recognition (40 items); sentence completion (18 items); reading comprehension (17 items) | 75 | 90 min
Maths Centre/READ (KZN & E. Cape) | Literacy, Grade 3 | Tests compiled from the Grade 3 systemic evaluation; no information on number of items or distribution by Learning Outcome | - | -
Maths Centre/READ (KZN & E. Cape) | Numeracy, Grade 3 | Tests compiled from the Grade 3 systemic evaluation; no information on number of items. Test data is presented according to core literacy skills (reading, writing) | - | -
PMRP | Numeracy, Grade 4 | LO 1: counting, ordering; basic operations | 51 | -
PMRP | Numeracy, Grade 6 | LO 1: counting, ordering; basic operations | 86 | -
Maths Centre (Limpopo) | Mathematics, Grade 10 | LO 1: Number & number relations (5); LO 2: Functions & algebra (8); LO 3: Shape & measurement (6); LO 4: Data handling & probability | 21 (subdivided into 52 items) | 90 min
RUMEP | Mathematics, basic skills & Grade 10 | No information; final test included 50 marks on Grade 10 content | - | -
ELET (Limpopo) | English, Grade 10 | Tests covered “a range of learning outcomes and assessment standards” | - | 90 min
ISASA | English, Grade 10 | Vocabulary, comprehension, free writing | - | 30 min
ISASA | Mathematics, Grade 10 | Four learning outcomes | - | 30 min
ISASA | Critical reasoning, Grade 10 | Critical reasoning | - | -

The following observations can be made, based on the information presented in the table above and in the evaluation reports:

A variety of tests were used to assess student performance – even though the bulk of these sought to focus on performance in the Foundation (Grade 3 level) and FET Phases (Grade 10). The fact that so many different tests were used to assess performance makes it difficult to compare the results across the different studies – particularly where the tests differed in terms of curriculum coverage, the number of items and the duration of the tests (Grade 10 tests ranged in duration from 30 to 90 minutes).

Many of the evaluation reports do not provide enough information on test construction methodology to allow conclusions to be drawn about test reliability and validity. Many of the reports that were reviewed do not indicate whether the tests were piloted with sufficiently large groups of students prior to implementation, nor do they indicate how the evaluators determined the reliability of the various tests (for example, through the application of statistical tests of internal reliability). Similarly, where reports do not provide information on the scope (content/skill coverage) of a test, it is difficult to assess its validity. All of these factors play a part in determining how useful the test data is for decision-making purposes. In principle, the processes governing the development and administration of all of the tests used in these studies should have conformed to established practices for large-scale, standardised testing. Unfortunately, based on the information presented in the reports, it is difficult for the reader to assess whether this was indeed the case.

The level of detail about student performance varied between reports, with some providing only global scores (the mean score on the entire test) and others disaggregating data by learning outcome or skill; in some cases, the data was also disaggregated by question type (reasoning/word sum/context-based problem). Where only global scores are reported, it is difficult to use the data for diagnostic purposes or to track subtle changes in performance by content or cognitive domain.

Finally, the lack of detail on test structure and project inputs makes it very difficult to determine whether there was a relationship between the focus of the training-based interventions and the domains in which changes in performance were recorded. If a project seeks to improve teachers’ content knowledge in a particular area (e.g. calculus) and their ability to convey this content to learners, to what extent does the test used assess changes in performance in this domain?

Collection of qualitative data

The following table provides a short summary of the key issues addressed by the evaluations:

Table: Project dimensions evaluated

Element evaluated                                               No. of evaluations
Enjoyment of / response to training                             2
Changes in teacher subject content knowledge (teacher testing)  0
Changes in teacher practice                                     6
Changes in learner performance                                  9
Attitudes to mathematics and/or classroom environment           3

On the whole, the evaluations reviewed showed a shift towards the collection of quantitative data, particularly when compared with the evaluation methodologies used prior to 2006. In all but one of the examinations of teaching practices, the reports provided narrative descriptions of the lessons observed. While this provides rich descriptions of what took place in the various lessons, the following limitations are noted:


All but one24 of the evaluations failed to consistently investigate the central behaviours or practices which the Foundation considers to be contributing to low levels of learner performance (e.g. curriculum coverage, questioning styles, etc.).

Similarly, many evaluations did not make formal linkages between the focus of lesson observations and the training that was provided by service providers. This makes it more difficult to make clear claims of attribution, linking changes in practice with the intervention.

In some studies the descriptions of various lessons did not maintain a consistent focus on particular practices – this makes it difficult to use a standard lens or approach to assess how teachers were teaching and the consistency with which new behaviours were introduced.

In some evaluations, qualitative data was collected by service providers25. While this practice may reduce costs, it also introduces a number of methodological limitations, including the risk of bias, whether introduced in order to demonstrate impact or arising from the theoretical orientation of the service provider.

Reporting of evaluation results

Standard 7: Reports should provide a clear overview of the project being evaluated

One of the themes running through the discussion of project evaluations is the need to draw explicit linkages between the intervention and the evaluation, both in terms of the implicit assumptions about change which underlie both and in relation to the measurement of the impact of particular inputs. In order to ensure that the reader has a sound understanding of what a project entailed, it is very useful for an evaluation report to provide a concise overview of the project that includes the following information:

o Project duration;
o Assumptions about the development problem that the project wishes to address, and how the project implementers link their services (inputs) with the desired results (outcomes);
o Overview of inputs provided; and
o Project beneficiary profile.

Standard 8: Reports should clearly outline the methodology used

One of the ways in which an evaluator can demonstrate that the evaluation methodology is reliable and valid (see standard 6) is to include a sufficiently detailed chapter on the methodology used. If necessary, this can be appended to the report.

Not all of the evaluations reviewed during the course of this study provided sufficiently detailed methodologies (although all reports did contain a short section on the methodology used). Areas in which sufficient detail was not provided include:

24 The evaluation of the MIET-led project in the North West province investigated most of the practices which are identified as contributing to low levels of learner achievement, as articulated in the Foundation’s strategy document. However, this study did not link practices with changes in learner performance – possibly because the research design was not set up in a way that facilitated this analysis.
25 This practice was reported in two evaluations: the ELET and Maths Centre projects in Limpopo.


o Sample size and composition, and the rationale guiding the construction of the sample;
o Indicator frameworks;
o Instrumentation used in the collection of qualitative data;
o Efforts to ensure inter-rater reliability during the collection of qualitative data;
o Test construction (scope, coverage, duration);
o Test administration (standardisation of test procedures); and
o Item reliability (tests of item reliability, reliability coefficients for the test as a whole, information on piloting and standardisation of tests, equating of test items where different tests were used to measure change in performance).

Facilitating the utilisation of evaluation findings

Utility is one of the most commonly applied evaluation standards used by international agencies and professional evaluation associations. Good quality evaluations should assist project implementers and funders to take decisions about the project that has been evaluated and to inform the design and implementation of future projects.

Standard 9: The evaluation should draw attention to lessons learnt and highlight promising practices

Ideally, the evaluation report should assist the funder and relevant service providers by summarising lessons learnt about project design, implementation and impact. At the same time it should highlight promising practices that are associated with higher levels of project impact. The extent to which the evaluation reports achieved this is considered

Standard 10: The evaluation should provide clear recommendations to inform future action

Utilisation-focussed evaluations should conclude with a set of clear recommendations that can be used to guide and support future decision making. The review of the various reports noted that the quantity and specificity of the recommendations made at the end of the report varied significantly (with the number of recommendations ranging from zero to 19).

The extent to which the Foundation is able to use evaluations to shape practice is very strongly determined by the extent to which evaluators are willing to make specific recommendations on changes which could be made to project design and delivery strategies. From the review of the reports it is not clear to what extent the Foundation’s expectations about the focus or specificity of recommendations were communicated to the evaluators and whether the paucity of recommendations in some reports is a function of the Terms of Reference which were issued to them26.

26 This comment is made in light of the fact that the extent to which individual evaluators provide specific recommendations was not consistent and therefore cannot simply be attributed to the professional preferences of the individual.


Suggested strategies to enhance the quality and utility of future evaluations

In addition to being able to apply the 10 standards listed above to the planning and implementation of various evaluations, there are a number of strategies that the Foundation may wish to consider when commissioning project evaluations:

Standardisation of evaluation approaches and tools

Given the Foundation’s interest in examining the impact/ outcomes achieved by projects that contribute to similar strategic objectives, it may be worth exploring the degree to which certain elements of project evaluations could be standardised through the use of common methodologies. One way in which this could be achieved is through the development of an evaluation framework for projects of a similar nature that specified core elements that should be included. Some examples of common elements are:

o Project description (duration, number of beneficiaries, detailed description of inputs provided);

o Evaluation methodology (providing the rationale for sampling strategies, instruments used, detail on test construction, measures taken to ensure test reliability and validity);

o (Where relevant) Descriptions of teaching practice that pay special attention to issues of curriculum coverage, effective use of questions to assess student understanding, quality of in-school assessment practices.

o (Where relevant) Learner performance data that is disaggregated by learning outcome (or skills and/or sub-skills), cognitive domain (recall, application, reasoning, single-step and multiple-step procedures), grade level (differentiating foundational skills and grade-level content), question type (multiple choice and free response; context-embedded questions).
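The disaggregation described in the last element amounts to grouping item-level results before averaging. As a minimal sketch (the domain labels and facility values below are made up for illustration, not drawn from any project data):

```python
from collections import defaultdict

# Hypothetical item-level results: (cognitive domain, proportion of learners correct)
item_results = [
    ("recall", 0.72), ("recall", 0.65),
    ("application", 0.48), ("application", 0.51),
    ("reasoning", 0.23), ("reasoning", 0.31),
]

# Group item facilities by cognitive domain
by_domain: dict[str, list[float]] = defaultdict(list)
for domain, p_correct in item_results:
    by_domain[domain].append(p_correct)

# Report the mean facility per domain rather than a single global score
for domain, values in by_domain.items():
    print(f"{domain}: mean facility = {sum(values) / len(values):.2f}")
```

Reporting at this level of granularity is what makes the data usable for diagnostic purposes.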

Invest time and resources in evaluation scoping

The scope of an evaluation is determined by jointly assessing the information needs of the Foundation and the service provider, in order to establish what information each party requires to strengthen future project design and implementation. Clarifying information needs alongside the evaluator can have a number of benefits, including the development of a shared understanding of the purpose of the evaluation and the establishment of a clear framework within which the evaluation should take place, and against which the evaluator can then be held accountable.

It is therefore recommended that the Foundation build into the cost of each evaluation time for the evaluator to meet with the Foundation and the service provider(s) to:

o Identify the information needs of each party;
o Obtain detailed, specific information on the inputs that will be provided by the service provider; and
o Develop a shared indicator framework that reflects the changes to teaching practice and learner behaviours (and performance) that the project seeks to bring about.


Strengthen the systematic collection of monitoring data that can be used in the attribution of project impact

The systematic collection and recording of data on service delivery and participation statistics that has been incorporated into the management and evaluation of the School Development Programme marks a very positive innovation. One of the weaknesses noted in various evaluations was the lack of information on project delivery and the failure of many evaluations to link participation in the project (dosage) with effect sizes (i.e. changes in teaching practice, materials utilisation or learner achievement). One of the ways in which future evaluation designs could be strengthened would be to extend the SDP model of recording service delivery to other projects. It would also be very useful if the Foundation’s project monitoring system could include information on the content covered in training sessions, in order to facilitate the creation of more explicit links between changes in teacher behaviour and learner performance, and the intervention.
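The dosage–impact linkage described above can be sketched in a few lines. The following is purely illustrative (the scores and attendance bands are invented, not drawn from the SDP): a standardised mean difference is computed separately for each attendance band, so that effect size can be compared across levels of participation.

```python
import statistics

def cohens_d(treatment: list[float], control: list[float]) -> float:
    """Standardised mean difference (Cohen's d, pooled standard deviation)."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical post-test scores, grouped by training sessions attended (dosage)
control = [41.0, 38.0, 45.0, 40.0, 36.0]       # teachers who received no training
low_dosage = [42.0, 44.0, 39.0, 43.0, 41.0]    # attended fewer than half the sessions
high_dosage = [51.0, 47.0, 53.0, 49.0, 50.0]   # attended most sessions

for label, group in [("low dosage", low_dosage), ("high dosage", high_dosage)]:
    print(f"{label}: d = {cohens_d(group, control):.2f}")
```

An analysis of this kind is only possible if per-participant attendance records of the kind kept for the SDP are collected routinely.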

Promote the systematic interrogation of evaluation findings and recommendations through the institution of structured evaluation review tools

One of the ways in which the utilisation of evaluations can be promoted is through the creation of a system that records each recommendation made by evaluators, along with the Foundation’s assessment of its relevance to the Foundation’s work, the degree to which the Foundation endorses or accepts it, and a short discussion of the way in which it will be acted upon. This would help formalise organisational learning from evaluations and would provide a structured record of the ways in which evaluation findings and recommendations have informed the work of the Foundation.
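One lightweight way to formalise such a record is a structured entry per recommendation. The field names and example content below are purely illustrative, not a description of any existing Foundation system:

```python
from dataclasses import dataclass

@dataclass
class RecommendationReview:
    """A single evaluator recommendation and the Foundation's response to it."""
    evaluation: str           # which evaluation report the recommendation comes from
    recommendation: str       # the recommendation as stated by the evaluator
    relevance: str            # assessed relevance to the Foundation's work
    decision: str             # "accepted", "partially accepted" or "rejected"
    planned_action: str = ""  # short discussion of how it will be acted upon

register: list[RecommendationReview] = []
register.append(RecommendationReview(
    evaluation="Hypothetical teacher-development project, 2010",
    recommendation="Collect per-session attendance data for all participants.",
    relevance="High: supports dosage/impact analysis across projects.",
    decision="accepted",
    planned_action="Extend the SDP attendance-recording model to this project.",
))

accepted = [r for r in register if r.decision == "accepted"]
print(len(accepted))
```

Even a simple register of this shape makes it possible to audit, over time, which recommendations were acted upon and why others were set aside.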
