
Data Quality Assessment Framework

(DQAF)

MALAWI

21 June – 1 July 2011

Marc BERNAL, UIS Regional Advisor for Sub-Saharan Africa Chris VAN WYK, Stellenbosch University, South Africa Ndèye Yacine FALL, Assistant Programme Specialist, UIS Dakar


List of Acronyms

CDSS       Community Day Secondary School
CSR        Country Status Report
CSS        Conventional Secondary Schools
DEM        District Education Managers
DEMIS      District Education Management Information Systems
DQAF       Data Quality Assessment Framework
DIAS       Directorate of Inspection and Advisory Services
ECD        Early Childhood Development
ED*ASSIST  Education Automated Statistical Information Systems Toolkit
EMIS       Education Management Information Systems
ESIP       Education Sector Implementation Plan
GDDS       General Data Dissemination System
GER        Gross Enrolment Rate
HRMIS      Human Resource Management Information System
IMF        International Monetary Fund
JCE        Junior Certificate Examination
LAN        Local Area Network
M&E        Monitoring and Evaluation
MANEB      Malawi National Examinations Board
MGDS       Malawi Growth and Development Strategy
MoEST      Ministry of Education, Science and Technology
MSCE       Malawi School Certificate Examination
NER        Net Enrolment Rate
NESP       National Education Sector Plan
NSO        National Statistics Office
NSS        National Statistical System
PAS        Primary Assessment Service
PEA        Primary Education Advisors
PIEQM      Project to Improve Education Quality in Malawi
PIF        Policy and Investment Framework
PSLE       Primary School Leaving Examination
RDMS       Relational Database Management Systems
SACMEQ     Southern African Consortium for Monitoring Educational Quality
SADC       Southern African Development Community
TEVETA     Technical, Entrepreneurial and Vocational Education and Training Authority
TVET       Technical and Vocational Education and Training
UIS        UNESCO Institute for Statistics


TABLE OF CONTENTS

1. BACKGROUND
2. METHODOLOGY USED FOR THE DQAF
3. GOALS AND OBJECTIVES OF DQAF ASSESSMENT
4. OVERVIEW AND FUNCTIONAL STRUCTURE OF THE EDUCATION SYSTEM IN MALAWI
4.1. Education System in Malawi
4.2. Early Childhood Development (ECD)
4.3. Primary Education
4.4. Secondary School
4.5. Technical and Vocational Education and Training (TVET)
4.6. Teacher Education (Primary and Secondary)
4.7. University Education
4.8. Key Indicators to Monitor the Education System
5. DATA COLLECTION PROCESSES IN THE MoEST
5.1. Data Collection Process for Annual Survey of Primary and Secondary Schools
5.2. Data Collection Process for Annual Survey of Teacher Training Colleges
5.3. Data Collection Process for Annual Survey of TVET
5.4. Data Collection Process for Annual Survey of Higher Education
5.5. Data Collection Process for ECD
6. RECOMMENDATIONS
6.1. Legal Environment
6.2. Strengthen the Link and Collaboration with the NSO
6.3. Standardised Procedures to Complete the Questionnaires at Institutional Level
6.4. Standardised Procedures to Complete the School Registers
6.5. Data Audits
6.6. Information Systems
6.7. Data Processing Schedule Optimisation
6.8. Improving Documentation and Transparency
6.9. Data Quality Control
6.10. Data Collection at the Beginning of the School Year
6.11. School Registration Unit
6.12. Next Step: Preparation of Action Plan
7. DATA QUALITY ASSESSMENT FRAMEWORK
7.0. Prerequisites of Quality
7.0.1. Legal and institutional environment
7.0.2. Resources
7.0.3. Relevance: education statistics cover relevant information
7.0.4. Quality awareness: Quality is a cornerstone of statistical work
7.1. Integrity: The Principle of Objectivity is Firmly Adhered to in the Collection, Processing and Dissemination of Statistics
7.1.1. Professionalism: Statistical policies and practices are guided by professional principles


7.1.2. Transparency: Statistical policies and practices are transparent
7.1.3. Ethical standards: Policies and practices are guided by ethical standards
7.2. Methodological Soundness: The Methodological Basis for the Statistics Follows Internationally Accepted Standards, Guidelines or Good Practice
7.2.1. Concepts and definitions: Concepts and definitions used are in accord with standard statistical frameworks
7.2.2. Scope: The scope is in accord with internationally accepted standards, guidelines or good practice
7.2.3. Classification/sectorisation: Classification and sectorisation systems are in accord with national and internationally accepted standards, guidelines or good practice
7.2.4. Basis for recording: Data is recorded according to internationally accepted standards, guidelines or good practice
7.3. Accuracy and Reliability: Source Data and Statistical Techniques are Sound and Statistical Outputs Sufficiently Portray Reality
7.3.1. Source data available provide an adequate basis to compile statistics
7.3.2. Assessment of source data: Source data are regularly assessed and validated
7.3.3. Statistical techniques: Statistical techniques employed conform to sound statistical procedures, and are documented
7.3.4. Assessment and validation of intermediate data as well as statistical outputs are regularly assessed and validated
7.3.5. Revision studies: Revisions, as a gauge of reliability, are tracked and mined for the information they may provide
7.3.6. Archiving of source data and statistical results
7.4. Serviceability: Statistics with adequate periodicity and timeliness are consistent
7.4.1. Periodicity and timeliness: Periodicity and timeliness follow internationally accepted dissemination standards
7.4.2. Consistency: Released statistics are consistent within a dataset and over time, and with other major datasets
7.5. Accessibility: Data and Metadata are Easily Available and there is Adequate Client (User) Support
7.5.1. Metadata accessibility: Up-to-date and pertinent metadata are made available
7.5.2. Assistance to users: Prompt and knowledgeable support service is available
8. CONCLUSION
9. Appendix A: List of Relevant References and Documents
10. Appendix B: List of Persons Met


1. BACKGROUND

Education Management Information Systems (EMIS) were established as one of the priorities of the African Union's action plan for the Second Decade of Education for Africa, as well as of the education programme of the Southern African Development Community (SADC). The Government of Malawi recognises the importance of establishing mechanisms to collect, compile and disseminate statistics so that it can meet the ever-increasing demand for quality statistics. The National Statistical System (NSS) was launched in November 2006 to oversee the development and implementation of a comprehensive and well-coordinated system for producing, processing and disseminating official statistics. The NSS exists to support development initiatives that require good statistics to monitor and evaluate the progress of implementation. The role of EMIS in evidence-based policy formulation and results-based decision making is highlighted in the NSS.

The monitoring and evaluation of the National Education Sector Plan (NESP), the Education Sector Implementation Plan (ESIP) and the Policy and Investment Framework (PIF) depends on the availability and accessibility of quality education statistics, and EMIS plays a key role in this regard. To this end, the Ministry of Education, Science and Technology (MoEST) has established the M&E Unit for the development of a national strategy, the definition of monitoring outcomes and advocacy of the importance of monitoring and evaluation interventions in order to improve the delivery of quality education. The strategic placement of the M&E Unit within the same directorate as EMIS will not only help to foster the relationship between these two core functional areas, but also indicates the importance placed on the value of data in the monitoring of outcomes. In the Project to Improve Education Quality in Malawi (PIEQM), which forms part of the Education Sector Implementation Plan (ESIP), EMIS is regarded as the main instrument for monitoring outcomes and outputs.

In order to support the development of an effective EMIS across sub-Saharan Africa, the UNESCO Institute for Statistics (UIS) is in the process of conducting diagnostic assessments of national education statistics systems, in line with its Mid-Term Strategy and within the global context of UNESCO's support to the African Union Second Decade of Education. The ensuing report summarises the findings of a situation analysis of the quality of education statistics in Malawi and proposes recommendations for improvement.

At this stage it is appropriate to define and gain an understanding of what is meant by EMIS within the context of a national ministry of education. A good definition is that of UNESCO (2011), which asserted: “A solid information system should not only aim to collect, store data and process information but help in the formulation of education policies, their management and their evaluation.” In our assessment of the quality of education statistics in Malawi, we concurred with this definition and soon realised that the production of quality education statistics should not be thought of as a technical or operational process only. It includes important elements such as compliance and accountability through the legislation and policy framework of the organisation; adequate human resource allocation and the provision of training; sufficient budget allocation; integrated processes


through the collaboration and cooperation of relevant units and sections; an integrated management information system; and proper usage of and demand for quality data through monitoring and evaluation. This integrated approach plays an important role in the production of quality education statistics and will be outlined with specific reference to the situation in Malawi.

2. METHODOLOGY USED FOR THE DQAF

In the information age, quality data is essential for evidence-based decision making, and in this exercise the quality of education statistics is addressed in various ways. The main methodology used was semi-structured interviews with key stakeholders, including government officials and donor agencies, together with a review of the relevant documents. The results show that there is a need for quality data, and this need is recognised in legislation, education policy and key education documents, which is encouraging, although at the same time there is room for strengthening the data processes at various levels.

The assessment process was guided by the Data Quality Assessment Framework (DQAF). This methodology was originally developed by the International Monetary Fund (IMF) and developed further by UIS and the World Bank for an education context. The UIS is currently developing the framework further into a complete methodology for assessing national education statistics systems. The DQAF for Malawi was planned through a process of collaboration between UIS and the MoEST in Malawi. The fieldwork and on-site situation analysis for Malawi were carried out from 21 June to 1 July 2011. Data was collected through:

Interviews: Semi-structured interviews were held with key stakeholders within the MoEST, such as officials at district and institutional level, as well as with other government agencies in charge of data, in particular the National Statistics Office (NSO), and with other development partners involved in the collection, processing and use of data. Appendix B lists all the groups and persons who were interviewed during the field visits. The interview sessions were the largest and most valuable single source of qualitative information collected.

Roundtable meeting: A roundtable meeting was held with the Secretary of the MoEST on 1 July 2011, during which the preliminary findings of the country visit were presented and discussed and additional information was gathered.

Archival analysis: This observational method was used to examine the accumulated documents as part of the research method to enhance the report. The documents included, but were not limited to, promulgated Acts, policies, official publications, strategic plans of the agencies, and the questionnaires used to collect data. A list of the documents that form the basis of the analysis is attached as Appendix A.


Analysis of the database: A basic analysis of the database was done for consistency and accuracy, to discover trends and relationships in the data that was collected, and to inform the investigation.

The mission was conducted by Marc Bernal, UIS Regional Advisor, Chris Van Wyk, private consultant from the University of Stellenbosch, and Ndèye Yacine Fall, Assistant Programme Specialist, UIS Dakar. For most of the interviews the team was accompanied by an official of the MoEST. The analysis and this report were contingent on the cooperation, consultation and input of national and sub-national staff of the MoEST and agencies, as well as school staff and other partners. Representatives from these organisations were more than helpful, and sincere thanks are extended to them for their assistance. We would like to thank all of those who participated in the interview sessions associated with this initiative.

3. GOALS AND OBJECTIVES OF DQAF ASSESSMENT

The aim of the mission was to develop a realistic picture of the practical state of education statistics in Malawi at the institutional, regional and national levels from an implementation and application point of view, and to establish how consistent it is with what the DQAF proposes. This picture, in turn, informs the recommendations made for improving education statistics in Malawi. As such, the objectives of the situational analysis are the following:

- To develop an accurate picture of the availability, level and extent of the use of education statistics in Malawi
- To identify and understand the challenges that headquarters and regions face in their drive to optimally implement EMIS in the production of education statistics
- To identify gaps in the current situation and key priorities for future development through the DQAF
- To identify best practices that can be recommended and used in other countries
- To put forward recommendations to the MoEST on ways to improve education statistics in Malawi


4. OVERVIEW AND FUNCTIONAL STRUCTURE OF THE EDUCATION SYSTEM IN MALAWI

4.1. Education System in Malawi

The revised Policy and Investment Framework (PIF) of January 2001 outlines the basic structure of the education system in Malawi, addresses the main challenges it faces, and provides an implementation strategy for improving education quality. The education system of Malawi has an 8-4-4 structure, with primary, secondary and tertiary education as the main components. The official age of entry to school education is six years, although there is no universal registration of births in Malawi. This structure of the education system forms the basis for the NESP, its implementation and the accompanying indicators, and will be discussed as such. The introduction of free primary education in Malawi has increased enrolment and placed a high demand on infrastructure and resources. Early childhood development (ECD) is offered and forms a vital part of the education system, but it falls under the jurisdiction of the Ministry of Gender, Children and Community Development. Figure 4-1 below shows the education structure in Malawi diagrammatically.

Figure 4-1: Education structure in Malawi (Source: Malawi Education Country Status Report, CSR 2008/09)

4.2. Early Childhood Development (ECD)


ECD in Malawi, from age 0 to age 5, falls under the Ministry of Gender, Children and Community Development. Although ECD programmes are not yet standardised, according to the Annual Report on ECD the Government of Malawi is making great strides in establishing ECD services in the country. This is manifested by the development and implementation of the National ECD Policy, a syllabus, a training manual, programme documents, a Parents' and Caregivers' Guide, and ECD Centre Guides.

4.3. Primary Education

The duration of primary school is eight years, from Standard 1 to Standard 8, and it ends with the Primary School Leaving Certificate (PSLC), which determines eligibility for entry into secondary school. The master list of schools obtained from the MoEST shows the number of primary schools in Malawi in 2010. It is worth noting that the number of primary schools increased from 5 590 in 2008 to 5 718 in 2010, as indicated in Table 4.1 below.

District Name     2010   2009   2008
Balaka             160    160    160
Blantyre City      172    171    148
Blantyre Rural     178    174    170
Chikwawa           177    175    175
Chiradzulu          88     88     88
Chitipa            171    171    171
Dedza              224    220    218
Dowa               241    241    241
Karonga            170    169    169
Kasungu            333    333    331
Likoma              10     10     10
Lilongwe City      136    132    114
Lilongwe Rural     202    200    196
Lilongwe Rural     238    238    238
Machinga           166    166    165
Mangochi           262    261    256
Mchinji            202    201    200
Mulanje            171    168    166
Mwanza              49     49     49
Mzimba North       252    251    251
Mzimba South       290    288    287
Mzuzu City          65     65     62
Neno                69     69     68
Nkhata Bay         189    189    185
Nkhotakota         165    164    164
Nsanje             111    111    109
Ntcheu             240    240    239
Ntchisi            139    139    139
Phalombe            91     91     86
Rumphi             173    173    165
Salima             140    137    136
Thyolo             205    204    203
Zomba Rural        202    198    197
Zomba Urban         37     37     34
TOTAL             5718   5683   5590

Table 4.1: Number of primary schools per district


Source: MoEST database (2008‐2010) 

4.4. Secondary School

The duration of secondary school is four years, from Form 1 to Form 4. The community day secondary schools (CDSS) and the conventional secondary schools (CSS) accommodate public secondary students. The community day secondary schools require the payment of a school fee for tuition, a general purpose fund, a development fund and a textbook revolving fund. After two years of secondary education, students write the national Junior Certificate of Secondary Education (JCSE), followed two years later by the Malawi School Certificate Examination (MSCE). The number of secondary schools increased from 1 394 in 2008 to 1 519 in 2010 [1], as indicated in Table 4.2 below.

[1] Note: The total number of secondary schools recorded in the 2008-2010 Bulletins appears to include only those schools that submitted their survey questionnaires, i.e. the schools in the "gdb:SchoolData" table, and not the schools in the "gdb:school" table, which appears to be the master list of secondary schools as recorded in Table 4.2.


District Name     2010   2009   2008
Balaka              49     49     47
Blantyre City      106    104     90
Blantyre Rural      63     61     60
Chikwawa            43     40     40
Chiradzulu          36     36     37
Chitipa             31     30     27
Dedza               53     51     48
Dowa                53     53     51
Karonga             36     32     32
Kasungu             62     61     60
Likoma Island        4      4      3
Lilongwe City       85     81     77
Lilongwe Rural      41     41     40
Lilongwe Rural      46     45     45
Machinga            46     44     40
Mangochi            56     56     52
Mchinji             41     42     40
Mulanje             50     49     45
Mwanza              20     19     24
Mzimba North        63     62     61
Mzimba South        61     60     59
Mzuzu City          31     30     30
Neno                16     15     10
Nkhata Bay          61     48     45
Nkhotakota          37     37     34
Nsanje              29     29     25
Ntcheu              57     57     51
Ntchisi             19     19     19
Phalombe            21     20     19
Rumphi              41     42     42
Salima              28     27     26
Thyolo              48     47     40
Zomba Rural         27     23     21
Zomba Urban         59     58     54
TOTAL             1519   1472   1394

Table 4.2: Number of secondary schools per district
Source: MoEST database (2008-2010)
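The note to Table 4.2 flags a discrepancy between the "gdb:school" master list and the "gdb:SchoolData" returns table. A check of that kind can be done mechanically. The sketch below is illustrative only, assuming the two tables have already been exported from the EMIS database; the identifiers and school names are invented, and only the two table names come from the note above.

    # Illustrative sketch: compare the master list of schools ("gdb:school") with
    # the schools that actually submitted a questionnaire ("gdb:SchoolData").
    # The identifiers and names below are invented for the example.

    master_list = {                      # gdb:school - one entry per registered school
        "SEC-0001": "Balaka CDSS",
        "SEC-0002": "Blantyre City CSS",
        "SEC-0003": "Chikwawa CDSS",
        "SEC-0004": "Zomba Urban CSS",
    }
    returned = {"SEC-0001", "SEC-0002", "SEC-0004"}   # gdb:SchoolData - returns received

    outstanding = sorted(set(master_list) - returned)
    response_rate = len(returned) / len(master_list)

    print(f"Response rate: {response_rate:.1%}")      # 75.0%
    for school_id in outstanding:
        print(f"No questionnaire received: {school_id} ({master_list[school_id]})")

A comparison of this kind makes explicit both the response rate and the list of non-responding schools, which is the information the bulletin totals currently conflate.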

4.5. Technical and Vocational Education and Training (TVET)

Technical and vocational education requires relevant and effective governance and management systems, and the Technical, Entrepreneurial and Vocational Education and Training Authority (TEVETA) is the implementing agency of TVET. In relation to this, there is a need to review the TEVET Act so that issues affecting vocational training and public-private partnership policies are taken on board. According to the Malawi Education Country Status Report (CSR 2008/09), data concerning Technical and Vocational Education and Training (TVET) are scarce in Malawi. The Country Status Report also states that the “TEVET landscape in Malawi is highly diverse, fragmented and uncoordinated with multiple private and public provider systems. Access to the regular TEVET programmes, regulated and administered by the TEVET Authority (TEVETA) and provided mainly as 4-year apprenticeship training, is very low compared to the demand”. The following is a synopsis of the different TEVET provider types in Malawi (Source: Country Status Report):

TEVET in public Technical Colleges:


In seven public Technical Colleges (TCs) in Malawi, pre-employment training courses form the core of the formal public training supply. These courses can be divided into "regular" programmes (regulated by TEVETA) in the form of apprenticeships and "parallel" programmes offered by the public Technical Colleges.

Private provision: TEVET provided by NGOs and private commercial schools:
These are the largest providers of TEVET, although the total number of institutions and their enrolment is unknown. Training institutions cater for a range of training fields, and many institutions offer formalised courses of one to four years following recognised curricula and geared towards trade tests, Malawi Craft/Advanced Craft, City and Guilds and other qualifications that are formally or informally recognised in the labour market.

Providers of sector-specific training:
These are mainly public training institutions providing specialised training. Examples include the Malawi Institute of Hospitality (MIT), the Marine Training College, Police Training Schools, and others.

TEVET for special target groups:
Public, parastatal or NGO training provision caters for special target groups such as small and micro businesses and start-ups, the handicapped, etc.

Traditional apprenticeship:
Traditional apprenticeship, also called master craftsman training, is a well-established and widespread system of on-the-job training provided in the informal sector.

According to the Country Status Report, enrolments in regular apprenticeship programmes as of 2007 were as follows:

Carpentry and joinery                 302
Welding and fabrication                92
General fitting                       181
Auto electrical                        39
Electrical installation                84
Painting and decoration                60
Plumbing                               92
Printing                               33
Refrigeration and air conditioning     14
Vehicle body repair                    41
Woodwork machining                     16
Motor vehicle mechanics                79

4.6. Teacher Education (Primary and Secondary)


According to the NESP, teacher education in Malawi addresses two key areas, namely primary and secondary school teacher needs. The training of primary school teachers currently follows a mode of training that entails one year in college and one year in practice. The requirement for training as a teacher is a minimum of a JCE certificate, but preferably an MSCE certificate with a pass in English and Mathematics. There are six primary school teacher training colleges, all affiliated to the MoEST. The training of secondary school teachers comprises two levels, a diploma and a degree programme. The University of Malawi trains secondary school teachers to degree level, and the Domasi Institute offers a four-year educational programme for secondary school teachers.

4.7. University Education

Malawi has four major universities. Admission to university is for students who have completed and passed secondary school examinations. University education has two stages. An undergraduate degree takes four years, except for degrees in medicine, engineering and journalism, which take five years. A postgraduate degree in management is offered at the University of Malawi.

4.8. Key Indicators to Monitor the Education System

Using the structure of the education system as a guide, the NESP identified the following indicators in order to improve the quality of education. These indicators are contingent on quality data: data that is complete, relevant, accurate, timely and accessible. A brief illustration of how indicators of this kind are derived from survey counts follows the lists below.

Primary Education: 

- Reduce dropout rate from 14.3% to 5%
- Reduce repetition from 18% to 5%
- Improve distribution of teachers in rural areas from one qualified teacher per 90 pupils to at least 1:70
- Improve the survival rate of pupils to Standard 5 from 53% to 75%
- Increase the survival rate from 29.6% to 60% by Standard 8

 

Secondary Education  

- Increase secondary enrolment and participation, by girls in particular, to at least 50%
- Improve throughput for the MSCE from 38.6% in 2006 to at least 65%
- Increase the teaching staff to student ratio in community day secondary schools from 1:104 to 1:60
- Reduce the overhead costs of secondary education as a result of increased enrolment, and reduce the boarding subsidy

Teacher Education  

- Increase the supply of teachers, with a bias towards increasing the female throughput for both primary and secondary schools, by at least 35%
- Mainstream special needs education in at least half of the teacher training colleges


- Institutionalise in-service training (INSET)/continuous professional development for teachers in the education system
- Increase and rationalise the use of teaching staff

Technical, Vocational Education and Training 

- Increase enrolment, with a bias towards increasing the intake of females in non-traditional areas
- Reduce overhead costs of running colleges
- Rationalise teaching staff in line with relevant training requirements

Higher Education 

- Double enrolment
- Reduce overhead costs from 185 US dollars to 65 US dollars or below
- Increase and rationalise staffing levels with appropriate qualifications from 20% to 75%
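To make the arithmetic behind indicators of this kind concrete, the sketch below derives two commonly monitored figures, the pupil to qualified-teacher ratio targeted above and the gross enrolment rate (GER) listed among the report's acronyms, from annual-survey counts. All figures are invented for illustration and are not Malawi data.

    # Illustrative calculation of two standard education indicators from
    # annual-survey counts.  All figures below are invented, not Malawi data.

    enrolment = 3_600_000              # pupils enrolled in primary, all ages
    school_age_population = 3_000_000  # official primary-school-age population
    qualified_teachers = 45_000        # qualified primary teachers

    gross_enrolment_rate = 100 * enrolment / school_age_population
    pupil_qualified_teacher_ratio = enrolment / qualified_teachers

    print(f"Gross enrolment rate: {gross_enrolment_rate:.1f}%")  # can exceed 100% with over-age pupils
    print(f"Pupils per qualified teacher: {pupil_qualified_teacher_ratio:.0f}")

Every input to such a calculation comes from the data collection processes described in the next section, which is why their quality matters for monitoring the NESP targets.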

5. DATA COLLECTION PROCESSES IN THE MoEST

The structure of the education system, as discussed in section 4 above, will be used to describe the data collection processes within the MoEST and to determine the data requirements for the Education Sector Implementation Plan (ESIP). Such an approach serves as a monitoring and evaluation mechanism to determine whether the data is available according to the NESP and ESIP requirements for achieving the Malawi Growth and Development Strategy (MGDS).

The EMIS Unit is the only section that collects data in the MoEST, and it is acknowledged as such by all other departments in the ministry. This approach ensures that only a single data source is available, which increases the integrity of the data. As custodian of the data in the MoEST, the EMIS Unit is responsible for the proper management of the data that is produced and should ensure:

- The secure storage of the data
- The implementation of business rules
- The technical environment, processes and database structure
- That access to the data is authorised and controlled
- The maintenance and updating of documentation, data definitions and metadata, and the compilation of manuals, forms, etc.
- The identification or definition of data collection procedures and standards

The report will assess how these points are implemented. EMIS collects data pertaining to the school system once every year. EMIS takes responsibility for the dissemination, collection and capturing of data from the survey instruments of primary and secondary schools.


However, EMIS uses a decentralised approach to get support from other departments, such as Higher Education and TVET, to distribute and collect the questionnaires from their respective institutions. To better understand the process in the value chain and how it is executed within the MoEST, we follow the EMIS life cycle [2].

[2] The EMIS life cycle concept is commonly used to formalise and document the data collection and results production process. It usually encompasses the following phases: information needs identification, data source inventory and data collection instruments, data collection, data processing, data analysis and reporting, and publication and dissemination. It can be adapted according to the context.

5.1. Data Collection Process for Annual Survey of Primary and Secondary Schools

Compilation of the questionnaire: The EMIS Unit is mainly responsible for the compilation and design of the survey questionnaire, but circulates it to other sections in the MoEST for their input. However, it must be noted that any change to the form entails a cumbersome and lengthy process, because changes to the software are made through an external service provider.

Review: The survey questionnaire is reviewed every three years. The M&E Unit is conveniently placed in the same directorate as EMIS and holds regular meetings to determine the need for new elements on the form to facilitate the monitoring and evaluation of specific indicators, as required. Through the sector-wide approach, development partners and the NSO are consulted through regular meetings.

Calendar: A data collection calendar that covers all the steps is drawn up and sent to all the relevant role players. The EMIS head writes to all districts in December to inform them about the dates (the calendar was not available during our investigation). These steps are indicated on the questionnaire, with space for each responsible person to sign, so a specific tracking process is in place.

Dissemination of the questionnaire: In preparation for the dissemination of the questionnaires, labels are printed and applied to the forms, which are sorted according to district and zone. This process serves as a control measure to improve the distribution of questionnaires by ensuring that all institutions receive a form; a master list of schools from the EMIS database is used for this purpose. Due to a lack of funds, this practice was not pursued in the dissemination of the 2011 questionnaires.

Briefing session: In April the EMIS Unit meets with all the District Education Managers (DEM), Primary Education Advisors (PEA) and cluster leaders to brief them on the completion of the questionnaire, how to verify and validate the form, and when to complete and submit it. The PEAs, in turn, go to their respective zones (a zone comprises a number of primary schools in a district) and the cluster advisors go to their respective clusters (a cluster is a group of secondary schools) to brief the head teachers on how to complete the form and to disseminate the questionnaire to each school. The PEA coordinator and the cluster leader are responsible for getting the completed questionnaires back.


Completion of the questionnaire by schools: It was noted during our field visits that there is no standardised procedure at school level for completing and verifying the questionnaire. Some schools complete the questionnaire using school records such as the attendance register, while other schools simply ask the teachers to provide the figures. The site visits highlighted the need for additional training and for improved instructions in the questionnaire (see recommendation 6.3).

Verification: Limited instructions are provided on the questionnaire for verifying and comparing tables.

Forms returned: The questionnaires are returned to headquarters via the districts. The monitoring of returned forms only takes place during the data capturing process. The captured forms are compared with the master list of schools, using the quick count table on page 1, to determine the outstanding schools. According to the EMIS Unit, the response rate is based on this table. Point 7.3.1, "Source data available provide an adequate basis to compile statistics", discusses the likely response rate in primary and secondary schools.

Data capturing: The data entry process only starts once all forms have been received (refer to recommendation 6.7). Capturing takes place over a local area network (LAN) into a central database. Verification takes place as the forms are captured, and any discrepancies are followed up by telephone with the head teachers; for example, the number of new entrants cannot exceed the Standard 1 enrolment, and the number of orphans cannot exceed the school's total enrolment (checks of this kind are illustrated in the sketch below).

Data dissemination and publication: The aim of the EMIS Unit is to publish the data at least three weeks after the release. It takes about nine months to produce the bulletin, with all the EMIS processes taken into consideration. For the data dissemination process, a workshop is conducted to which the relevant directors and officials at national and district level are invited. The survey database and the bulletin are put on a CD for dissemination to the districts. According to the EMIS head, the DEMs at district offices have been trained in the past on how to use the database and how to write simple queries to extract specific information.

Data analysis: The analysis of EMIS data is one of the most important steps, yet it is often the one that is neglected. The EMIS Unit, together with the M&E Unit, should maximise the value of this dataset through the promotion of data analysis and research practices. It is to be noted that the M&E Unit, in collaboration with the EMIS Unit, has started a data analysis process through the development of a document on the "2010 key EMIS indicators analysis".

Data usage: The departments concerned with primary and secondary schools use the EMIS data in their day-to-day planning, among others for the procurement of textbook allocations, stationery purchases, the distribution of resources, teacher allocations, policy decisions, research and related projects, and to procure teaching and learning material.
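The consistency checks applied during data capturing can be expressed as simple record-level rules. The sketch below is illustrative only: the field names and the sample record are invented, and only the two example rules come from the description above.

    # Illustrative record-level checks of the kind applied during data capture:
    # new entrants cannot exceed Standard 1 enrolment, and orphans cannot exceed
    # the school's total enrolment.  Field names and sample data are invented.

    def validate_return(record: dict) -> list[str]:
        """Return a list of problems found in one school's questionnaire return."""
        problems = []
        if record["new_entrants"] > record["enrolment_std1"]:
            problems.append("new entrants exceed Standard 1 enrolment")
        if record["orphans"] > record["enrolment_total"]:
            problems.append("orphans exceed total enrolment")
        if record["enrolment_std1"] > record["enrolment_total"]:
            problems.append("Standard 1 enrolment exceeds total enrolment")
        return problems

    sample = {"school_id": "PRI-0042", "new_entrants": 210,
              "enrolment_std1": 180, "enrolment_total": 950, "orphans": 60}

    for issue in validate_return(sample):
        print(f"{sample['school_id']}: follow up by telephone ({issue})")

Writing the rules down in this form would also make it easier to document them, as recommended in 6.8.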


Monthly Returns for Primary and Secondary Schools: Data is also collected from schools at the regional level on a monthly basis, through what is called the Staff, Enrolment, Structure and Furniture Return. This return is used only at district level, for management purposes; no cross-checking is done to verify its consistency with the annual survey questionnaires.

Apart from the data collection processes for primary and secondary schools, the EMIS Unit collects data for other sub-sectors.

5.2. Data Collection Process for Annual Survey of Teacher Training Colleges

The EMIS Unit sends the questionnaire directly to both private and public institutions, which complete it and send it back to EMIS at headquarters. There is no database or data entry process, although the data is prepared and compiled for publication in the annual bulletin.

5.3. Data Collection Process for Annual Survey of TVET

The Technical, Entrepreneurial and Vocational Education and Training Authority (TEVETA) is the implementing authority for TVET in the country. The Director of TVET is the link to the EMIS Unit for managing the data collection processes for this sub-sector (for all programmes not under TEVETA). The TVET questionnaire is compiled by the EMIS Unit in collaboration with the TVET department. The questionnaire is sent to the institutions by the TVET directorate, which collects it from the institutions and sends it back to EMIS. The data is not captured in a database, but only compiled for publication purposes. TEVETA also collects its own data on "regular" programmes without any collaboration with the MoEST.


5.4. Data Collection Process for Annual Survey of Higher Education

The data collection process for higher education is decentralised to the directorate of higher education in the MoEST. The data collection takes place once a year, when the directorate sends the questionnaire directly to the institutions, which complete it and return it to the EMIS Unit. The data is not captured but only compiled for publication purposes.

5.5. Data Collection Process for ECD

The Ministry of Gender, Children and Community Development is the lead agency coordinating and overseeing the implementation of ECD activities. However, it seemed during our investigation that there was no sustained data collection process or any database available for ECD. An attempt is made to collect summary data for each district, with the help of a social welfare officer, on a quarterly basis.

6. RECOMMENDATIONS

Based on the above situational analysis and the accompanying findings, recommendations are put forward in the following categories:

6.1. Legal Environment

It is recommended that the Education Act be revised to include statistical responsibilities and that the MoEST's mandate be clearly identified in terms of the scope and periodicity of the production of educational statistics.

6.2. Strengthen the Link and Collaboration with the NSO

It is necessary to work with the NSO to analyse the different data sources and to use the current census data to look for differences and discrepancies. Concepts and definitions should be defined and documented, and reasons should be given for the differences between the datasets.

6.3. Standardised Procedures to Complete the Questionnaires at Institutional Level

There is a need to train head teachers in basic statistics and EMIS-related standardised processes. Standardised instructions should be compiled on how best to complete the survey questionnaire at school level. A booklet such as "Questionnaire Made Easy" could accompany the questionnaire.

6.4. Standardised Procedures to Complete the School Registers

As a priority, the school register should be enforced as an important official document that is used to improve the quality of the data, for example by recording all the students in the school.


Standardised instructions should be compiled on how best to use the school register as a mechanism for completing the survey questionnaire at school level.

6.5. Data Audits

To improve the quality of the EMIS data, build trust and increase the integrity of the data, data audits can be done on a regular basis. A data audit is a process in which a random sample of schools is selected and a head count of learners is conducted and compared with the data submitted by each school. The results should be made public and punitive measures should be implemented; appropriate disciplinary measures should be included in the Statistics Act. Other datasets, such as the Monthly Returns at district level and the Human Resource Management Information System, could be used to verify the EMIS data, a practice that will increase the reliability of the data.
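The audit procedure described above is easy to mechanise once the reported figures are in a database. The sketch below is illustrative only: the school identifiers, reported figures and the 5% tolerance are invented, and in a real audit the head count would come from the field visit rather than being simulated.

    # Illustrative data-audit sketch: draw a random sample of schools, compare the
    # reported enrolment with a head count, and flag large discrepancies.
    # All identifiers, figures and the 5% tolerance are invented for the example.

    import random

    reported = {"PRI-0007": 812, "PRI-0131": 455, "PRI-0288": 1034,
                "PRI-0402": 260, "PRI-0550": 698}

    audit_sample = random.sample(sorted(reported), k=3)

    # In a real audit the head count comes from the school visit; here it is simulated.
    head_count = {sid: reported[sid] + random.randint(-60, 60) for sid in audit_sample}

    for sid in audit_sample:
        gap = reported[sid] - head_count[sid]
        if abs(gap) / head_count[sid] > 0.05:
            print(f"{sid}: reported {reported[sid]}, counted {head_count[sid]} -> investigate")
        else:
            print(f"{sid}: reported figure within tolerance")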

6.6. Information Systems

Terms of reference (ToR) should be developed for an integrated information system that provides the functionality required within the MoEST and at sub-national levels, with a medium-term perspective of decentralisation down to institution level if required. A consultative structure should be organised in the MoEST with the objective of finalising the strategy for the development or acquisition of a new information system. Norms and standards should be determined as requirements for the development of any future system, in order to enhance and promote capacity building and to create harmony across the different departments of the MoEST.

6.7. Data Processing Schedule Optimisation

Data capturing should start as soon as questionnaires are returned to the central level, without waiting for all the instruments to be returned. This would further improve the timeliness of the process.

6.8. Improving Documentation and Transparency

Metadata documentation, as well as an EMIS cycle procedure manual, should be developed and made available to the public. Relations with users should be improved. In general, we recommend that an EMIS policy document be developed covering these different points.

6.9. Data Quality Control

Procedures should be developed to cross-check data with other datasets, such as the Human Resource Management Information System (HRMIS) and the Monthly Returns. Datasets from previous years should also be used to verify and check the data of the current year (see the sketch below).

6.10. Data Collection at the Beginning of the School Year

It is recommended that a snapshot of data be compiled at the beginning of the year to provide data (teacher and pupil enrolment data) for education planning purposes, by using the Monthly Returns produced at district level.
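As an illustration of the cross-checks proposed in 6.9, the sketch below compares a school's current return with last year's enrolment and with an HRMIS teacher count. All identifiers, figures and the 25% threshold are invented assumptions.

    # Illustrative cross-checks in the spirit of recommendation 6.9: compare this
    # year's returns with last year's enrolment and with HRMIS teacher counts.
    # All identifiers, figures and the 25% threshold are invented for the example.

    current = {"PRI-0042": {"enrolment": 950, "teachers": 14},
               "PRI-0099": {"enrolment": 1420, "teachers": 9}}
    previous_enrolment = {"PRI-0042": 930, "PRI-0099": 880}   # last year's survey
    hrmis_teachers = {"PRI-0042": 14, "PRI-0099": 16}         # HRMIS payroll record

    for sid, rec in current.items():
        change = (rec["enrolment"] - previous_enrolment[sid]) / previous_enrolment[sid]
        if abs(change) > 0.25:
            print(f"{sid}: enrolment changed {change:+.0%} year on year, verify with the school")
        if rec["teachers"] != hrmis_teachers[sid]:
            print(f"{sid}: EMIS reports {rec['teachers']} teachers, HRMIS {hrmis_teachers[sid]}, reconcile")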


6.11. School Registration Unit

Although a master list of institutions could be obtained, it seems that there is no unit in the MoEST that manages the opening and closing of schools. A School Registration Unit should therefore be established in the Planning Department, which would be the most logical place to maintain and update the master list of institutions. Procedures for opening and closing institutions should be developed and included in the Education Act. The purpose of such a School Registration Unit would be to assign a unique identifier to every institution in the country and to maintain and update the official master list of institutions.
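A minimal sketch of what such a register might hold is given below. The identifier scheme, field names and example school are invented assumptions, not an existing MoEST standard; the essential points are that an identifier is assigned once, is never reused, and that opening and closing dates are recorded.

    # Illustrative sketch of a school register maintained by a registration unit:
    # each institution receives a unique identifier that is never reused, and
    # opening/closing dates are recorded.  The ID scheme and fields are invented.

    from dataclasses import dataclass

    @dataclass
    class School:
        school_id: str              # unique identifier, assigned once
        name: str
        district: str
        date_opened: str
        date_closed: str | None = None

    class SchoolRegister:
        def __init__(self) -> None:
            self._schools: dict[str, School] = {}
            self._next_number = 1

        def register(self, name: str, district: str, date_opened: str) -> School:
            school_id = f"MW-{self._next_number:06d}"
            self._next_number += 1
            school = School(school_id, name, district, date_opened)
            self._schools[school_id] = school
            return school

        def close(self, school_id: str, date_closed: str) -> None:
            self._schools[school_id].date_closed = date_closed

    register = SchoolRegister()
    new_school = register.register("Example Primary School", "Lilongwe Rural", "2011-01-10")
    print(new_school.school_id, new_school.name)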

6.12. Next Step: Preparation of Action Plan

There is a need to develop an Action Plan that identifies and prioritises the set of actions needed to address the weaknesses identified by the diagnostic study. It should also set out a coherent framework for capacity building in education statistics and identify the costs associated with the implementation of the plan. It will therefore serve as the basis for discussion, and for mobilising the additional resources required to cover the in-country costs associated with implementation. The plan should include:

- The development and implementation of an EMIS Policy, and the identification of an EMIS expert to lead such a process.
- The development of a "hands-on, learning by doing" capacity-building strategy for education planning and EMIS.
- A sound assessment of potential solutions that will guide the choice of a proper information system.

 

 

7. DATA QUALITY ASSESSMENT FRAMEWORK

The principal objective of the DQAF evaluation is to produce a qualitative assessment of the education statistics used and produced by the education sector in Malawi. The tools and methodologies for this exercise have been adapted from earlier evaluations undertaken by the International Monetary Fund and UIS. Further adaptations by the UIS and World Bank sought to ensure a comprehensive evaluation, focused specifically on the quality of education statistics. To underpin the DQAF evaluation, a participatory and needs-based approach was adopted. The evaluation framework covers the different steps included in the statistical business process model at the national and sub-national levels, and assesses the strengths and weaknesses of the available structures, based on the six DQAF dimensions:

0) prerequisites of quality
1) integrity
2) methodological soundness


3) accuracy and reliability
4) serviceability
5) accessibility

Narrative descriptions are given of the state of the system in Malawi according to these dimensions and sub-dimensions, in addition to scores on each of the sub-dimensions. Scores are attributed according to international norms pertaining to the functions of the different elements of the statistical system. The scores on the sub-dimensions are then aggregated to arrive at scores on each of the dimensions. These scores should be indicative of where efforts to improve the statistical function could focus. In order to make this more explicit, specific recommendations are made. For the purposes of our assessment, we have assessed the education statistics generally produced by the main data-producing agencies, specifically EMIS in the MoEST and the NSO as the overarching data producer. This diagnostic is intended to provide input for an action plan to improve the system of educational statistics in Malawi.

7.0. Prerequisites of Quality

Data quality is regulated by a framework of statistical laws, policies, standards and practices, and by technical and human resources. This framework cannot exist in a vacuum. Prerequisites of quality, as one of the dimensions of data quality, do not comprise a qualitative dimension in themselves, but refer to the evaluation and understanding of the institutional context in which the statistical processes exist, which is essential to the other dimensions. This dimension presents the integrated way in which available statistical laws, as well as essential human and technical resources, impact on the other quality dimensions.

7.0.1. Legal and institutional environment

According to the NSS, the legal framework in Malawi is old and outdated. The National Statistics Office (NSO) produces and disseminates data in terms of the provisions of the Statistics Act of 1967 in the laws of Malawi. However, it should be noted that a Draft Bill is available that seeks to repeal the Statistics Act (Cap. 27:01). This Draft Bill, under the duties of the Commissioner for Statistics in Section 5(e), specifies the tasks and duties of the NSO in relation to collecting, compiling, analysing, abstracting and publishing or otherwise disseminating statistical information. Although the Education Act was not available, there appears to be a strong policy agenda and strategic direction that clearly recognise the important role that statistics play in informing strategy and policy choices and in monitoring and evaluating progress towards realising development goals. The following documents can be mentioned in this regard:

- The National Statistical System (NSS), launched in November 2006, is an important milestone towards the development of a comprehensive and well-coordinated system for producing, processing and disseminating official statistics.


- The Malawi Growth and Development Strategy (MGDS), launched in July 2007, is the overarching development strategy for the country.
- The Malawi Country Status Report (CSR) (2008/2009) is a detailed analytical document on the education sector that highlights the weaknesses and strengths of the sector and ultimately recommends remedial policies on a sub-sectoral basis to help improve quality, access and governance in the system.
- The draft National Education Sector Policy Statement (2008) is a consolidation of all the policies governing the education sector in the country.
- The National Education Sector Plan (NESP) and its Operational Supplements outline the priorities of the education sector and the strategies and activities to address these priorities.

The institutional arrangement between the NSO and the MoEST is not clearly defined, although the EMIS Unit is benefiting from the support of the National Statistics Office (NSO) (see 7.0.2). ECD, an important sub-sector of the education system, falls under the jurisdiction of the Ministry of Gender, Children and Community Development, but no formal arrangement exists with the MoEST relating to data collection processes.

The NSS also forms the basis for line ministries to meet regularly. The sectoral plans for all the line ministries are included in the NSS Strategic Plan. Cross-cutting issues, such as data collection and technology, are specifically addressed in the Strategic Plan, which aims to coordinate the sourcing of funds and the implementation of activities. It contains budgets and activity plans prepared by the NSO and the six ministries. The arrangement is that the ministries implement the strategies and the NSO provides support.

7.0.2. Resources

Staff: The number of staff and the capacity to perform data management functions in EMIS seem to be sufficient. There are 10 staff members in the EMIS Unit - two officials from the MoEST and eight staff members seconded from the NSO - which appeared adequate to perform the required data production functions. However, more professional training and capacity building could be provided. At the district level, staff within the education system support and assist with the data collection process: the primary education advisors (PEAs) and the cluster advisors disseminate, collect and verify the data collection instrument, and two DEMIS officers are placed in each district to facilitate data-related matters at district level.

IT equipment: The IT equipment seemed to be sufficient for the staff to perform their respective functions (data capturing, printing, etc.). There is also a LAN with enough computers to facilitate the capturing of the EMIS data on a central database. The software system, the Education Automated Statistical Information Systems Toolkit (ED*ASSIST), is an information system designed by an external service provider on an Access database and is used to capture the survey forms. There is a need to review the software and to determine the way forward.



Financial resources: EMIS is not a separate programme in the National Education Sector Plan (NESP) and is only part of the SWAp basket fund. A lack of funding has a direct influence on the data collection process and affects the quality of the data. This year (2011), no funds were available to print the labels used to control and disseminate the data collection instrument.

7.0.3. Relevance: education statistics cover relevant information

The relevance of collected data is essentially ensured through an information needs assessment process. Communication with users is generally not effective, and users are not informed about specific aspects of the data, such as timeliness and data details. There is also no structured process to consult with users. There are no procedures for requesting data and no system to record requests from users in order to build up a history, analyse trends and so improve the quality of data for the users. Data requests from users and the provision of data to users appear to take place on an ad hoc basis (see recommendation 6.8).

7.0.4. Quality awareness: quality is a cornerstone of statistical work

EMIS is recognised as the only producer of data in the MoEST. This is a good practice that shows the importance given to data quality by managers. However, processes focusing on quality are not in place. No reviews are undertaken to identify problems and maintain quality requirements. Managers are not informed about quality issues (response rate, changes in the data collection process, etc.). Strategic decisions for improving data quality do not seem to be a priority (see recommendation 6.9 in this regard).

7.1. Integrity: The Principle of Objectivity is Firmly Adhered to in the Collection, Processing and Dissemination of Statistics

This dimension captures the notion that statistical systems should be based on adherence to the principle of objectivity in the collection, compilation and dissemination of statistics. The dimension encompasses institutional arrangements that ensure professionalism in statistical policies and practices, transparency, and ethical standards. The three elements for this dimension of quality are:

professionalism
transparency
ethical standards

7.1.1. Professionalism: statistical policies and practices are guided by professional principles

The Draft National Statistics Bill, 2011 makes provision for the appointment of a Commissioner for Statistics, who shall be responsible for the administration and control of the National Statistical Office. Among the duties of the Commissioner is to collect, compile, analyse, abstract and publish, or otherwise disseminate, statistical information relating to matters specified in the First Schedule. Although the NSO is under the jurisdiction of the Ministry of Planning, the minister has no influence on the data that is



released. This function is underpinned by the Draft Statistics Bill of 2011, which clearly states that one of the duties of the Commissioner is "to ensure the independence, accuracy, relevance, integrity, timeliness and professional standard of statistical information produced by the National Statistical Office".

The EMIS Unit is widely recognised as the only provider of data in the MoEST, its functions covering the dissemination of questionnaires and the capturing, analysis and dissemination of data. Such an approach minimises the duplication of data and the data response burden on institutions (there is only one main data collection process during the year). A positive result of this recognition is that EMIS data is produced on an annual basis. However, it must be noted that this important function is not stipulated in any document or policy.

According to the Deputy Commissioner, when statistics are erroneously interpreted or misused, especially in the media, the National Statistical Office reacts by giving the correct interpretation. The aim, however, is to engage the media and not to be confrontational in their approach and dealings with the media. In the MoEST, the public relations officer deals with any media-related enquiries.

7.1.2. Transparency: statistical policies and practices are transparent

It must be emphasised that the documentation of processes and procedures is poorly defined and there is no information available on how the data are approved before being released to the public. The data collection process seemed to be well implemented and established, and the staff at the regional and institutional level appeared dedicated to their task. The PEA coordinators at district level control the data collection process in an organised and systematic way to ensure that the required data quality is maintained. However, we could not obtain any documentation covering the EMIS life cycle, such as manuals or a data dictionary, that facilitates the effective production of data and describes processes and procedures. The lack of documentation has a direct effect on the sustainability of the system (when the current person leaves there is a loss of knowledge and institutional memory). It also brings about a lack of transparency and trust. Furthermore, it takes longer to resolve or troubleshoot issues and data-related problems.

The Statistics Act is also not readily available to the public; we could not access it on the website of the NSO. There is also no EMIS policy to guide the terms and conditions of the collection and dissemination of education statistics. However, most publications that are available are clearly identified, and the data-producing agency is usually acknowledged as the source. The bulletins of the MoEST, the



Population and Housing Census Publications of the NSO and the Annual Reports of the ECD in the Ministry of Gender, Children and Community Development are examples of where the data-producing agency is clearly identified.

7.1.3. Ethical standards: policies and practices are guided by ethical standards

We could not obtain any guidelines outlining correct staff behaviour during the investigation, and we are not sure whether such guidelines exist. Interviewees were aware of the possible existence of a code of ethics. In discussions with staff of the MoEST, it appeared that each staff member had to sign a form when starting as a new employee. These ethical standards could also be addressed in an EMIS policy (see recommendation 6.8).

7.2. Methodological Soundness: The Methodological Basis for the Statistics Follows Internationally Accepted Standards, Guidelines or Good Practice

This dimension covers the idea that the methodological basis for the production of statistics should be sound and that this can be attained by following internationally accepted standards, guidelines or good practices. The dimension has four elements, namely:

concepts and definitions
scope
classification/sectorisation
basis for recording

7.2.1. Concepts and definitions: concepts and definitions used are in accord with standard statistical frameworks

No official documents were found on the concepts and definitions for key datasets in the MoEST. Only a limited list of education indicators is provided in the annual Bulletin: the net enrolment rate (NER), gross enrolment rate (GER), survival rate and dropout rate are defined. In an important data analysis document, "2010 Key EMIS Indicators Analysis", only a limited list of key indicators is provided. Key concepts and definitions are used in the census instruments that are sent to institutions, but nowhere are these concepts or definitions explained, and no manual or instructions accompany the census questionnaires. Although concepts relating to education indicators are used in key documents, such as the NESP and ESIP, the meaning of these indicators is not defined. In the Population and Housing Census 2008 publication of the NSO on education and literacy, definitions for specific concepts are provided: school attendance, primary school gross enrolment rates and literacy are concepts that are defined in this publication. It is, however, interesting to note that, although dropout rates are used, they are not defined. Enrolment is published by public and private institutions per year in the Statistics Bulletin (2010), but the meaning of the concept public or private is not explained in the census questionnaire or in the Statistical Bulletin (2010) (see Table 7.1 below).
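For reference, the standard international definitions of the two headline enrolment indicators mentioned above can be expressed as follows. This is included for illustration only (it reflects the commonly used UIS formulation, not a definition taken from the Bulletin itself):

```latex
\mathrm{GER} = \frac{\text{total enrolment at a given level, all ages}}{\text{population of the official age group for that level}} \times 100
\qquad
\mathrm{NER} = \frac{\text{enrolment of the official age group at that level}}{\text{population of the official age group for that level}} \times 100
```

Publishing formulas of this kind alongside the indicator tables would make the Bulletin self-contained for users who do not have access to a separate methodological manual.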



The survival rate is usually calculated at the end of an education cycle, such as the end of the primary phase or the end of the secondary phase. However, in the Draft Indicators Analysis document and the Statistics Bulletin (2010), survival rates are provided for Std. 5 and Std. 8. During our investigation, officials of the M&E Unit gave sufficient reasons for this deviation.

                 Primary                               Secondary
Year    Public      Private   Total        Public    Private   Total
2005    3169553     31093     3200646      143967    39887     183854
2006    3242483     38231     3280714      166307    52003     218310
2007    3264594     42332     3306926      148845    61480     210325
2008    3542019     58752     3600771      160709    72864     233573
2009    3614324     57157     3671481      190096    53742     243838
2010    3818829     49814     3868643      204455    36463     240918

Table 7.1: Enrolment by year
Source: EMIS in the MoEST

7.2.2. Scope: the scope is in accord with internationally accepted standards, guidelines or good practice

The EMIS Unit is the only section in the MoEST that collects data, and it is acknowledged as such by all other departments in the ministry. This approach ensures that only a single source of the truth is available, which increases the integrity of the data and avoids redundancies and overlaps. The EMIS Unit collects data pertaining to the school system once every year. EMIS takes responsibility for the dissemination, collection and capturing of the census instruments of primary and secondary schools. However, EMIS uses a decentralised approach to get support from other departments, such as Higher Education, Teacher Training and TVET, to distribute and collect the questionnaires from their respective institutions. The Ministry of Gender, Children and Community Development is the lead agency coordinating and overseeing the implementation of ECD activities. However, during our investigation it seemed that there was no official data collection process or any database available for ECD; an attempt is made to collect summary data for each district, with the help of a social welfare officer, on a quarterly basis.

Data collection instruments exist within EMIS in an attempt to cover all levels of the education system. At a regional level, data is also needed and collected from schools on a monthly basis, in what is called the Staff, Enrolment, Structure and Furniture Return. Table 7.2 below presents a detailed analysis of all the data collection instruments that the team could obtain during its investigation. It should be noted that, although some of the instruments exist, the datasets are not necessarily complete or available. The ECD has no official data collection instrument.

Statistical Returns for Primary Schools (Form A)
Statistical Returns for Secondary Schools (Form B)
Statistical Returns for Tertiary Education (Technical and Vocational Education)



Statistical Returns for Tertiary Education (Teacher Training Colleges)
Statistical Returns for Tertiary Education (University Education)
Staff, Enrolment, Structure and Furniture Return

The columns of Table 7.2 correspond to the seven instruments: Primary Schools (Form A); Secondary Schools (Form B); Technical and Vocational Education; Teacher Training Colleges; University Education; Teacher Training Colleges (Annual); and the Monthly Returns on Staff, Enrolment and Furniture. An X indicates that the item is collected by the instrument concerned.

Enrolment: Gender X X X X X X X; Public X X X X X X X; Private X X X X X X X; Standard/level X X X X X X X; Age X X X X X
Repeaters: Gender X X X X; Public X X X X; Private X X X X; Standard/level X X X X
Dropouts: Gender X X X; Public X X X X X; Private X X X X X; Standard/level X X X X X; Reason X X X X X
Transfers in/out: Gender X X; Public X X; Private X X; Standard/level X X
Streams: Gender X X; Public X X X X; Private X X X X; Standard/level X X X X
Curriculum subjects/courses: Periods per week X; Public X X X; Private X X X; Standard/level X X X
Orphans: Gender X X X; Public X X X; Private X X X; Standard/level X X X
Classrooms: Type X X X; Public X X X; Private X X X; Standard X X
Teachers: Gender X X X X X X X; Public X X X X X X X; Private X X X X X X X; Level of education X X X X X X X; Length of service X X X X X X
Non-teaching staff: Gender X X; Public X X; Private X X; Standard/level
Education institutions: Public X X X X X X; Private X X X X X X; Standard/level

Table 7.2: Overview of data collection scope

Relevant geographical boundaries are used in Malawi. The education system operates within 34 districts. The data collection process is organised and coordinated within these 34 districts, and the data is published as such in the Statistical Bulletin.

7.2.3. Classification/sectorisation: classification and sectorisation systems are in accord with national and internationally accepted standards, guidelines or good practice

A national education structure of primary, secondary, TVET, teacher training and higher education exists, and the data collection processes are organised around this structure. ECD falls under the Ministry of Gender, Children and Community Development and, although data is collected, the definition of ECD centres and programmes is not standardised. Although they do not provide all the elements needed to identify the full set of ISCED levels, the data collection questionnaires on the whole provide adequate data to complete the international questionnaires. The listed data collection instruments attempt to cover all levels of education. Although not strictly according to the ISCED classification, the data is categorised by public and private status for ECD, primary, secondary and tertiary education, and an agreed mapping of the education system exists and is applied.

7.2.4. Basis for recording: data is recorded according to internationally accepted standards, guidelines or good practice

In the primary school tables where age is required, the note in the questionnaire clearly states that the age should be as at the beginning of the school year. For the secondary schools, the note reads, "age at the beginning of June this year". Concerning finance data, information from the budget is published, but it is incomplete and does not reflect actual expenditure.

7.3. Accuracy and Reliability: Source Data and Statistical Techniques are Sound and Statistical Outputs Sufficiently Portray Reality

This dimension of quality is based on the principle that the data produced give an adequate picture of the reality of the education sector in Malawi. This dimension is therefore specific to each dataset and reflects the specificity of its sources and treatments. The elements of this dimension cover:

source data
statistical techniques
assessment and validation of source data
assessment and validation of intermediate data and statistical outputs
revision studies



7.3.1. Source data available provide an adequate basis to compile statistics

Administrative school census programme: The education statistics on enrolment, teachers and education resources are collected by the EMIS Unit through an annual school census questionnaire for primary education, secondary education, TVET, teacher training and higher education. For more details on the procedures and periodicity of each of these data-producing agencies, refer to Section 5 above, "Data Collection Processes in the MoEST". ECD is situated in the Ministry of Gender, Children and Community Development and, although data is collected, it seemed during our investigation that there was no official data collection process or data collection instrument.

The coverage is comprehensive in terms of geographic areas. The education system operates in 34 districts. The data collection process is also organised and coordinated within these 34 districts, and the data is published as such in the Statistical Bulletin (2010), as indicated in Table 7.3 below on secondary school enrolment for 2010.



District         Form 1 M  Form 1 F  Form 2 M  Form 2 F  Form 3 M  Form 3 F  Form 4 M  Form 4 F  Total M  Total F
Chitipa          742       766       760       894       609       612       515       477       2626     2749
Karonga          890       788       1014      854       808       607       770       523       3482     2772
Rumphi           1210      988       1129      989       1016      788       976       664       4331     3429
Mzimba North     1345      1220      1387      1281      905       765       693       607       4330     3873
Mzimba South     1504      1075      1579      1131      1295      802       1005      566       5383     3574
Mzuzu City       774       904       836       1007      959       985       887       923       3456     3819
Nkhata Bay       1124      967       1114      1087      845       664       649       518       3732     3236
Likoma Island    98        79        98        69        93        54        96        31        385      233
Kasungu          1787      1660      2009      1865      1558      1172      1208      828       6562     5525
Nkhotakota       866       685       1035      817       736       445       534       320       3171     2267
Dowa             1413      1242      1548      1304      1335      898       1049      560       5345     4004
Ntchisi          616       501       551       578       575       335       403       255       2145     1669
Salima           738       590       719       635       659       453       524       315       2640     1993
Mchinji          1177      998       1318      1053      916       692       901       763       4312     3506
Lilongwe City    1375      1481      1591      1596      1526      1552      1483      1266      5975     5895
Lilongwe Rural   1273      999       1346      1174      1130      783       890       551       4639     3507
Lilongwe Rural   1885      1545      1872      1845      1485      1139      1197      992       6439     5521
Dedza            1421      1103      1284      994       1170      816       1012      675       4887     3588
Ntcheu           1480      1315      1721      1549      1302      1147      1181      891       5684     4902
Mangochi         1063      1305      1077      1255      1021      1058      823       683       3984     4301
Balaka           814       608       679       506       679       446       554       351       2726     1911
Machinga         1026      788       973       757       774       489       638       365       3411     2399
Zomba Rural      635       504       596       485       450       261       436       166       2117     1416
Zomba Urban      1179      1023      1152      1066      1240      902       1079      929       4650     3920
Blantyre City    2160      1977      2230      2057      2180      1897      1967      1623      8537     7554
Blantyre Rural   1231      988       1225      981       970       694       831       594       4257     3257
Mwanza           291       272       350       267       285       225       332       223       1258     987
Chikwawa         902       732       997       795       732       494       622       325       3253     2346
Nsanje           643       436       760       501       610       268       517       136       2530     1341
Chiradzulu       987       808       1019      789       962       561       827       373       3795     2531
Thyolo           1289      1114      1441      1188      1258      927       1100      664       5088     3893
Mulanje          1295      1182      1393      1310      1348      943       1037      739       5073     4174
Phalombe         522       408       459       429       449       360       356       199       1786     1396
Neno             251       214       243       177       189       151       126       90        809      632

Table 7.3: Secondary school enrolment in 2010 by grade and district
Source: EMIS database (Note: The enrolment in this table is taken from the table "gdb:schooldata", which includes only 1045 schools. The table "gdb:school", however, indicates that there are 1519 schools, which would mean that these figures only reflect a response rate of 68.8%.)



The master list (table gdb:school) in the 2010 database of primary schools obtained from the MoEST also makes provision for these geographical boundaries, as indicated in Figure 7-1 below. (Note: The number of schools published in the Statistical Bulletin (2010) is based on the number of questionnaires received (the response rate), i.e. on the table gdb:schooldata, and not on the number of schools in the master list table, gdb:school.) However, closed schools are not identified in the database.

 

Figure 7-1: Master list of primary schools
Source: Database for 2010 of primary schools obtained from MoEST

The coverage is comprehensive in terms of relevant sub-groups, such as enrolment by gender (male and female pupils), public and private schools, and trained and untrained teachers. The primary school database makes provision for the number of primary school pupils by gender and proprietor (government, private and religious agency), and this is published as such in the Statistical Bulletin (2010). Table 7.4 below was compiled from the primary school database, and a similar table was published in the Statistical Bulletin (2010).



                  Government           Private           Religious Agency
Division          Girls     Boys      Girls    Boys     Girls     Boys      Total Girls  Total Boys  Grand Total
Central Eastern   119750    119042    3071     3073     195905    189485    318726       311600      630326
Central Western   155420    151446    5433     5037     333361    324931    494214       481414      975628
Northern          93988     94939     3559     3443     193481    196718    291028       295100      586128
Shire Highlands   116403    114780    3789     4041     148205    146406    268397       265227      533624
Southern Eastern  87421     87018     3326     2839     223040    220650    313787       310507      624294
Southern Western  136118    139942    5837     6027     114817    115902    256772       261871      518643
Grand Total       709100    707167    25015    24460    1208809   1194092   1942924      1925719     3868643

Table 7.4: Primary school enrolment in 2010 by gender and proprietor
Source: EMIS database for primary schools (There is no explanation in the Bulletin or on the census questionnaire about the meaning of these different types of school proprietor.)

The database also makes provision for trained teachers by proprietor (government, private and religious agency), and this is published as such in the Statistical Bulletin (2010), as indicated in Table 7.5 below.

Division          Government  Private  Religious Agency  Grand Total
Central Eastern   2658        41       4224              6923
Central Western   4015        112      6932              11059
Northern          2497        84       4717              7298
Shire Highlands   2276        52       2873              5201
Southern Eastern  2046        51       4195              6292
Southern Western  3148        42       2633              5823
Grand Total       16640       382      25574             42596

Table 7.5: Trained teachers in primary schools in 2010, per division and proprietor
Source: EMIS database for primary schools

With regard to accuracy and reliability it is appropriate to mention that, although good practices exist within the MoEST, the maintenance procedures for the school list could be improved. The master list of schools is used for the dissemination of questionnaires. In preparation for this dissemination, labels are printed according to this list and affixed to the forms. The forms are then sorted according to district and zones. This process serves as a control measure to ensure that all institutions receive a form. Although a master list of institutions could be obtained, it seems that there is no unit in the MoEST that manages the opening and closing of schools and maintains the master list (see recommendation 6.11).

The contribution of age to the reliability and accuracy of data: Regarding age in Malawi, it can be reported that the Statistical Bulletin (2010) has a footnote that specifies that,



because there is no universal registration of births, age statistics are obtained from verbal responses, which makes the age data unreliable.

Statistics on expenditure are not collected for all sources of funds and types of expenditure. In particular, directly funded projects and private sources of funds (households) are not tracked, and only a few projects supported by international organisations are recorded.

Statistics on assessments of student achievement: Statistics on the quality of learning outcomes are collected through assessments of student achievement. The national assessment body responsible for national examinations is the Malawi National Examinations Board (MANEB). It is a semi-autonomous body that is subvented by the government. It administers various national examinations at different levels of education. Pupils take the national Primary School Leaving Examination (PSLE) at the end of the primary cycle in Standard 8. The results obtained are used to select students for public secondary schools by ranking them and deciding what type of secondary school they will be able to attend. The Malawi Junior Certificate Examination (JCE) is administered at the end of two years of secondary education. At the end of the secondary education cycle, students write the Malawi School Certificate Examination (MSCE). The MoEST has also instituted the Primary Assessment Service (PAS), a standardised examination for Standards 3, 5 and 7. This is a national achievement process carried out by government once a year. The test is coordinated by the MoEST and the marking takes place centrally.

Malawi, along with 13 other countries in the region, has been participating in the Southern African Consortium for Monitoring Educational Quality (SACMEQ), which measures student performance in Grade 6 in English reading and mathematics. Malawi has participated in all three SACMEQ projects from the start. Figure 7-2 shows the reading and mathematics results of SACMEQ II.

Figure 7-2: Mean reading test scores and mathematics test scores of learners in all SACMEQ countries (SACMEQ II)
Source: World Bank data (2009). Note: based on a primary cycle of six years to allow for comparison



The population data used to calculate enrolment rates have not been updated since 1998, although Household Surveys were conducted in 2004. According to the current publications, the MoEST calculations are still based on the 1998 population figures, even though the population census data of 2008 is available. For instance, the comparison between enrolment and attendance shows a major difference, and this is not being investigated: primary school enrolment reported by the MoEST is 33.7% higher than the attendance figure from the Population and Housing Census of 2008 Report on Education and Literacy, Volume 5. This is an indication of the lack of coordination between the NSO and EMIS. Recently the Bureau of Statistics released the population projections. Analysis of these data showed that, globally, the smoothed and adjusted data for the 2008 census are close enough to UNPD estimations (see Figure 7-3).

Figure 7-3: 2008 population data from NSO and UNPD

However, a follow-up of a cohort of children aged 6 years in 2008 up to age 13 in 2015 shows some inconsistencies in the data between years and a clear discrepancy with UNPD's figures (see Figure 7-4). The graph shows that between 2008 and 2012 the trend is logical, but in 2014 the number of children aged 12 years increased.



Figure 7-4: Cohort of children from age 6 in 2008

In terms of the timeliness of the data source, the respondents are made aware of the reporting deadlines. Head teachers are informed about the deadlines during briefing sessions. However, we could not find a calendar or any letter that specifies these dates. On the questionnaire there is a table that indicates the sequence and the different steps that the questionnaire should follow, with a place where the responsible official should enter the completion date. In this way the questionnaire can be tracked, and schools that did not respond can be identified and contacted. However, there are no rigorous or systematic follow-up procedures to ensure the timely receipt of the respondents' questionnaires.

7.3.2. Assessment of source data: source data are regularly assessed and validated

There is no official auditing of the data that is submitted to headquarters. The briefing sessions that take place at the beginning of the data collection process are the only official measure to brief the head teachers on how to complete and verify the form. In April the EMIS Unit meets with all the district education managers (DEMs), primary education advisors (PEAs) and cluster leaders to brief them on the completion of the questionnaire, how to verify and validate the form, and when to complete and submit the form. The PEAs, in turn, go to their respective zones (a zone comprises a number of primary schools in a district) and the cluster advisors go to their respective clusters (a cluster is a group of secondary schools) to brief the head teachers on how to complete the form and to disseminate the questionnaire to each school. The PEA coordinator and the cluster leader are responsible for getting the completed questionnaires back.

At the district level there are no specific guidelines available on how to verify and validate the completed questionnaires that are submitted. The PEAs in the districts check only whether the necessary fields have been completed and the right codes have been entered. The PEAs assured us that they can identify inconsistencies, specifically in the classroom and enrolment totals, because they have the school enrolment submitted via the Monthly Returns. Information on coverage, non-response errors, and the percentage of missing and/or imputed data by method of imputation is not compiled. The MoEST publishes the data that is submitted.
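The kind of check the PEAs describe doing by hand - comparing annual census enrolment against the figures already reported in the Monthly Returns - could be systematised. The sketch below is illustrative only; the file and column names (annual_census.csv, monthly_return.csv, school_id, enrolment) are assumptions, not the actual ED*ASSIST structures.

```python
import pandas as pd

# Hypothetical extracts: one row per school, total enrolment reported in each source.
census = pd.read_csv("annual_census.csv")     # assumed columns: school_id, enrolment
monthly = pd.read_csv("monthly_return.csv")   # assumed columns: school_id, enrolment

merged = census.merge(monthly, on="school_id", suffixes=("_census", "_monthly"))

# Flag schools where the two sources differ by more than 10% (threshold is arbitrary).
merged["rel_diff"] = (
    (merged["enrolment_census"] - merged["enrolment_monthly"]).abs()
    / merged["enrolment_monthly"].clip(lower=1)
)
suspect = merged[merged["rel_diff"] > 0.10]

print(f"{len(suspect)} schools need follow-up before the census data is accepted")
```

A list of this kind could be returned to the PEAs for verification before capture, rather than relying on ad hoc visual checks.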



There is no audit or inspection of the questionnaires completed by head teachers. Inspections are conducted by the Directorate of Inspection and Advisory Services (DIAS) without coordination with EMIS (see recommendation 6.5). No comparison with data from earlier years is done to examine the reasonableness of year-to-year changes and trends (refer to recommendation 6.9 for further detail).

During our visits to the schools it was observed that a standardised school register exists. We are not sure, however, whether the registers are completed according to a uniform practice. The register also indicates when a student has been transferred to another school, and an official form is used to facilitate this process. The school register should be enforced as an important official document that is used to improve the quality of the data by recording all the students in the school. With regard to the accuracy and reliability of the data it is important that a register of all schools exists that is updated and maintained on a regular basis. Although a master list of institutions exists in the database, it seems that there is no unit in the MoEST that manages the opening and closing of schools. A School Registration Unit should be established in the Planning Department, which would be the most logical place to maintain and update the master list of institutions (refer to recommendation 6.11 for further detail).

7.3.3. Statistical techniques: statistical techniques employed conform to sound statistical procedures, and are documented

There is no indication of tabulation errors or error report generation. Some cross-table checks are integrated into the data entry software. A summary table for enrolment, teachers and classrooms is provided in the annual census questionnaire. This table can be used as a "control table" to cross-check all the enrolment-, teacher- and classroom-related data in the questionnaire. An instruction in the questionnaire, "Add total male and total female pupils, and place the resulting total on the first page of the questionnaire on bottom marked PRELIMINARY COUNTS: TOTAL PUPILS", is included to ensure that the totals in these related tables are the same. A similar instruction is provided in the questionnaire for the total classrooms and teachers. Another note, which reads "the totals for each room type must be equal to the ones listed under construction columns (iii) & (iv) in D.1", is provided for rooms under construction to ensure data quality. The database is designed according to the format and layout of the questionnaire.

The MoEST only publishes the data that is submitted. The response rate is neither indicated nor estimated, and no missing-data treatment or imputation methods are applied. It must be emphasised that the documentation of processes and procedures is poorly defined. We could not obtain any documentation covering the EMIS life cycle (questionnaire design, questionnaire dissemination, questionnaire completion, data entry and verification, and data use), including manuals and the data dictionary, which would facilitate the effective production of data and describe processes and procedures.
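To illustrate how the "control table" cross-check and a simple response-rate estimate could be automated at data entry rather than left implicit, the following sketch is offered. It assumes hypothetical table extracts (a master list of schools and a table of captured returns with grade-by-sex enrolment fields and a preliminary count) and does not describe the existing ED*ASSIST routines.

```python
import pandas as pd

master = pd.read_csv("school_masterlist.csv")   # assumed columns: school_id, name
returns = pd.read_csv("captured_returns.csv")   # assumed: school_id, prelim_total,
                                                # g1m, g1f, ..., g8m, g8f

# 1. Response rate: share of schools on the master list with a captured return.
response_rate = returns["school_id"].nunique() / master["school_id"].nunique()
print(f"Response rate: {response_rate:.1%}")

# 2. Control-table check: the sum of grade-by-sex cells must equal the preliminary
#    count entered on the first page of the questionnaire.
grade_cols = [c for c in returns.columns if c.startswith("g")]
returns["computed_total"] = returns[grade_cols].sum(axis=1)
mismatches = returns[returns["computed_total"] != returns["prelim_total"]]
print(f"{len(mismatches)} questionnaires fail the PRELIMINARY COUNTS check")
```

Reporting these two figures in the Statistical Bulletin would address the gaps noted above at very little cost.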



7.3.4. Assessment and validation of intermediate data as well as statistical outputs are regularly assessed and validated

The data is compared with data from earlier years to examine the reasonableness of year-to-year changes and trends: although this may be done, no measures are taken to identify the sources of discrepancies. The discrepancies between enrolment rates and attendance rates (see 7.3.1) are acknowledged, but no attempt to understand such phenomena has been documented. No systematic processes were found to be in place to monitor errors and omissions and address data problems.
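A basic year-to-year reasonableness check of the kind referred to here could be run automatically once each annual dataset is captured. The sketch below is a minimal illustration, assuming a hypothetical file enrolment_by_district.csv with columns district, year and enrolment; the 15% threshold is arbitrary and would need to be set by the EMIS Unit.

```python
import pandas as pd

df = pd.read_csv("enrolment_by_district.csv")   # assumed: district, year, enrolment
df = df.sort_values(["district", "year"])

# Percentage change in enrolment from one year to the next, per district.
df["pct_change"] = df.groupby("district")["enrolment"].pct_change()

# Flag implausibly large jumps or drops for manual investigation.
flagged = df[df["pct_change"].abs() > 0.15]
print(flagged[["district", "year", "enrolment", "pct_change"]])
```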

7.3.5. Revision studies: revisions, as a gauge of reliability, are tracked and mined for the information they may provide

Little to no evidence was found of revision studies or of general reviews of the methodologies used.

7.3.6. Archiving of source data and statistical results

In terms of the database structure, we observed that it partly adheres to the standards of relational database management systems3 (RDMS). The system used in the MoEST is the Education Automated Statistical Information Systems Toolkit (ED*ASSIST), as discussed under 7.0.2 above. Referential integrity is applied in this database in the sense that the primary key of one table refers to the foreign key in another table. Reference tables are also consistently used in the database. The database structure in Figure 7-5 shows the tables in the database. The tables with "a:" as prefix are all reference tables and link through their primary key to other tables in the database. In this way, table "a:District" is a reference table with a primary key of "district_num" that links this table with the "gdb:school" table, as indicated in Figure 7-6.

3 A relational database management system (RDMS) is a system in which data is stored in the form of tables and the relationships among the tables are also stored in the form of tables. The relational structure makes it easy to query the database and to integrate large datasets from multiple sources. Data integration generally means linking different data sources through a common field across a collection of data sources. To be able to do this, unique identifier codes must be assigned to the datasets that are used for the integration. Another key concept in RDMS is referential integrity, a concept that ensures that relationships between tables remain consistent. In other words, when one table has a foreign key to another table, referential integrity states that one may not add a record to the table that contains the foreign key unless there is a corresponding record in the linked table (it includes concepts such as 'cascading delete' and 'cascading update'). Normalisation is another important concept that ensures that the data in the database is efficiently organised, by eliminating redundant data and ensuring that dependencies make sense. These functionalities are critical for data use and data analysis and make the integration of different datasets possible.
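As an illustration of the referential-integrity pattern described above (a district reference table whose key is carried as a foreign key in the school table), the following minimal sketch uses SQLite via Python. The table and column names only loosely mirror the a:District / gdb:school relationship and are not the actual ED*ASSIST schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")          # enforce referential integrity

con.execute("""CREATE TABLE district (
                   district_num INTEGER PRIMARY KEY,
                   district_name TEXT NOT NULL)""")
con.execute("""CREATE TABLE school (
                   school_id INTEGER PRIMARY KEY,
                   school_name TEXT NOT NULL,
                   district_num INTEGER NOT NULL
                       REFERENCES district(district_num))""")

con.execute("INSERT INTO district VALUES (1, 'Chitipa')")
con.execute("INSERT INTO school VALUES (101, 'Example LEA Primary', 1)")   # accepted

try:
    # Rejected: district 99 does not exist in the reference table.
    con.execute("INSERT INTO school VALUES (102, 'Orphan School', 99)")
except sqlite3.IntegrityError as err:
    print("Referential integrity violation:", err)
```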



Figure 7-5: Database structure
Source: EMIS in MoEST



Figure 7-6: Referential integrity
Source: EMIS in MoEST (Referential integrity is applied in the database of 2010. Notice how the field district_num in the a:district table refers to district_num in the gdb:school table.)

The structure of the database does not always adhere to the principles of RDMS, however. Figure 7-7 below shows some of the fields in the students' table, which takes the form of a flat file and therefore makes data querying difficult. (The figure indicates the fields in the database with a flat-file structure. Fields such as g1m and g1f indicate the enrolment by standard and gender.)

Figure 7-7: The structure of the gdb:students table
Source: EMIS in MoEST
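Flat-file fields of the kind shown in Figure 7-7 (one column per standard-and-sex combination, such as g1m and g1f) can be made much easier to query by reshaping them into a long format with one row per school, standard and sex. The sketch below is illustrative only, assuming a hypothetical export gdb_students.csv; it does not describe an existing ED*ASSIST feature.

```python
import pandas as pd

wide = pd.read_csv("gdb_students.csv")   # assumed: school_id, g1m, g1f, ..., g8m, g8f

# Reshape one-column-per-cell ("g1m", "g1f", ...) into rows of (school, standard, sex).
long = wide.melt(id_vars="school_id", var_name="cell", value_name="enrolment")
long["standard"] = long["cell"].str.extract(r"g(\d+)", expand=False).astype(int)
long["sex"] = long["cell"].str[-1].map({"m": "male", "f": "female"})

# Queries such as "total enrolment by standard and sex" now become one line.
print(long.groupby(["standard", "sex"])["enrolment"].sum())
```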



7.4. Serviceability: Statistics with Adequate Periodicity and Timeliness are Consistent

The quality dimension of serviceability looks at the extent to which statistics are useful for planning or policy purposes. It refers mainly to periodicity and timeliness, and to consistency. Data is timely when it is current or up to date, as defined by the owner of the data. Data must be on time and available when it is required, otherwise the credibility of the information system diminishes. Given that data is actually accurate, this dimension looks at the extent to which it reflects a reality either of the moment or of the past.

7.4.1. Periodicity and timeliness: periodicity and timeliness follow internationally accepted dissemination standards

Data must be on time and available when it is required, otherwise the credibility of the information system diminishes; in the process of maintaining the timeliness of data, accuracy can sometimes be affected. In Malawi, education data is published each school year, a practice that is commendable and quite good in comparison with many other countries. However, it must be noted that this dataset comes quite late in the year, which is not practical for budgeting and planning purposes that need data much earlier in the year (see recommendation 6.10).

Administration data: Data is collected at least once per year for all sub-sectors under the MoEST. The aim is to disseminate the data in the same year. It takes about nine months to produce the Statistical Bulletin, taking all the EMIS processes into consideration. These are good practices in line with international standards. A workshop with all the directors at national and district level is conducted, and the database and the bulletin are put on a CD for dissemination to the districts. The ECD data is neither published nor disseminated.

Achievement data: Statistics on the quality of learning outcomes are collected through assessments of student achievement. The national assessment body responsible for the yearly national examinations is the Malawi National Examinations Board (MANEB); see the presentation of the different examinations the Board is responsible for under 7.3.1 ("Statistics on assessments of student achievement"). In addition to the participation in SACMEQ, periodicity and timeliness seem to correspond to the country's monitoring needs.

Finance data: Data on education finance is available in the Statistical Bulletin from 2000 to 2003 and from 2008 to 2010, and the format of the published data differs from one year to another. The total allocation to the MoEST is published, with a breakdown of the total budget allocation for secondary schools, technical colleges and training colleges.

7.4.2. Consistency: released statistics are consistent within a dataset and over time, and with other major datasets

Accounting identities between aggregates seem to be checked during quality control at the MoEST. Consistent time-series data is available for at least five years, and the databases for these datasets are also available in the EMIS Unit. In the Statistical Bulletin (2010), enrolment data is published from 1964 to 2010 for all the sub-sectors (primary, secondary, teacher



training, TVET and university) except ECD. It is important to note that data is not available for some sub-sectors and for particular years, such as TVET and university data for the years 2004 to 2007. There is no systematic process for cross-checking data with other sources (see recommendation 6.9), and collaboration on the education data collected by the NSO is not effective.

7.5. Accessibility: Data and Metadata are Easily Available and there is Adequate Client (User) Support

This dimension is based on the principle that data and metadata should be presented in a clear and understandable way and should be easily available to users. Metadata should also be relevant and be updated regularly. In addition, assistance to users should be available, efficient and performed in a reasonable time frame.

Data accessibility - statistics are presented in a clear and understandable manner, forms of dissemination are adequate, and statistics are made available on an impartial basis: In the Statistical Bulletins and the Population and Housing Census 2008 publications, such as the Analytical Report on Education and Numeracy, education data is published in a clear manner, and charts and tables are disseminated with the data to facilitate analysis. In these publications, education indicators such as enrolment ratios; survival, dropout and repeater rates; progression rates; teacher details; and attendance are supported with tables and charts. In an attempt to provide stakeholders (policy makers, educators, civil society groups as well as development partners) with a clearer basis for policy making, planning and other actions, the M&E Unit in the MoEST embarked on an analysis of "2010 Key EMIS Indicators". A draft document of this analysis was available. This report provides these stakeholders with information on the performance of key education sector indicators (e.g. enrolment rates, number of schools, teacher-pupil ratio, survival rates, PSLCE pass rates, secondary school enrolment, etc.) in relation to targets set by the NESP and ESIP.

Although the EMIS data is published in an annual Statistical Bulletin, there seems to be a need for more detailed data. It must be noted that data that is not published can be obtained from the EMIS Unit on request. There is, however, no query system in place, and queries take place in an ad hoc manner. There is also no system to record all the queries or data requests; a history of requests helps determine trends and the kind of data needs that exist. The statistical data of the MoEST is not released according to a pre-announced schedule with dates indicating the release of the statistical data to users. The MoEST also has no website. The Statistical Bulletin (2010), however, is available on the website of the NSO (http://www.nso.malawi.net). The NSO has a release calendar for its major publications, presented in Figure 7-8 below. The NSO conducted a workshop with key role players such as development partners, civil society, various line ministries and the media to brief them on the statistics of the Population and Housing Census (2008).



Figure 7-8: Release calendar of major publications of NSO
Source: NSO website, available from http://www.nso.malawi.net

The public is not informed of the statistics being released or of the procedures to access them (e.g. internet publications). There is no official process in the MoEST for the release of the data, only an informal process in collaboration with the PS. The aim of the EMIS Unit is to publish the data at least three weeks after the release. It takes about nine months to



produce the bulletin, taking all the EMIS processes into consideration (see 5.1 - Data dissemination and publication). During our investigation we could not obtain a policy or any procedures relating to the confidentiality of micro-data use to prevent the identification of individual respondents when the data is used by researchers or anyone else.

7.5.1. Metadata accessibility: up-to-date and pertinent metadata are made available

Metadata is useful to data-producing agencies and to users of the data. Metadata describes the content, quality and other characteristics of the data and helps users to locate and understand the data. It provides a standard and common language, with agreed terminology and definitions, for the end user. We could not find an official document describing the data elements used in the datasets of the MoEST. Metadata regarding elements such as response rates, deviations from internationally accepted standards, etc. is also not provided by the MoEST, although only data that is submitted is published and made available to users. The NSO is aware of the importance of the quality of the data for the country and is committed to improving the quality of statistics by subscribing to international standards. Although the General Data Dissemination System (GDDS) has been developed, the NSO admitted that there is nothing about the dissemination of education data in the GDDS. This is a concern that they will address in the future.
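A data dictionary of the kind that is missing here need not be elaborate. The sketch below shows, purely as an illustration, what a machine-readable entry for one field of the gdb:students table (the hypothetical g1m field seen in Figure 7-7) might look like; the attribute names are assumptions, not an existing MoEST standard.

```python
import csv

# One illustrative data dictionary entry, expressed as a plain Python dictionary.
g1m_entry = {
    "table": "gdb:students",
    "field": "g1m",
    "label": "Enrolment, Standard 1, male",
    "type": "integer",
    "unit": "pupils",
    "collected_by": "Statistical Returns for Primary Schools (Form A)",
    "reference_date": "beginning of the school year",
    "notes": "Taken from the grade-by-sex enrolment table of the annual census form.",
}

# Entries like this can be collected into a list and exported for users of the data.
with open("data_dictionary.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=g1m_entry.keys())
    writer.writeheader()
    writer.writerow(g1m_entry)
```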

7.5.2. Assistance to users: prompt and knowledgeable support service is available

There seems to be no structured assistance to users by the MoEST, and no organised and formal system is in place to manage interactions with the end users of the data. No technical support is provided to the users of the data; the data that is available is provided through publications such as the Statistical Bulletins. Users' requests are managed on an ad hoc basis and no specific times or days are indicated for requesting data from EMIS. Also, no record is kept of data queries by users. Keeping a record of users' requests could help build up a history of the kind of data that is required; this information can be taken into consideration and will facilitate processes when the data collection questionnaires are revised.



8. CONCLUSION

[Chart: EMIS scores from 1 to 4 on the six quality dimensions - pre-requisites of quality, integrity, methodological soundness, accuracy and reliability, serviceability and accessibility.]

Figure 8-1: Overall results

The results of the DQAF, as presented in Figure 8-1 above, show that Malawi should focus on all the dimensions to improve the quality of its education statistics. The current situation of EMIS in the MoEST demands that decision makers take the initiative, introduce changes to foster collaboration among the various data-producing agencies, and integrate the data collection processes and systems where possible. The above-mentioned findings and accompanying recommendations are proposed for improving the Malawi system of educational statistics. We want to emphasise that statistics is about more than computers, processes and technical solutions: legislation, governance, regulation, resources, people, systems and processes must all be part of the effort.



9. Appendix A: List of relevant references and documents

Nr  Details                                                                          Medium                          Year                   Issuer
1   Project to Improve Education Quality in Malawi (PIEQM)                          Document (Report No: 53051-MW)  May 2010               World Bank
2   Key EMIS Analysis, 2004-2010                                                     Document                        2010                   MoEST
3   Education Sector Implementation Plan                                             Document                        September 2009         MoEST
4   National Education Sector Plan 2007-2016                                         Document                        August 2007            MoEST
5   National Statistical System Strategic Plan 2008-2012                             Document                        June 2008              NSO
6   Malawi_ISCED_mapping_for final approval_8Mar2011                                 Document                        March 2011
7   Policy & Investment Framework (PIF)                                              Policy                          Revised: January 2001  MoEST
8   Malawi Education Country Status Report (CSR 2008/09)                             Report                          April 2009             World Bank and UNESCO-BREDA (Pôle de Dakar)
9   2010 Annual Report for Early Childhood Development                               Report                          January 2010           Ministry of Gender, Children and Community Development
10  Population and Housing Census 2008: Education and Literacy                       Publication                     October 2010           NSO
11  Population and Housing Census 2008: Population Projections                       Publication                     November 2010          NSO
12  Education Statistics Bulletins 2000-2010                                         Publication                     2000-2010              MoEST
13  National Statistics Bill, 2011                                                   Act                             2010                   NSO
14  Databases for primary and secondary schools                                      Database                        2008-2010              MoEST
15  http://www.nso.malawi.net                                                        Website                         2011                   NSO
16  Statistical Returns for Primary Schools (Form A)                                 Questionnaire                   2011                   MoEST
17  Statistical Returns for Secondary Schools (Form B)                               Questionnaire                   2011                   MoEST
18  Statistical Returns for Tertiary Education (Technical and Vocational Education)  Questionnaire                   2011                   MoEST
19  Statistical Returns for Tertiary Education (Teacher Training Colleges)           Questionnaire                   2011                   MoEST
20  Statistical Returns for Tertiary Education (University Education)                Questionnaire                   2011                   MoEST
21  Staff, Enrolment, Structure and Furniture Return                                 Questionnaire                   2011                   MoEST



10. Appendix B: List of Persons Met

No.  Name  Position/Designation  Unit  Organisation  Email

1 Martin Masanche, EMIS Head, Dept of Planning, Dept Policy & Planning, [email protected]
Grace Milner, Policy & Planning (M&E), Dept of Planning, [email protected]
Chrissie Kafundu, Monitoring & Evaluation, Dept of Planning
Chandiwira Nyirenda, Planning - EMIS, Dept of Planning, [email protected]
2 Chikondano Mussa, Deputy Director, Basic Education, Basic Education, [email protected] [email protected]
3 Dudley Chiwala, Deputy Director, Secondary Education, Secondary Education, [email protected]
4 Dr Simeon Hau, Principal Secretary for Basic Education, MoEST, Office of the PS, [email protected]

5 Aubrey D. Matemba, Senior Technical Education Officer, Directorate of Technical and Vocational Training, Directorate of Technical and Vocational Training, [email protected]
Dr. Godfrey B. C. Kafere, Director of Technical & Vocational Training, Directorate of Technical and Vocational Training, [email protected]

6 Rose Kalizangoma, Assistant Chief Education Officer, Directorate of Higher Education, Higher Education, [email protected]
Chandiwira Nyirenda, Planning - EMIS, MOEST, [email protected]
Francis R. W. Chalamanda, National Coordinator ECD, Ministry of Gender, Children and Community Development, ECD, [email protected]
Enock Matale, Statistician, EMIS, [email protected]



Immaunlate Salaon, ECD Technical Assistant, UNDP (Supporting Ministry of Gender, Children & Community Dev.), [email protected]

8 Maclean Kaluwa, Assistant Statistician, EMIS - MOEST

Budget, [email protected]
Anna Downs, Budget Officer, ODI / MOEST, [email protected]
Lowland Sakale, Senior Budget Officer, MOEST, [email protected]
James Changadeya, Budget Officer, MOEST, [email protected]
Job Mwamlima, Principal Budget Officer, MOEST, [email protected]

9 George Mnamizana, Deputy HT, CHINSAPO LEA Primary School, CHINSAPO LEA Primary School
Liviness Chimbayo, Deputy HT, CHINSAPO LEA Primary School
10 Flora Mtocha, H I Teacher, Lilongwe LEA Primary School, Lilongwe LEA Primary School
Susan Kalipholi, D H I Teacher, Lilongwe LEA Primary School, [email protected]
11 Gcadsop Msambati, Head Teacher, Tsabango Community Day Secondary School, Tsabango Community Day Secondary School, N/A

McDonald Mthaphwi, Deputy Head Teacher, Tsabango Community Day Secondary School, N/A
Stanley Banda, EMIS HQS, EMIS, N/A
12 Martin Masanche, EMIS Head, Dept of Planning, EMIS, [email protected]
13 Virginia Chavula, Principal, Lilongwe Teacher Training College, Lilongwe Teacher Training College
14 Frazer Martin Nkhata, Chief Human Resource Management Officer, Education, Human Resource Management, [email protected]
15 Catherine L Saiwa, Assistant Director, Education, Directorate Inspectorate Advisory Services, [email protected]
Mayamiko Chiwaya, Principal Inspector, Education, [email protected]



 

16 Angela Msosa, Chief Statistician (Demography), National Stats Officer, National Statistics Office, [email protected]
John Khozi, PRMIE, Education, [email protected]
Mercy Kanyuka, Deputy Commissioner, NSO, [email protected]
17

Sezerine Misomali, District Education Manager, Lilongwe Urban, Lilongwe Urban District Office,

[email protected]
Linda Vho, EMIS Officer, Ministry Headquarters, [email protected]
Doreen Maere, Coordinating Primary Education Advisor, Lilongwe Urban
Joyce Masache, Desk Officer, Lilongwe Urban, [email protected]
18 Jorgen Friis, Adviser, GIZ, Development Partners, [email protected]
Enock Matale, Statistician, EMIS, EMIS, [email protected]
20 Grace Milner, Policy & Planning, MOEST, M&E, [email protected]
21 John J Bisika, Secretary for Education, Science & Techn., MOEST, Secretary MoEST, [email protected]
Martin Mansanche, EMIS Head, MOEST, [email protected]
Grace Milner, Policy & Planning, MOEST, [email protected]

