EdData II

Data for Education Research and Programming (DERP) in Africa
Using Data for Accountability and Transparency in Schools
Big Data Report

EdData II Technical and Managerial Assistance, Task Number 19
Contract Number BPA No. EHC-E-00-04-0004
Task Order Number AID-OAA-12-BC-00004
Date: May 2016

This publication was produced for review by the United States Agency for International Development. It was prepared by RTI International.




Data for Education Research and Programming (DERP) in Africa
Using Data for Accountability and Transparency in Schools
Big Data Report

Prepared for
Bureau for Africa
United States Agency for International Development
1300 Pennsylvania Avenue, N.W.
Washington, DC 20523

Prepared by
RTI International
3040 Cornwallis Road
P.O. Box 12194
Research Triangle Park, NC 27709-2194

Photo caption: Head Teacher and Deputy Head Teacher table training conducted by the Kenya Big Data Activity in Isiolo County. Photo credit: RTI International/DERP staff

RTI International is a registered trademark and a trade name of Research Triangle Institute.

The views expressed by the authors at RTI International do not necessarily reflect the views of the United States Agency for International Development or the United States Government.


DERP in Africa—Big Data Report iii

Table of Contents

List of Figures
List of Tables
Abbreviations
Executive Summary
1. Overview of the Kenya DERP Big Data Activity
   1.1 Introduction and Background
   1.2 Purpose, Partners and Scope
2. Documenting the Implementation Process
   2.1 Implementation Timeline
   2.2 Training Program
       2.2.1 County and Sub-County Officers
       2.2.2 Training of Head Teachers and Deputy Head Teachers
   2.3 Technical Support and Help Desk
   2.4 Communication and SMS Gateway
   2.5 Device, Data, and Software License Requirements
3. Research Plan
   3.1 Theory of Change
   3.2 Variables
   3.3 Other Research Questions
   3.4 Methodologies
       3.4.1 Data Validation through Lot Quality Assurance Sampling
       3.4.2 Qualitative Analysis and Feedback Sessions
4. Findings and Observations
   4.1 Completion and Timeliness of Returns
   4.2 Validation Results
       4.2.1 EMIS Returns and Validation Results for Enrollment, Textbooks and School Revenues
       4.2.2 Estimated Efficiency Gains from Reducing Error of Enrollment and Textbook Data Reported
       4.2.3 Simplified Cost-Effectiveness Analysis
       4.2.4 Does the Order of Completing the Paper or Electronic Forms Have an Impact on the Accuracy of the Data Report?
       4.2.5 Summary Conclusions from Validation Findings
   4.3 Findings from the Qualitative Assessment
       4.3.1 Qualitative Assessment Areas of Inquiry
       4.3.2 Observations from the Qualitative Assessment
       4.3.3 Recommendations
   4.4 Findings from the Report Card Feedback Sessions
       4.4.1 Successes Related to the Use of the Report Cards
       4.4.2 Challenges Related to the Use of Report Cards
       4.4.3 Recommendations and Suggestions for Report Card Improvement
       4.4.4 Overall Conclusions
5. System Requirements
   5.1 Data Hosting Technical Requirements
   5.2 Centralized Human Resources Capacity and Skills Requirements for System Administration
   5.3 Decentralized Human Resources Capacity and Skills Requirements for System Support and Maintenance
   5.4 License Requirements
       5.4.1 Mobenzi (Telephone) Application
       5.4.2 IBM School Census Hub Application
   5.5 Supervision and Validation System Support Needs
6. Strengths, Weaknesses, Opportunities, and Threats Analysis
   6.1 Strengths and Opportunities
   6.2 Weaknesses and Threats
7. Ways Forward
   7.1 Articulate the Vision and Outcomes of an Integrated Digital EMIS in the Broader Strategic Planning Framework
   7.2 Scaffold Ministry Capacity to Scale and Sustain a School Information System at Scale
       7.2.1 Establish a Centralized Information Communications Technology Office as an Education Sector-Wide Service
       7.2.2 Rollout a School Information System with Multi-modal Platforms
       7.2.3 Develop and Implement a Change Management and Behavior Change Plan
   7.3 Establish Expectations and Monitor Against Those Expectations in Relation to Data Reporting and Quality Assurance
       7.3.1 Standardize School Record Keeping and Management
       7.3.2 Clarify and Articulate Roles and Responsibilities for Data Reporting and Validation
       7.3.3 Enforce Consequences for Misreporting and/or Not Performing Validation Tasks
       7.3.4 Tether the Free Primary Capitation Grants to the Timeliness and Accuracy of the Data Reported
   7.4 Commission Rigorous Cost-Effectiveness and Cost–Benefit Analyses
Annex A. Agenda for the Policy Dialogue Workshop
Annex B. Detailed Validation Data Findings
Annex C. Design Features and Functionality of the Digital Education Management Information System Applications


List of Figures

Figure ES-1. Degree of error reported as a percentage of total validated data for revenue, textbooks, and enrollment.
Figure 1. The theory of change model, which lists the desired outcomes (green boxes) and the generalized intervention outputs (other boxes).
Figure 2. Degree of error reported as a percentage of total validated data for revenue, textbooks, and enrollment.
Figure 3. Income reported by Mombasa County schools: Digital (d) versus paper (p) forms.
Figure 4. Income reported by Isiolo County schools: Digital (d) versus paper (p) forms.
Figure 5. School performance dashboard view for tablets only.
Figure 6. School report card view for mobile telephones only.
Figure 7. Example of the decision-support tool visualizer.
Figure 8. The County Dashboard card view.

List of Tables

Table 1. Implementation Timeline of Training
Table 2. Types of Devices and Data and Software Requirements Used in Mombasa and Isiolo Counties
Table 3. Study Variables and Methods
Table 4. Status of EMIS Submission: The Number (and Proportion) of Schools in Each County That Had Submitted, Were in Progress, or Had Not Yet Started by the End of November or by the End of December 2015
Table 5. Number of Schools in Each County with Greater than ±10% Reported Error Compared to the Validated Data, by Electronic and Paper Submissions
Table 6. Efficiency and Equity Gains from Reduced Errors in Enrollment and Textbook Reporting (Note: Currency Is Presented in Kenyan Shillings [KES])
Table 7. Cost-Effectiveness Estimates for Mombasa and Isiolo Counties
Table 8. Enrollment Validation Results for Schools Completing Paper Forms First (P1) versus Paper Forms Second (P2)—Mombasa County Only
Table 9. Textbook Validation Results for Schools Completing Paper Forms First (P1) versus Paper Forms Second (P2)—Mombasa County Only
Table 10. Challenges for Head Teachers and Recommendations from Qualitative Assessment Observations
Table 11. Challenges for County and Sub-County Officers and Recommendations from Qualitative Assessment Observations
Table 12. Summary of the SWOT Analysis


Abbreviations

CPU        Central Processing Unit
CSV        comma-separated values
DERP       Data for Education Research and Programming (in Africa)
ECDE       early childhood development education
EdData II  Education Data for Decision Making
EMIS       Education Management Information System
FP         Free Primary
GB         gigabyte
GPE        Global Partnership for Education
ICT        Information and Communication Technology
KES        Kenyan shilling
KNEC       Kenya National Examinations Council
LQAS       lot quality assurance sampling
MB         megabyte
MoEST      Ministry of Education, Science, and Technology
M-PESA     Mobile Money Transfer Service in Kenya
P1         schools that completed the paper forms first
P2         schools that completed the paper forms second
PRIMR      Kenya Primary Math and Reading (Initiative)
RAM        random access memory
SAGA       semi-autonomous government agency
SIM        Subscriber Identity Module
SMS        short messaging service
SSL        Secure Sockets Layer
TB         terabyte
TSC        Teachers Service Commission
UNICEF     United Nations Children’s Fund
USAID      U.S. Agency for International Development
WASH       water, sanitation, and hygiene


Executive Summary

The Kenya Big Data Activity is an operations research study to help the Kenya Ministry of Education, Science, and Technology (MoEST) identify the challenges, opportunities, and capacity requirements for implementing a school-based digital Education Management Information System (EMIS) at scale and sustainably. The activity was funded by the U.S. Agency for International Development’s Task Order 19, Data for Education Research and Programming (DERP) in Africa, under the Education Data for Decision Making (EdData II) program. RTI International, in partnership with IBM Corp. Research Laboratory, Kenya and Digital Divide Data, implemented the study; the MoEST coordinated the effort; and the United Nations Children’s Fund (UNICEF) and other development partners provided collaboration and support.

Summary Overview of the Study

The study comprised three sub-components: (1) shadowing the national paper-based EMIS forms by using telephone- and tablet-based applications, (2) supporting data validation efforts by county and sub-county officials, and (3) producing and distributing school report cards to schools. The study pilot tested telephone-based EMIS applications in 109 schools in Isiolo County and tablet-based EMIS applications in 97 schools in Mombasa County. The study trained Head Teachers and Deputy Head Teachers in how to properly use the devices to complete the EMIS forms. The Head Teachers and Deputy Head Teachers were also trained in how to use the applications to generate school report cards from the data reported. District Education Officers and County and Sub-county Quality Assurance Standards Officers were trained and supported in how to use tablets to validate the EMIS data reported by schools through school-visit double checks.

Summary of Implementation Process and Research Plan

RTI, Digital Divide Data, and IBM Corp. Research Laboratory, Kenya (henceforth referred to as the DERP Team) provided training, technical support, and field management in coordination with and with support from the MoEST. During October 2015, the Head Teachers and Deputy Head Teachers received the initial training on the digital EMIS applications; EMIS data were recorded and submitted by schools from November to December 2015. During the same time frame, the Head Teachers submitted forms separately through two different means: electronically (by telephone or tablet) and on paper, which is the regular EMIS reporting format. During January and February 2016, the County and Sub-county Officers conducted data validation site visits to a total of 40 schools in Mombasa and Isiolo Counties (20 schools in each county). During February 2016, the Head Teachers and Deputy Head Teachers were trained on how to use the school report cards. After the interventions, during March and April 2016, a series of qualitative assessments and stakeholder feedback sessions were held.

This study was designed to identify the extent to which using the digital EMIS applications improved the timeliness and accuracy of the EMIS data submitted by schools, and to identify issues likely to be encountered in a national roll-out of electronic reporting so that possible mishaps can be prevented. The study also aimed to identify the key capacity and system-related challenges confronting the MoEST in implementing a school-based digital EMIS at scale. The findings and recommendations in this report draw from the data validation exercise, stakeholder feedback sessions, and other observations of the DERP Team from ongoing implementation of the study.

The DERP Team compared the enrollment, textbook, and school revenue data reported both electronically and on paper with the actual validated data to identify the error, or discrepancy, rates. The DERP Team selected enrollment, textbook, and school revenue data for analysis because of the financial implications that these data have for the financing and resources available to schools. Increased accuracy and timeliness of these data will have a direct impact on the efficiency of the funding and resources available to schools because the Free Primary (FP) capitation school grant is based largely on enrollment, and the funds earmarked for textbooks are derived from the FP grant.

Summary Findings

The results of the data validation exercise show that discrepancies in textbook data far exceeded discrepancies in reported enrollment and revenue data. The most important indicator was the number of schools whose reported figures differed from the validated data by more than ±10%. The degree of error reported on paper for textbooks stands out in comparison to the accuracy of the textbook data reported electronically; specifically, 18 out of 20 schools in Mombasa County and 19 out of 20 schools in Isiolo County had greater than ±10% error in their paper-reported textbook data.
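The ±10% indicator described above can be computed directly from paired reported and validated counts. The following Python sketch illustrates the calculation with made-up school records (not actual study data); the school names and figures are purely hypothetical:

```python
def percent_error(reported, validated):
    """Signed error of a reported count relative to the validated count."""
    return (reported - validated) / validated * 100

def flag_schools(records, threshold=10.0):
    """Return the schools whose reported figure deviates from the
    validated figure by more than +/- threshold percent."""
    return [school for school, reported, validated in records
            if abs(percent_error(reported, validated)) > threshold]

# Illustrative (made-up) textbook counts: (school, reported, validated)
records = [
    ("School A", 2200, 1000),  # 120% over-reported  -> flagged
    ("School B", 1050, 1000),  # 5% over-reported    -> within tolerance
    ("School C",  850, 1000),  # 15% under-reported  -> flagged
]

flagged = flag_schools(records)
print(flagged)  # ['School A', 'School C']
```

The absolute value in the flag makes the indicator symmetric, so severe under-reporting is caught just as readily as over-reporting.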

The degree, or magnitude, of the textbook error is another important indicator. The validation results found that the 20 Isiolo schools that were double checked had only 21,563 textbooks recorded, in comparison to the 42,463 textbooks they reported on paper. This discrepancy amounted to an over-reporting of 20,900 textbooks, or 97% of the validated total. In Mombasa County, schools that were double checked over-reported by a combined 30,136 textbooks (i.e., 69,497 reported on paper versus 39,361 textbooks validated), an over-reporting of 76% of the validated total. In contrast, there was little discrepancy in the number of textbooks reported electronically in Isiolo (23,564) or Mombasa (39,312) Counties compared to the validated totals (21,563 in Isiolo and 39,361 in Mombasa).1
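The over-reporting percentages above follow directly from the reported and validated totals stated in the text. A quick arithmetic check in Python (the rounded figures match the 97% and 76% quoted in this report):

```python
def over_report(reported, validated):
    """Excess over the validated total, and that excess as a
    percentage of the validated total."""
    excess = reported - validated
    return excess, 100 * excess / validated

# Isiolo County: 42,463 textbooks reported by paper vs. 21,563 validated.
excess, pct = over_report(42_463, 21_563)
print(excess, round(pct, 1))   # 20900 96.9  (rounded to 97% in the text)

# Mombasa County: 69,497 reported by paper vs. 39,361 validated.
excess, pct = over_report(69_497, 39_361)
print(excess, round(pct, 1))   # 30136 76.6  (reported as 76% in the text)
```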

The validation results also found that the school revenue discrepancies varied significantly between counties, but not between the electronic and paper forms. The greatest discrepancy was in how much schools under-reported their revenues. The form fields asked schools to indicate how much money they received from the FP capitation grant, the amount received from fees, and the amount received from “other sources.” Schools in Isiolo County significantly under-reported their capitation grant revenue, by an average of 9 million Kenyan shillings (KES). However, these same schools reported their “fees” and “other” revenues with perfect accuracy (0% error).2 In Mombasa County, the schools under-reported their revenue by KES 20 million in the electronic submissions and by KES 25 million in the paper submissions. The differences are likely significant, and the variance is most pronounced in the reported amounts of “other revenue.”

Figure ES-1 shows that the error rates from over- and under-reported textbook and revenue data are considerably higher than the error rates from the enrollment data. Although Mombasa County had higher discrepancies in over-reported enrollment data than Isiolo County, the differences in the number of schools reporting greater than ±10% error between electronic and paper forms, as well as the overall magnitude of the enrollment error, were negligible.

1 It is not exactly clear why schools reported their textbook figures much more accurately electronically, since the electronic and paper forms were identical. Conceivably, this difference could be the result of data entry errors (manual data entry into the national EMIS) or a data query error in downloading the data from the national EMIS database.
2 Again, it is not clear why the schools were perfectly accurate in reporting non-FP revenues.

Figure ES-1. Degree of error reported as a percentage of total validated data for revenue, textbooks, and enrollment.

Summary Conclusions from the Validation Findings

The validation results revealed general patterns that are instructive and actionable. First, textbook and resource data submitted by schools tend to be vastly over-reported—more so when reported on paper than on electronic forms—requiring disciplined validation and double checks of the data reported. Second, Head Teachers tend to produce more accurate electronic submissions when working from a completed paper form: the validation results found that Head Teachers who completed and submitted their electronic forms before completing their paper forms tended to have higher inaccuracies in the electronic data submitted. Third, these findings need to be understood with some caution because the results compare only the validated electronic and paper form submissions of the same schools and counties; a further comparison with the validated returns from the national database is needed to determine how these counties compared in the accuracy of data reported. Lastly, the validation process itself raised concerns about the quality of the validated data, in terms of both the timeliness of its capture and the accuracy of the reports. This issue is discussed further in Section 3.4.

Summary of Implementation Challenges

The challenges that the DERP Team encountered are instructive as the MoEST begins to consider the technical, capacity, and operational requirements to implement a digital school information system at scale. During the course of the pilot test, challenges were encountered at each level of the system: with Head Teachers and Deputy Head Teachers in schools, with county and sub-county offices, and even at the national Ministry level. It is important to note that these challenges are understandable, given that the EMIS data collection process was only recently implemented after a seven-year hiatus.3 In this respect, the MoEST and EMIS unit, with assistance from UNICEF, have done a remarkable job in resurrecting a once-dormant system, and their accomplishments in mobilizing schools and counties in support of a renewed EMIS effort should not be diminished in view of these findings. In any case, the findings reported here apply mostly to the potential requirements of an electronic roll-out per se, though some of the findings do pertain to EMIS generally.

School-Level Challenges

Although a vast majority of the schools were able to submit their EMIS data electronically in a timely manner,4 the Head Teachers and Deputy Head Teachers experienced several challenges with the digital EMIS. Their infrequent use of the devices and software led to unfamiliarity with the applications and limited use and sharing of the report cards and dashboards. A few Head Teachers were reluctant to openly use the device at school for fear of someone mishandling or breaking it. In addition, schools varied in the roles and responsibilities assigned for filling out and submitting the form. Some schools were unable to complete the forms electronically for several reasons: the device was broken, data units ran out (because individuals used up the mobile data provided through the activity), the school lacked network coverage, or staff were unable to submit the form correctly. Given these challenges in the pilot experience, careful consideration and precautions are recommended if a mass-scale roll-out is undertaken in the future.

Other challenges experienced were general to EMIS and not related to the use of electronic forms. For instance, at some schools, all EMIS-related tasks were delegated to the Deputy Head Teacher; at other schools, the Head Teacher and the Deputy Head Teacher shared responsibilities; and at some schools, only the Head Teacher handled the EMIS tasks. This varied approach led to inconsistent reporting and data accuracy because school records are typically maintained by Head Teachers; Deputies may have only limited access to the records. The quality and consistency of school records also proved problematic: some school officials struggled to identify the information needed to complete the forms, whereas others could quickly and accurately locate the data in their records. In addition, many Head Teachers reported being overwhelmed with administrative tasks and redundant data requests from multiple offices of the MoEST, the counties, and semi-autonomous government agencies (SAGAs) such as the Teachers Service Commission (TSC). A prevailing sentiment among Head Teachers questioned the utility of the data and even how the data were supposed to be used. Moreover, many Head Teachers were confused about how to correctly fill out some fields in the form, most notably the fields seeking information about textbooks and teachers (the latter due to the teacher strike and the influx of temporary teachers at the time). These challenges, if not addressed, will undermine the efficacy of any national EMIS, whether electronic or paper.

County and Sub-county Level Challenges

Counties and sub-counties also experienced challenges regarding their capacity to implement the validation activity and support the completion of the EMIS forms by officials at the schools. Although the DERP Team trained County and Sub-county Officers, provided them with tablets, and facilitated their transport to schools, the officers were unable to visit all 20 schools in each county to double check the EMIS data, for several reasons. First, the amount of time for the validation activity was limited, and it competed with other duties and engagements. Second, among the officials, there was a lack of clarity about the purpose of the EMIS data collection process and their role in supporting it. Lastly, a few officials were reluctant to take the time needed to travel great distances to visit schools because they had other pressing needs and no other personnel were available to provide support. In short, the greatest challenge in the counties and sub-counties is the limited number of individuals available to provide support. This challenge is complicated by the competing demands on their time and the lack of clear expectations about their roles and responsibilities in supporting the EMIS. This challenge, noted during a pilot activity, might be either worsened or ameliorated during a mass-scale roll-out, or an enhancement of the EMIS function more generally, depending on how well the roll-out is handled, how well the data platform systems are integrated, and how well it is resourced. Note that the County and Sub-county Officers saw the DERP project as outside their core activities; in the context of a national EMIS rollout, they may not have that same perception.

3 2014 was the first time that EMIS data were collected for all schools since 2008.
4 Two weeks after initial data collection, in both Mombasa and Isiolo Counties, 163 schools (79%) had completed their EMIS submission, 26 schools (13%) were in progress, and 17 schools (8%) had not yet begun.
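The completion figures cited in footnote 4 can be cross-checked against the pilot's 206 schools (109 telephone-based schools in Isiolo plus 97 tablet-based schools in Mombasa). A small Python check using only the totals stated in this report:

```python
# Pilot coverage: 109 phone-based schools in Isiolo + 97 tablet-based in Mombasa.
total_schools = 109 + 97

# Submission status two weeks after initial data collection (footnote 4).
status_counts = {"submitted": 163, "in progress": 26, "not started": 17}

# The three status groups should account for every pilot school.
assert sum(status_counts.values()) == total_schools

for status, count in status_counts.items():
    share = round(100 * count / total_schools)
    print(f"{status}: {count} schools ({share}%)")
# submitted: 163 schools (79%)
# in progress: 26 schools (13%)
# not started: 17 schools (8%)
```

The rounded shares reproduce the 79%, 13%, and 8% quoted in the footnote.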

National-Level Challenges

We recommend that the MoEST focus on several key issues as it proceeds with plans to modernize its EMIS. The first and foremost issue is the multitude of data requests that schools receive from different agencies and divisions across the MoEST and county offices. The challenge in this case is less technical than institutional, because many of these agencies and offices operate independently within their bureaucratic silos. As a result, multiple and disparate records and databases proliferate across the MoEST, SAGAs, and counties and sub-counties, leading to inaccurate and inconsistent records. For instance, school officials may report a different number of students and teachers to the TSC than what they report on their EMIS forms or what is collected by the Kenya National Examinations Council for examinations registration.

The second issue focuses on the capacity of the central Ministry to manage and administer a large-scale, complex national information system. The MoEST should consider increasing the staffing and resources allocated to the EMIS Unit in order to handle a system that requires functional teams to manage the database administration, the network administration, and the data processing and analytics. If hiring additional staff is not an option, the MoEST could also consider hiring private contractors to source the required human resources to manage such a system.

Summary Recommendations and Ways Forward

As previously mentioned, the challenges experienced by the MoEST, the schools, and the counties are understandable, given that the EMIS process was only revived in 2014 after seven years of dormancy. Highlighting these challenges is not meant to diminish the remarkable progress that the MoEST has made; rather, they help identify the systemic threats that could derail the successful implementation of a digital school information system, or even an enhancement of paper-based systems.

The DERP Team’s recommendation is to proceed with the continued planning of a school-based digital information system as long as the MoEST allows for flexibility in its design and scaffolds the capacity of schools, counties, and Ministry offices to support its implementation and uptake. This report outlines the following actionable recommendations in advancing this initiative:

Page 14: EdData II Data for Education Research and …...EdData II Data for Education Research and Programming (DERP) in Africa Using Data for Accountability and Transparency in Schools Big

DERP in Africa—Big Data Report 6

1. Articulate the vision, goals, and strategy in a revised EMIS Roadmap and other strategic documents. Articulating the vision and goals within the framework of a formal policy and planning strategy (e.g., Medium-Term Plans) is an important first step in making this system a reality. The strategy should reflect consensus amongst a wide array of stakeholders on the vision, goals, and benefits of a modern school-based information system. Such benefits include a reduced administrative burden on Head Teachers and an increased capability to analyze data and conduct policy research on factors affecting education outcomes. Additional benefits include fewer silos in decision-making and administrative management, increased transparency of and accountability over schools and counties, and more equitable and efficient resource allocations to schools and students. A key initial step is to engage a broad spectrum of stakeholders in policy dialogue to generate buy-in across stakeholder groups and to produce a vision and roadmap for a digital school information system that informs the formal strategic policy and planning documents.

2. Scaffold MoEST’s capacity to scale and sustain a digital school information system. To establish the foundation that will allow this system to take root and flourish, the MoEST should consider the following steps:

a. Establish a centralized Information and Communication Technology (ICT) office that serves the basic education sector first and foremost and supports the information needs of the Ministry as a whole. The centralized office could also support the information needs of the various SAGAs and oversee the ICT operations, database administration, and analytics functions.

b. Differentiate rollout to schools based on readiness and capacity. Because of environmental factors such as access to power and the Internet, some schools already have the means to enter data electronically, whereas others may not yet be prepared to submit data electronically. A school information system does not require a “one-size-fits-all” approach. The DERP Team recommends the adoption of a multi-modal system in which tablets, desktop computers, or even smart phones could be used. Where those items are not available in the schools, the DERP Team recommends that Head Teachers be provided with the means to travel to a networked terminal (in a town or in the district) to enter the data offsite once or twice per year.

c. Leverage existing initiatives. Kenya is the ICT hub for East Africa, and the MoEST and international development partners have begun to capitalize on the availability of broadband and 4G and 3G networks across the country. Multiple initiatives are pushing ICT into schools, including the Digital Literacy Program and the Global Partnership for Education (GPE) under which tablets, laptops and other connected devices are provided to schools nationwide. Leveraging these applications and technologies will minimize the additional investment in devices needed for schools to enter data electronically. These initiatives will benefit, even ex post, from a more comprehensive and planned approach as per the recommendation above.

d. Design and implement a change management and targeted behavior change and communications strategy. Change management strategies are designed to align the behaviors of the system’s individual end users, who include Head Teachers in schools, County and Sub-county Officers, and central Ministry and SAGA officials. National policies that support change management may focus on both incentives and disincentives for individuals reporting, validating, and utilizing school information data.

3. Establish and monitor performance standards in regard to data reporting and data quality assurance. First, Head Teachers and their Deputies should have clearly articulated expectations and assigned roles for school record keeping and reporting. Second, those expectations should be monitored through consistent, periodic, and routine double checks to validate the accuracy of the reports and the quality of the records. The consequences for misreporting or dereliction of validation duties must be clearly understood by individuals at all levels and must be enforced by the MoEST. Lastly, the MoEST should, through policy, tether the FP capitation grants to the timeliness and accuracy of the data reported as a means to motivate reporting compliance, but only if the expectations related to data accuracy, validation protocols, and enforcement mechanisms are effectively implemented.


1. Overview of the Kenya DERP Big Data Activity

This report details the findings and observations of the Kenya Big Data Activity funded by the U.S. Agency for International Development (USAID) under the Data for Education Research and Programming (DERP) in Africa task of the Education Data for Decision Making (EdData II) program. Implemented from January 2015 through July 2016, this activity was coordinated by the Ministry of Education, Science, and Technology (MoEST) and was carried out by RTI International in partnership with the IBM Corp. Research Laboratory, Kenya and with Digital Divide Data, with collaboration and support from the United Nations Children’s Fund (UNICEF), CARE, Georgetown University, and USAID/Kenya’s Tusome Program.

This research activity piloted innovative mobile platforms for school-based information systems, data quality assurance, and feedback reports in Isiolo and Mombasa Counties, and was designed to assess and document the systemic capacity requirements and challenges should such a system be taken to scale. The purpose of this report is to advise the MoEST regarding the efficacy of these models and to discuss ways forward for implementing a modern Education Management Information System (EMIS) at scale.

After the submission of this report, the MoEST will lead a Dissemination and Policy Dialogue Workshop. During this workshop, the DERP Team will share with key stakeholders the results of these efforts, lessons learned, and the implications for scale and sustainability. Annex A of this report provides the draft agenda for this Policy Dialogue Workshop.

1.1 Introduction and Background

From 2008 to 2013, the MoEST operated without access to reliable information about basic school statistics through a formalized EMIS unit. This gap severely constrained the MoEST’s ability to report appropriately, undertake effective medium- and long-term planning, and monitor system inputs for equity and efficiency. It also resulted in pressure from “transversal” ministries (such as the Ministry of Finance and the Ministry of Planning) to improve the situation. In the absence of a formal EMIS unit, MoEST departments, semi-autonomous government agencies (SAGAs), and county education offices established their own management information systems and protocols for data collection.

This fragmented information ecosystem has had several deleterious effects on the education sector. It has reinforced siloes among the agencies, leading to distrust and competition for resources rather than cooperation and collaboration, and it has diminished the capability of planners, policy makers, and instructional leaders to draw insights from how various aspects of the education system interact, such as the relationships among teacher management, learning outcomes, and school governance indicators. The fragmentation has also led to much less accountability than is desirable. For a country that has adopted decentralized systems for school funding through the Free Primary (FP) capitation grants and liberalized textbook procurement policies, the lack of reliable school statistics has weakened accountability and governance systems, particularly at the school level.

However, in just the past two years and with support from UNICEF, much progress has been made toward establishing a basic functioning EMIS unit. The MoEST’s and UNICEF’s efforts have produced a consolidated master list of schools, along with unique school identification codes; a simplified annual school census protocol and a centralized database covering basic planning information needs (a one-pager); and school report cards, which are planned for release in 2016.

During this same timeframe, other interventions using mobile-based reporting applications were beginning to demonstrate real results. For example, under the National Tablet Program, RTI leveraged an anonymous donation of tablets to strengthen the role and capacity of Teacher Advisory Center tutors to perform their teacher coaching and support functions. The use of these tablets was piloted under the USAID/Kenya Primary Math and Reading (PRIMR) Initiative and has been taken to scale through USAID/Kenya’s Tusome Program and now the GPE Primary Education Development Project. Under USAID’s Water, Sanitation, and Hygiene (WASH) program, CARE and Georgetown University partnered with Digital Divide Data to pilot a mobile telephone–based EMIS data collection exercise in a small sample of schools as a proof of concept. The Global Partnership for Education likewise plans to enhance the supervisory capacity of counties and districts through the provision of tablets.

Throughout this period, there has been pressure from “transversal” ministries and various observers for the MoEST to move quickly to digital reporting. Despite this interest, at the start of the DERP Big Data Activity only incipient experimentation existed, and the paper-based, but much-improved, EMIS was just being revived. It therefore seemed prudent to acknowledge the pressure but, rather than move wholesale into digital reporting, to take deliberate steps through further experimentation with digital reporting.

1.2 Purpose, Partners, and Scope

The MoEST invited USAID, and through this activity RTI, to advise on EMIS-enhancing strategies that could build on and leverage these mobile-based innovations. In December 2014, RTI staff met with senior MoEST officials to identify the main areas for data systems support. In February 2015, a one-day planning workshop was held in Nairobi with key stakeholders from the MoEST, participating SAGAs, and international partners. Through these consultations, the Kenya DERP Big Data Activity was designed under USAID’s EdData II program.

A challenge for the MoEST was how to test innovative applications systematically while strengthening and reinforcing its fledgling EMIS. Moreover, the MoEST articulated a need to address some of the key constraints within the information ecosystem, particularly regarding data integration and connectedness, data quality, and data use. In response, RTI and MoEST staff designed a research activity with three core components: (1) shadowing the paper EMIS questionnaires with electronic entry and validation applications; (2) providing data validation and systems support; and (3) testing feedback systems and reporting tools for schools, districts, and counties. Each component is described further in the remainder of this subsection.

Shadow paper EMIS questionnaires with electronic entry and validation applications. The pilot activity tested tablets in Mombasa County and mobile telephones in Isiolo County. These interventions were intended to shadow—not to replace—the current paper system in order to test the systemic requirements and efficacy of the digital EMIS platforms. The purpose of the shadow EMIS was to test whether school-based information systems would improve EMIS data quality, timeliness, accuracy, and data use at the school and county levels. The technologies were deployed to all schools in each county to assess county-wide system support requirements. Teachers were trained and received support on how to use the applications. Other partners for the activity included the IBM Corp. Research Laboratory, Kenya, which piloted a tablet-based application in 97 schools in Mombasa County. In Isiolo County, RTI partnered with Digital Divide Data to develop and pilot a mobile telephone–based application, built on the licensed Mobenzi5 platform, for 109 schools.

Data validation and systems support. The DERP Team mobilized a Help Desk and employed short messaging service (SMS) to assist Head Teachers in properly using the tablet and telephone applications for data collection. After completion of the EMIS submission, DERP mobilized Quality Assurance Standards Officers from each county to conduct site visits to a sample of schools to double check and validate the reported EMIS data. The results of the validation activity are discussed in Section 4.2. The validation activity was designed to assess counties’ and sub-counties’ capacities to perform this data quality assurance function, which is crucial to holding the system accountable for data accuracy.

Test feedback systems and reporting tools for schools, districts, and counties. The DERP Team developed and field-tested integrated report cards and dashboards for use by school and county stakeholders. The intent of this activity was to produce visually informative reports that display comparative school, sub-county, and county data on a cross-section of education quality, governance, access, and efficiency indicators generated from the EMIS report. The activity also assessed the means whereby, and the likelihood that, schools might engage their boards of management and parental stakeholders to share performance information regarding students, teachers, and schools, and to inform priority School Improvement Plans.

Selection of Counties. Isiolo and Mombasa Counties were selected as the intervention counties for the following reasons. First, they lie on two different ends of the development spectrum: Mombasa is urban with relatively high development indicators, whereas Isiolo lies in a semi-arid, economically depressed area that is considered a development priority zone for the Kenya Government. Second, the counties have relatively small populations compared to most other counties in Kenya, thereby providing a more manageable number of schools for the pilot. Third, their proximity to Nairobi by air and road allowed the field team to manage the interventions without the need to establish temporary field offices.

2. Documenting the Implementation Process

This section of the report documents the key implementation activities with regard to the Shadow EMIS electronic data collection, data validation and quality assurance, and report card production for both Isiolo and Mombasa Counties. This section also clarifies for the MoEST the steps taken by the DERP Team to ensure the successful implementation of these pilot activities.

2.1 Implementation Timeline

The pilot implementation of the digital EMIS applications, validation, and report card dissemination consisted of a series of outreach, sensitization, and capacity-building activities, primarily for Head Teachers, Deputy Head Teachers, County and Sub-county Quality Assurance Standards Officers, and District Education Officers. The timing of the digital EMIS was planned to coincide with the data collection schedule of the national EMIS. Table 1 provides a summary of the training and actual data collection dates.

5 Mobenzi is a survey software application that is licensed and distributed through Mobenzi Technologies (Pty) Ltd.

Table 1. Implementation Timeline of Training

Training and EMIS Paper Survey Timelines | Isiolo County | Mombasa County
County and Sub-county Officers' Training Sessions | October 8, 2015 | October 6, 2015
Head Teacher and Deputy Head Teacher Training Sessions | October 12–16, 2015 | October 12–16, 2015
Practice Sessions | October 19–November 2, 2015 | October 19–November 13, 2015
Actual Data Submission | October 9–13, 2015 | November 2–20, 2015
Report Card Training | January 25–February 2, 2016 | February 25–29, 2016
Qualitative Assessment | March 7–9, 2016 | March 14–15, 2016
DERP Report Card Feedback Session | March 21–24, 2016 | March 17, 2016

2.2 Training Program

2.2.1 County and Sub-County Officers

The main objective of training the County and Sub-county Officers was to build their capacity to act as first responders to technical issues once the Head Teachers completed training and were issued the equipment. The training sessions included an introduction to the project’s objectives and timelines, as well as the functionality of, and how to properly use, the digital EMIS data collection tools (i.e., Samsung tablets for Mombasa County and Samsung Galaxy telephones for Isiolo County).


In Isiolo County, the DERP Team, through the Digital Divide Data team, conducted training at the Grande Hotel in Isiolo Town on October 8, 2015. Attendees were the County Director of Education (CDE), the County Quality Assurance Standards Officer, a TSC representative, and the District Education Officers and District Quality Assurance Standards Officers from the three sub-counties, except for the Garbatulla Quality Assurance Officer. By the end of the training, all of the attendees had gained experience using the Samsung mobile telephones and the Mobenzi application.

In Mombasa County, training was conducted by the DERP Team, through Digital Divide Data and the IBM Corp. Research Laboratory, Kenya, on October 6, 2015 at the County Director of Education’s office. Attendees were the County Director of Education, the Assistant County Director of Education, the County Quality Assurance Standards Officer, a TSC representative, the District Education Officers and District Quality Assurance Standards Officers from the four sub-counties, and the Chairmen of the Head Teachers Association from all four sub-counties.

2.2.2 Training of Head Teachers and Deputy Head Teachers

Infrastructural challenges make it difficult and expensive to conduct centralized training in Isiolo County; therefore, training sessions occurred at hotels in Garbatulla, Merti, and Isiolo Sub-counties. In both Isiolo and Mombasa Counties, Head Teachers and their Deputies attended the training on two consecutive days. Training occurred in Garbatulla and Merti Sub-counties on October 12 and 13, 2015, and in Isiolo Sub-county on October 15 and 16, 2015.

In Mombasa County, training was conducted at the Royal Court Hotel within Mombasa Town, with each sub-county attending on separate dates. However, participants from both Likoni and Changamwe Sub-counties attended the training on the same day. The training dates were as follows: Likoni and Changamwe Sub-counties on October 12 and 13, 2015; and Mombasa on October 14 and 15, 2015. In addition, training was held in Kisauni Sub-county on October 16, 2015, with Head Teachers attending the morning session and Deputies the afternoon session.

2.3 Technical Support and Help Desk

In Isiolo County, technical support was provided by the Digital Divide Data Field Supervisor, who was based in Isiolo Town but had the flexibility to travel to schools whenever required. Logistical challenges and heavy rains, however, hampered movement to the rural areas; as a result, the Field Supervisor was able to visit only five schools within Isiolo County. Teachers would instead call or text the Digital Divide Data Field Supervisor or the sub-county offices for assistance. Common issues included how to switch on mobile data, how to update the survey, how to input decimals, and how to check the data plan.

In Mombasa County, the original plan was to establish a Help Desk at the County Director of Education’s office and assign each sub-county specific days to visit for assistance and to collect the tablet cases. This plan proved unsuccessful because it quickly became apparent that a majority of the Head Teachers were not visiting the offices. The Digital Divide Data and IBM Corp. teams therefore changed strategy to provide more support. Specifically, one IBM staff member remained in the County Director of Education’s office to support teachers who visited for assistance. The other two IBM staff members and the Digital Divide Data Field Supervisor visited schools to support the Head Teachers who were unable to travel to the County Director of Education’s office. In addition, a Help Desk telephone number was set up and communicated to all participants of the pilot effort so they could receive technical support when needed. A staff member based at the IBM Corp. Research Laboratory, Kenya responded to queries submitted via phone calls or texts. The Help Desk staff would call Head Teachers back from the office line, thereby keeping the customer care telephone number open to receive incoming calls.

2.4 Communication and SMS Gateway

Updates about the project were communicated to the teachers through a bulk SMS platform managed by Digital Divide Data. The County and Sub-county Officers were also included in the distributed messages to ensure that they were aware of the project’s progress.

2.5 Device, Data, and Software License Requirements

Table 2 lists the types of devices and the data and software requirements for the tablets and telephones.

Table 2. Types of Devices and Data and Software Requirements Used in Mombasa and Isiolo Counties

Mombasa County

• Device: 7-inch tablet with 1 gigabyte (GB) of RAM (random access memory), 8 GB storage, rear-facing camera, and 3G compatibility. The tablet must have a data connection compatible with Kenya’s cellular network, either Safaricom or Airtel. Cost: KES 20,999 per tablet (approx. $209.99).

• Data: data bundle for 12 months, with Safaricom SIM (Subscriber Identity Module) cards for the tablets. Cost: KES 999 per month (approx. $9.99).

• Software license: none. Cost: not applicable.

Isiolo County

• Device: feature telephone or Android smart phone. Cost: KES 9,000 per telephone (approx. $90).

• Data: data bundle for 12 months, with SIM cards for the telephones. Cost: KES 250 per month (approx. $2.50).

• Software license: Mobenzi license at scale (more than 1,000 users). Cost: KES 6,200 per device (approx. $62).
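To illustrate the cost implications, the unit costs in Table 2 can be combined into a rough first-year cost per reporting device for each modality. The sketch below is illustrative only: it assumes the exchange rate implied by the table’s own dollar figures (roughly KES 100 per US dollar) and ignores training, support, and replacement costs.

```python
# Rough first-year cost per device for each pilot modality,
# using the unit costs reported in Table 2 (in KES).
KES_PER_USD = 100  # approximate rate implied by the table's dollar conversions


def first_year_cost_kes(device, data_per_month, license_per_device=0, months=12):
    """One device, plus a year of data bundles, plus any software license."""
    return device + data_per_month * months + license_per_device


mombasa_tablet = first_year_cost_kes(device=20_999, data_per_month=999)
isiolo_phone = first_year_cost_kes(device=9_000, data_per_month=250,
                                   license_per_device=6_200)

print(mombasa_tablet, round(mombasa_tablet / KES_PER_USD))  # 32987 KES, ~330 USD
print(isiolo_phone, round(isiolo_phone / KES_PER_USD))      # 18200 KES, ~182 USD
```

Under these assumptions, the telephone-based modality costs roughly 55% as much per device in the first year as the tablet-based modality, largely because of the lower hardware price.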


3. Research Plan

3.1 Theory of Change

The hypothesis, or theory of change, stems from the notion that introducing technology, training, and feedback tools at the school, district, and county levels will lead to gains in the efficiency, quality, and use of EMIS data reported by schools. A simplified theory of change model is presented in Figure 1 (note: the desired outcomes are the shaded boxes, and the generalized intervention inputs are the other boxes).

Figure 1. The theory of change model, showing the desired outcomes (shaded boxes) and the generalized intervention inputs (other boxes).

[Figure 1 depicts the intervention inputs (train Head Teachers; introduce technology to Head Teachers; train district and county officials to validate data; introduce technology to districts and counties; provide dashboard and feedback tools to schools; and train school, district, and county officials to interpret the data) leading to three desired outcomes: timely EMIS data captured, reduced error in EMIS reporting, and improved data use by schools.]

As the model indicates, the hypothesis is that the introduction of mobile technologies for school-level reporting, combined with training and support to Head Teachers, will result in improved timeliness and accuracy of data reported. In addition, concerted data validation efforts and the introduction of feedback tools will result in fewer errors or discrepancies reported, as well as an uptick in how schools and counties use the data for their decision-making purposes.

3.2 Variables

The activity was designed to assess and compare the relative efficiency gains in two intervention groups: Mombasa County schools (which received tablets) and Isiolo County schools (which received mobile telephones). The remaining counties and schools in Kenya served as control groups (which received only paper forms). Table 3 relates the outcomes to the key variables and the methods by which the data on the variables were collected for baseline, treatment, and control populations. It is important to note that because there was only one intervention county per technology, and the counties were not selected at random, this is not a truly randomized study of technological impact. Instead, this study is better understood as an engineering, operations research, or implementation effort to provide insights and lessons for the management of digitalization processes going forward. The language of randomized evaluation is used for convenience.


Table 3. Study Variables and Methods

Outcome: Timely EMIS data captured
Variable: Number of days to generate the clean EMIS database from school-reported EMIS returns
Data collection methods: Treatment schools: project EMIS database records. Control schools: MoEST 2015 EMIS records.

Outcome: Reduced error in data reporting
Variable: Proportion of schools reporting less than 10% error in enrollment and in the numbers of teachers and textbooks, based on the sample of schools double checked
Data collection methods: Treatment schools: sample validation survey of schools. Control schools: 2015 EMIS validation analysis.

Outcome: Improved data use by schools
Variable: Proportion of schools using and sharing EMIS data for school improvement planning purposes
Data collection methods: Treatment schools: sample survey of schools. Control schools: not applicable.

3.3 Other Research Questions

This research activity further explored questions regarding the magnitude and cost–benefit tradeoffs of the gains relative to a paper-based system; the implications for data use, particularly at the school level; the underlying reasons for systematic reporting errors in EMIS data; and the system and capacity requirements for districts and counties to implement similar interventions at scale.

An additional question was raised regarding the efficacy of Head Teachers having both paper and electronic forms. The MoEST raised the issue of whether having the paper form before data entry affects the quality or consistency of the data submitted. In Mombasa County, half of the schools were given paper forms before receiving training on how to use the tablet; the other half were given paper forms after they submitted their electronic returns.

3.4 Methodologies

The DERP Team applied two data collection methods to identify constraints and challenges regarding the use of the EMIS applications and feedback reports: (1) lot quality assurance sampling (LQAS)6 of randomly selected schools to assess the accuracy of submitted EMIS data and (2) qualitative assessments using focus groups of Head Teachers and County and Sub-county Officers.

3.4.1 Data Validation through Lot Quality Assurance Sampling

In November 2015, the schools submitted their EMIS data, which were then available for analysis by early January 2016. At the end of January 2016, County and District Quality Assurance Standards Officers were trained on how to use the tablets to conduct data validation activities. The intent of the validation activity was for the Quality Assurance Standards Officers to visit a random selection of 20 schools within each county, modeling a form of LQAS. The primary purpose of the validation activity was to determine the frequency and magnitude of reporting discrepancies, particularly with regard to the number of students, teachers, and textbooks reported by schools. The DERP Team set the threshold for acceptable reporting at ±10% error between the reported data and the validated data. Under the LQAS approach, if five or more schools out of 20 exceeded the ±10% error threshold, the entire county was deemed to be under-performing. Section 4.2.1 of this report details the results of the LQAS validation.

6 Lot quality assurance sampling is a technique that allows for large-scale, rapid monitoring and evaluation of interventions. Each school is assessed against a pre-set threshold of a 10% error rate; if the number of schools exceeding this threshold surpasses the decision rule, the entire “lot” (here, the county) is classified as under-performing. This classification approach allows for a much smaller sample size while still yielding statistically valid results. More information about LQAS is available at https://www.eddataglobal.org/documents/index.cfm?fuseaction=pubDetail&id=602.
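The decision rule described above can be sketched in a few lines of code. This is a minimal illustration of the LQAS classification logic only; the function and variable names are the author's own, not part of any DERP tool, and the sample data are invented.

```python
# Sketch of the LQAS decision rule described above: a county "lot" is deemed
# under-performing if 5 or more of the 20 sampled schools report data that
# deviates from the validated figures by more than +/-10%.
ERROR_THRESHOLD = 0.10   # per-school acceptable relative error
DECISION_RULE = 5        # failures at or above this count fail the lot
SAMPLE_SIZE = 20

def school_error(reported: float, validated: float) -> float:
    """Relative reporting error versus the validated figure."""
    return (reported - validated) / validated

def county_passes(reported: list[float], validated: list[float]) -> bool:
    """Apply the LQAS decision rule to a 20-school sample."""
    assert len(reported) == len(validated) == SAMPLE_SIZE
    failures = sum(
        1 for r, v in zip(reported, validated)
        if abs(school_error(r, v)) > ERROR_THRESHOLD
    )
    return failures < DECISION_RULE

# Illustrative example: 4 schools over-report by 20%, the rest are exact.
validated = [100.0] * SAMPLE_SIZE
reported = [120.0] * 4 + [100.0] * 16
print(county_passes(reported, validated))  # True: 4 failures, below the rule of 5
```

Note that the rule classifies the whole county, not individual schools, which is what allows the small sample size to remain statistically meaningful.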

The secondary purpose of the activity was to assess the capacity of the County and Sub-county systems to implement a validation protocol. Using the same form and guidance as were given to the Head Teachers, the County and Sub-county Officers were mobilized to visit 20 randomly selected schools in each county. The results and findings from this validation activity are provided in Section 4.2 of this report.

3.4.2 Qualitative Analysis and Feedback Sessions

The feedback sessions were designed to obtain feedback about how well the school report cards were received and used by the schools (a report is forthcoming). The qualitative assessments were designed to obtain more nuanced information about the Head Teachers’ understanding of the EMIS form requirements, the quality and use of school record keeping, and the Head Teachers’ adoption of and receptivity to the electronic forms and devices. The qualitative assessment also included feedback from County and Sub-county Officers. MoEST officials accompanied the DERP Team in the field to conduct these feedback sessions and the qualitative assessments.

4. Findings and Observations

This section of the report describes the findings from the LQAS validation activity, presents highlights from the qualitative assessment and feedback sessions, and briefly discusses the cost-effectiveness estimates of using these interventions.

4.1 Completion and Timeliness of Returns

The digital EMIS applications were piloted with 109 schools in Isiolo County and 97 schools in Mombasa County, for a total of 206 schools. By the end of November 2015 (two weeks after initial data collection), across both counties, 163 schools (79%) had completed their EMIS submission, 26 schools (13%) were in progress, and 17 schools (8%) had not yet begun. Table 4 details the timeliness of the EMIS returns for Mombasa and Isiolo Counties.


Table 4. Status of EMIS Submission: Number (and Proportion) of Schools in Each County That Had Submitted, Were in Progress, or Had Not Yet Started by the End of November and the End of December 2015

| Status | Mombasa, Nov 2015 | Mombasa, Dec 2015 | Isiolo, Nov 2015 | Isiolo, Dec 2015 |
|---|---|---|---|---|
| Submitted | 80 (82%) | 88 (91%) | 83 (76%) | 83 (76%) |
| In progress | 13 (13%) | 6 (6%) | 13 (12%) | 13 (12%) |
| Not started | 4 (4%) | 3 (3%) | 13 (12%) | 13 (12%) |

Mombasa County

In Mombasa County, some Head Teachers experienced difficulty using the tablets to complete the EMIS forms for a variety of reasons, as explained below.

• Some Head Teachers turned off mobile data on the tablets while filling in and submitting the forms to “save” data bundles. Consequently, even after a form was submitted, the data never transmitted over the network to reach the server.

• A few Head Teachers immediately switched off the tablet after submitting the form, which did not allow enough time for the tablet to transmit the data to the server.

• Certain Head Teachers exhausted their data bundles prior to submitting their forms. Without data bundles, the tablets were unable to transmit the data. The DERP Team provided a few Head Teachers with additional data bundles to facilitate their transmission of data.

• Some Head Teachers had locked their tablets and traveled on official duty. Their Deputy Head Teachers who came for the feedback sessions in November 2015 were not aware of the tablets, nor whether the data had been submitted.

Isiolo County

In Isiolo County, 12 schools did not use the Mobenzi telephone application to complete the 2015 EMIS survey, and 10 schools started the task but did not complete data entry. The Deputy Head Teachers and Head Teachers who attended the training and feedback sessions gave the following reasons for not submitting the surveys:

• Four Head Teachers had completed the survey, but the data had not synchronized with the Mobenzi server because the telephones had problems accessing the Internet via mobile data bundles.

• One school had a new Head Teacher because of transfers, and the current Head Teacher did not know why the previous Head Teacher never completed the survey.

• Two Head Teachers said that their telephones were stuck in training mode; therefore, they were unable to enter data and submit the final survey, which could only be completed once the survey was in live mode. These teachers confirmed that they had mentioned the issue to the DERP Team and that it was addressed, but the problem resurfaced a few days later.


• Five teachers whose surveys were in progress said they thought that they had submitted the surveys.

• Other Head Teachers could not be reached for comment, and their Deputies said they were never given a chance to interact with the telephones and application.

The challenges experienced by the schools underscore the fact that technological solutions also introduce technological problems. Most schools were greatly aided by the Help Desk and first-responder teams that DERP deployed to support the Head Teachers and Deputies. Nevertheless, at scale, the MoEST should expect and plan for a similar percentage of schools failing to respond because of technical challenges; back-up paper forms and other contingency measures should therefore be built into a future system.

4.2 Validation Results

The validation survey randomly sampled 20 schools in each county to determine the extent of reporting errors. District Education Officers, County and Sub-county Quality Assurance Standards Officers, members of the DERP Team, and staff hired to support this project visited the schools in January and February 2016 to cross-check the school records against select fields in the form, including enrollment, teachers, classrooms, toilets, school revenue, and textbooks. The following subsection discusses the validated enrollment, textbook, and revenue data. The Validation Teams relied on prior-year school records and followed protocols similar to those used by the MoEST/EMIS during the 2014 and 2015 validation exercises for the national paper EMIS returns. Annex B of this report contains the full data tables and analysis.

The findings below are derived from three data sources: the data submitted electronically by Head Teachers, the paper-form data submitted by Head Teachers, and the validated data submitted by County and Sub-county Officers and DERP Team field officers. The analysis compares the electronic and paper data against the validated data. It presumes that the validated data, which were externally observed, reflect the most accurate picture of the schools’ enrollment, textbook, and revenue situation. While the validation protocol was not perfect (the challenges of validating the EMIS data are documented in Section 4.3.2), the results demonstrate enough consistency between the validated and submitted data to warrant confidence in the overall validity and reliability of the validated data.

4.2.1 EMIS Returns and Validation Results for Enrollment, Textbooks and School Revenues

Enrollment and textbook data form the basis of much of the school financing through the FP capitation grant. Table 5 presents, for each county and submission mode, the number of schools (out of the 20 sampled) reporting discrepancies in enrollment, textbooks, and revenue. The shaded cells in Table 5 indicate where five or more of the 20 validated schools exceeded a ±10% error rate, which, according to the LQAS methodology, means the county’s overall data accuracy did not meet expectations.


Table 5. Number of Schools in Each County with Greater than ±10% Reported Error Compared to the Validated Data, by Electronic and Paper Submissions

| Indicator | Mombasa: Tablet | Mombasa: Paper | Isiolo: Telephone | Isiolo: Paper |
|---|---|---|---|---|
| Schools reporting enrollment discrepancies of > ±10% versus validated data | 4 | 5 | 3 | 2 |
| Schools reporting textbook discrepancies of > ±10% versus validated data | 0 | 18 | 2 | 19 |
| Schools reporting revenue discrepancies of > ±10% versus validated data | 8 | 7 | 2 | 1 |

As shown in Table 5 and Figure 2, discrepancies in the reported textbook and revenue figures were far more pronounced than in the enrollment data. In particular, the textbook data submitted on paper showed a much higher degree of discrepancy between what was reported and what was validated. In Mombasa County, 18 out of 20 schools exceeded a ±10% error relative to the validated textbook figures; in Isiolo County, 19 out of 20 schools did. Moreover, both counties over-reported their available textbooks by significant amounts: Mombasa County over-reported by more than 30,000 textbooks (76% of the validated total), and Isiolo County by 20,900 textbooks (nearly 97% of the validated total).
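The per-school tallies that feed Table 5 amount to counting schools whose relative error exceeds the ±10% band. The sketch below illustrates that tally; the function name and the (reported, validated) pairs are illustrative inventions, not the pilot data.

```python
# Hedged sketch of how the Table 5 counts could be tallied: for each school,
# compare a submitted value with the validated value and count schools whose
# relative error exceeds +/-10%. The data pairs below are illustrative only.
def count_discrepant(pairs, threshold=0.10):
    """Count (reported, validated) pairs with relative error beyond threshold."""
    return sum(
        1 for reported, validated in pairs
        if abs(reported - validated) / validated > threshold
    )

# Illustrative textbook counts for three schools: errors of +12.5%, -3.2%, +100%.
textbook_pairs = [(450, 400), (300, 310), (520, 260)]
print(count_discrepant(textbook_pairs))  # 2 schools beyond the +/-10% band
```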

Many Head Teachers reported some confusion regarding how to accurately complete the textbook fields, which could have explained the exceedingly high discrepancy rates for textbooks between the paper-form submissions and the validated data. However, the DERP Team discounts this hypothesis because those same Head Teachers managed to report their textbook numbers accurately in electronic form, and the electronic forms mirrored the paper forms. It is possible that these textbook discrepancies result not from errors in data reporting, but from errors in the data entry or data query processes into and from the national paper EMIS database.


Figure 2. Degree of error reported as a percentage of total validated data for revenue, textbooks, and enrollment.

For reported school revenue, the variability between paper and electronic submissions decreases. In Mombasa County, schools under-reported their income by more than KES 20 million by tablet and KES 25 million by paper. In Isiolo County, by contrast, there was little difference in under-reported income between telephone and paper (approximately KES 9 million for each). Far fewer Isiolo schools exceeded a ±10% reporting error, indicating that, overall, Isiolo County is faring much better than Mombasa County in reporting accurate revenue.

The percentage of under-reported revenue in Figure 2 points to a potentially larger issue: under-reporting obscures the amount of money actually available to schools. Figures 3 and 4 show how each revenue category was reported, by county and by form. Figure 3 reveals that Mombasa County schools tended to obscure their revenue across the categories in near-equal proportions, with FP revenue reported least accurately. The only apparent difference is that schools reported less “other” revenue by paper submission than by tablet. In contrast, Figure 4 shows that Isiolo County schools under-reported their FP revenue but correctly reported (100%) the other revenue categories in both forms of submission.
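The "correctly reported versus obscured" split shown in Figures 3 and 4 reduces to a simple ratio against the validated revenue. The sketch below illustrates that calculation; the function name and KES figures are illustrative assumptions, not the pilot data.

```python
# Sketch of the "obscured" share used in Figures 3-4: the fraction of validated
# revenue that does not appear in the school's reported figure. Values are
# illustrative, not the pilot data.
def obscured_share(reported: float, validated: float) -> float:
    """Fraction of validated revenue not reflected in the reported figure."""
    return max(0.0, (validated - reported) / validated)

validated_fp = 4_000_000   # illustrative validated FP revenue (KES)
reported_fp = 3_000_000    # illustrative reported FP revenue (KES)
print(f"obscured: {obscured_share(reported_fp, validated_fp):.0%}")  # obscured: 25%
```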

[Figure 2 chart: horizontal bars on a 0–100% scale for Mombasa Tablet, Mombasa Paper, Isiolo Phone, and Isiolo Paper, with series for revenue under-reported, textbooks over-reported, and enrollment over-reported.]


Figure 3. Income reported by Mombasa County schools: Digital (d) versus paper (p) forms.

Figure 4. Income reported by Isiolo County schools: Digital (d) versus paper (p) forms.

4.2.2 Estimated Efficiency Gains from Reducing Error of Enrollment and Textbook Data Reported

The information in Figure 2 is important because reduced over-reporting of enrollment and textbooks has a direct impact on the efficiency of the FP capitation grant to schools. Likewise, reduced under-reporting of enrollment and textbooks will lead to a more equitable allocation of resources to schools that are not receiving the funds and textbooks they need. Table 6 outlines the potential efficiency and equity gains realized from reduced over- and under-reporting, respectively.

[Figures 3 and 4 charts: stacked bars on a 0–100% scale for FP, fees, and “other” revenue by digital (d) and paper (p) submission, split into correctly reported (%) and obscured (%).]


Table 6. Efficiency and Equity Gains from Reduced Errors in Enrollment and Textbook Reporting (Note: Currency Is Presented in Kenyan Shillings [KES])

| Total Gains | Mombasa County | Isiolo County |
|---|---|---|
| Potential efficiency gains from correcting for enrollment (KES) | 276,624 | −459,684 |
| Potential efficiency gains from correcting for textbooks (KES) | 5,990,600 | 4,015,000 |
| Potential total efficiency gains, enrollment plus textbooks (KES) | 6,267,224 | 3,555,316 |
| Average efficiency gains per school (KES) | 313,361 | 177,766 |
| Potential gains as a percentage of the total validated enrollment and textbook data | 21.7% | 19.2% |

The KES figures in Table 6 are based on the MoEST’s policy for allocating FP capitation funds for enrollment (KES 1,356 per student) and textbooks (KES 200 per textbook). Based on these figures, the MoEST might realize substantial gains from reducing the over-reporting of textbooks: upward of KES 76 of every KES 100 spent on textbooks in Mombasa County, and KES 93 of every KES 100 in Isiolo County. The combined gains for the two counties total KES 9,822,540, or 20.3% of the total validated enrollment and textbook allocation for the 40 schools. Extrapolating the average efficiency gains per school county-wide yields a staggering KES 30,396,036 in potential efficiency gains in Mombasa County and KES 19,376,472 in Isiolo County.
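The arithmetic behind Table 6 follows directly from the allocation rates just cited. The sketch below is a back-of-envelope check: the rates come from the text above, while the over-reported counts (204 students, 29,953 textbooks for Mombasa) are back-calculated from Table 6's KES figures rather than stated in the report.

```python
# Back-of-envelope check of Table 6, using the MoEST allocation rates cited
# above: KES 1,356 per student and KES 200 per textbook. The over-reported
# counts used in the example are implied by (back-calculated from) Table 6.
RATE_PER_STUDENT = 1356    # KES per enrolled student (FP capitation)
RATE_PER_TEXTBOOK = 200    # KES per textbook

def capitation_gain(over_reported_students: int, over_reported_textbooks: int) -> int:
    """KES recovered by correcting over-reported enrollment and textbooks."""
    return (over_reported_students * RATE_PER_STUDENT
            + over_reported_textbooks * RATE_PER_TEXTBOOK)

# Mombasa County: 204 students and 29,953 textbooks over-reported (implied).
print(capitation_gain(204, 29_953))  # 6267224, matching Table 6's Mombasa total
```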

The DERP Team cannot say conclusively that the digital applications were primarily responsible for the improved accuracy, particularly for enrollment data. However, these findings underscore the need for the MoEST and Counties to conduct routine, random validation checks of reported EMIS data in a timely manner. This critical validation activity not only illuminates the financial impact that data discrepancies have on the equitable and efficient allocation of resources, but also serves as an important audit function that checks schools’ tendencies to willfully or casually misreport their data.

4.2.3 Simplified Cost-Effectiveness Analysis

The cost-effectiveness analysis compares the relative costs and benefits of the digital applications against the paper EMIS forms. The indicator is the gain in accuracy for enrollment, textbooks, and school revenues. As illustrated in Table 7, the benefits from these gains can be monetized as indicators of efficiency and equity based on the magnitude of the respective over- and under-reporting of resources. Table 7 details the simplified cost-effectiveness estimates for Isiolo and Mombasa Counties. Based on these calculations, the gains for textbooks and enrollment are KES 3.48 million for Mombasa County and KES 2.06 million for Isiolo County. When under-reported revenue is added, the gains jump to KES 7.5 million in Mombasa County but decrease to KES 1.9 million in Isiolo County.


Table 7. Cost-Effectiveness Estimates for Mombasa and Isiolo Counties

| Cost Effectiveness | Textbook + Enrollment: Mombasa | Textbook + Enrollment: Isiolo | Textbook + Enrollment + Revenue: Mombasa | Textbook + Enrollment + Revenue: Isiolo |
|---|---|---|---|---|
| Digital costs (d) | 2,789,255 | 1,496,037 | 2,789,255 | 1,496,037 |
| Efficiency gains (g) | 6,267,224 | 3,555,316 | 10,370,261 | 3,453,488 |
| EMIS paper costs (breakeven) | −3,477,969 | −2,059,279 | −7,581,006 | −1,957,452 |
| EMIS paper cost per school (breakeven) | −173,898 | −102,963 | −189,525 | −97,872 |

This is a simplified cost-effectiveness analysis (as opposed to a more rigorous cost-benefit analysis) because it focuses narrowly on direct implementation costs and does not account for associated labor, personnel, or development costs. For the digital EMIS applications, the relevant costs include only those related to equipment procurement and distribution, training and capacity building, data bundles and Internet connectivity, field transportation, and communication. Recurrent per-school costs, such as data bundles and the SMS gateway, were costed for three months. The costs associated with the national paper EMIS, in comparison, include printing and distribution of the forms, other transport costs, outreach and sensitization, and data entry. Section 7 of this report recommends that the MoEST commission a much more rigorous cost-benefit analysis of an integrated, digital EMIS that estimates discounted benefits over a 10-year horizon with capital and recurrent costs, including manpower efficiencies gained from systems integration, from schools completing forms through to centralized systems administration.
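The breakeven rows in Table 7 can be reproduced with simple arithmetic: the "EMIS paper cost (breakeven)" is the digital cost net of efficiency gains, divided by the 20 validated schools for the per-school figure. The sketch below uses the published Mombasa figures; the function name is the author's own shorthand.

```python
# Sketch of the breakeven arithmetic behind Table 7: a negative breakeven
# value means the monetized efficiency gains exceed the digital
# implementation cost. Figures below are the published Mombasa values
# (textbook + enrollment scenario).
def breakeven(digital_cost: float, efficiency_gain: float, n_schools: int = 20):
    """Return (total breakeven paper cost, per-school breakeven) in KES."""
    total = digital_cost - efficiency_gain
    return total, total / n_schools

total, per_school = breakeven(2_789_255, 6_267_224)
print(round(total), round(per_school))  # -3477969 -173898, matching Table 7
```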

4.2.4 Does the Order of Completing the Paper or Electronic Forms Have an Impact on the Accuracy of the Data Report?

This question matters for deciding whether a future digital school information system should retain a companion paper protocol, if such a protocol proves effective in assuring greater data accuracy. Tables 8 and 9 compare the validation results of schools that completed the paper forms first (P1), before their electronic submissions, against schools that completed the paper forms second (P2), after submitting their data electronically.

Although the sample is not large enough to support statistically conclusive findings, completing the paper forms first does appear to improve the overall accuracy of the data, particularly the enrollment data. The standard deviation of the discrepancy rate is far higher for the schools that submitted their paper forms after the tablet (P2), indicating a greater fluctuation of error.
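The spread statistic used in this comparison is the sample standard deviation of per-school discrepancy rates. The sketch below shows the comparison with invented rates (the real rates are summarized in Tables 8 and 9, not listed individually in this report).

```python
# Sketch of the P1 vs P2 spread comparison: the standard deviation of
# per-school discrepancy rates signals how erratic reporting is. The rate
# lists below are illustrative, not the pilot data.
from statistics import stdev

p1_rates = [-0.05, 0.00, 0.02, 0.10, 0.01]   # paper form completed first
p2_rates = [-0.85, 0.00, 0.10, 0.02, 0.01]   # paper form completed second

print(stdev(p1_rates) < stdev(p2_rates))  # True: P2 reporting fluctuates more
```

A single extreme under-report (like the −85.87% minimum in Table 8's P2 column) is enough to inflate the standard deviation dramatically, which is why the spread, not just the mean, is worth reporting.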


Table 8. Enrollment Validation Results for Schools Completing Paper Forms First (P1) versus Paper Forms Second (P2)—Mombasa County Only

| Mombasa County (20 schools sampled) | P1 | P2 |
|---|---|---|
| Enrollment discrepancy (schools reporting any error, out of 20 validated) | 7/20 | 9/20 |
| Enrollment discrepancy > ±10% (schools out of 20 validated) | 1/20 | 3/20 |
| Enrollment discrepancy range (minimum percentage of validated total) | −5.14% | −85.87% |
| Enrollment discrepancy range (maximum percentage of validated total) | 10.37% | 10.92% |
| Standard deviation of discrepancy range | 4.25% | 28.8% |

Table 9. Textbook Validation Results for Schools Completing Paper Forms First (P1) versus Paper Forms Second (P2)—Mombasa County Only

| Mombasa County (20 schools sampled) | P1 | P2 |
|---|---|---|
| Textbook discrepancy (schools reporting any error, out of 20 validated) | 2/20 | 2/20 |
| Textbook discrepancy > ±10% (schools out of 20 validated) | 0/20 | 0/20 |
| Textbook discrepancy range (minimum percentage of validated total) | −1% | −8% |
| Textbook discrepancy range (maximum percentage of validated total) | 0% | 2% |
| Standard deviation of discrepancy range | −0.5% | 3% |

4.2.5 Summary Conclusions from Validation Findings

The validation results reveal general patterns that are instructive and actionable. First, textbook and resource data submitted by schools tend to be vastly over-reported, more so on paper than on electronic forms, and therefore require disciplined validation and cross-checking. Second, Head Teachers tend to submit more accurate electronic returns when working from a completed paper form. Third, these findings should be treated with some caution because the results compare only the validated electronic and paper-form submissions of the same schools and counties; a further comparison against the validated returns in the national database is needed to determine how these counties compare in the accuracy of reported data. Lastly, the validation process itself raised concerns about the quality of the validated data, in terms of both the timeliness of its capture and the accuracy of the reports. This issue is discussed further in Section 3.4 of this report.

4.3 Findings from the Qualitative Assessment

The qualitative assessment explored how the Head Teachers managed the technology and understood the EMIS form fields. The DERP Team engaged the Head Teachers in focus group-style sessions and conducted open-ended interviews following a guiding set of questions. A total of six schools in each county were randomly selected to participate in the qualitative assessment exercise. During the exercise, the County and Sub-county Officers were also visited to provide their input on the pilot activity. The discussions involved Head Teachers and their Deputies at the school level, and District Education Officers and Quality Assurance Officers at the Sub-county level.

4.3.1 Qualitative Assessment Areas of Inquiry

The qualitative assessment interviews were guided by the following seven areas of inquiry:

• Head Teacher and Deputy Head Teacher interviews:

− Process for completing the paper and digital EMIS forms

− Cognitive knowledge of the paper or digital form fields and data requirements

− Quality of school records management

− Experiences (successes and/or challenges) relating to their use of the digital EMIS applications and devices

− Practices and behaviors relating to sharing and using report cards and dashboards with key stakeholders and for school planning.

• County and Sub-county Officials interviews:

− Experiences relating to the data validation exercises

− Experiences relating to their data use.

4.3.2 Observations from the Qualitative Assessment

Most of the Head Teachers indicated a general understanding of the digital EMIS surveys, with a clear grasp of the EMIS and of the data required for each section. When comparing the digital EMIS with the paper surveys, the Head Teachers said that they found the digital surveys more efficient. In both counties, completion of both the digital and the paper EMIS surveys was sometimes delegated: of the 12 schools sampled, the Deputy Head Teachers completed the surveys in three schools, Head Teachers and their Deputies completed them jointly in three others, and the Head Teachers alone completed them in the remaining six.

Data validation by the Sub-county Officers proved to be the biggest challenge. Most of the Sub-county Officers interviewed were unable to visit all five schools assigned to them, attributing this to busy schedules that gave priority to their other assigned duties. The Sub-county Officers’ feedback made clear that they did not have a firm understanding of how the applications worked, despite having been trained and having Digital Divide Data’s Field Supervisors available for technical support.

4.3.3 Recommendations

Summaries of the challenges and recommendations are provided in Table 10 for Head Teachers and in Table 11 for County and Sub-county Officers.


Table 10. Challenges for Head Teachers and Recommendations from Qualitative Assessment Observations

| Challenge | Recommendation |
|---|---|
| Many Head Teachers had competing demands and limited time to interact with the system; therefore, they were not as familiar or comfortable with navigating the system or pulling the reports. | Head Teachers should be encouraged to use the system more frequently to maintain knowledge and familiarity with the applications. For a future rollout, refresher training should be provided to Head Teachers and Deputy Head Teachers. |
| Many Head Teachers delegated to Deputies and provided only a final quality assurance review. A few of those Deputies did not have privileged access to the devices or records, which compromised their ability to complete the forms. | Roles and responsibilities for completing the paper and digital forms, including appropriate delegated tasks and assignments, should be clarified. |
| In some instances, Head Teachers were unsure of what data should be entered in which field. This was particularly true for textbooks and repeaters, where the MoEST’s policies on official reporting led to inconsistent or confusing guidance. | Pop-up help bubbles could be provided in the digital form on the tablet or telephone to guide the person entering the information. Manuals on how to use the electronic system and how to interpret the EMIS form fields for each section should also be provided to all Head Teachers and their Deputies for reference. |
| Many Head Teachers were apprehensive about constantly interacting with the devices because they feared damaging or losing them. | Contracts should be established with Head Teachers that outline their co-liability and responsibility for loss of or damage to the device. Insurance covering part of the device cost should also be provided to alleviate the Head Teacher’s full liability. |
| Retrieving data from school records and updating data electronically were not efficient; retrieving, recalling, and editing submitted data were cumbersome processes. | Improved school record templates could be prepared for easier management and uniformity of records maintained by schools. Data editing functions within the application should be improved. |

Table 11. Challenges for County and Sub-County Officers and Recommendations from Qualitative Assessment Observations

| Challenge | Recommendation |
|---|---|
| One training session was provided, but the County and Sub-county Officers sought additional guidance after the training. | Continuous field support to County and Sub-county Officers is needed to carry out the data validation process for the digital and paper EMIS data collection. |
| Most of the County and Sub-county Officers indicated that it was difficult to carry out the validation process because it competed with their other duties. | Clarified roles and expectations for validation and quality assurance are required, either described explicitly in job descriptions or set out within the mandates of the county and sub-county offices. |
| The County and Sub-county Officers did not have an adequate understanding of the process and purpose of the digital and paper EMIS data. | Expectations should be clarified for counties and sub-counties regarding the processes and protocols to follow, in particular the purpose and use of the EMIS data and the validation requirements. |
| The County and Sub-county Officers tend to become overwhelmed because of a lack of time and personnel. | All relevant officers from the county or sub-county could be trained to support the data validation tasks. For example, Sub-county Data Clerks could be a major asset if properly trained because their time is more flexible. |

4.4 Findings from the Report Card Feedback Sessions

School report cards display visually informative graphics that compare data between schools, districts, and counties against the national recommended standards. This comparison spans a number of parameters, including, but not limited to, the student-to-teacher ratio, the student-to-toilet ratio for boys and girls, teacher qualifications, and school performance. The 2014 EMIS data shared by the MoEST and the 2015 EMIS data captured through this activity by using mobile telephones and tablets were used to generate these report cards.

Figure 5 shows the dynamic performance scorecard, generated on the tablets only, that Head Teachers were to use to interpret their EMIS performance data. Head Teachers and other users navigate interactively among the following views:

• Section A provides detailed analytics about a primary section, which includes enrollment, new students, and repeaters.

• Section B is used to view detailed analytics about the number of teachers based on specific qualifications.

• Section C provides detailed analytics about examinations.

• Section D is used to view detailed analytics about textbooks.

• Section E provides detailed analytics about classrooms.

• Section F is used to view detailed analytics about metrics that affect the school’s decision-making process.

Figure 6 shows how indicators are displayed on the Mobenzi mobile telephone application report card. The report card allows schools to compare their key performance ratios with the average of the sub-county and county schools.


Figure 5. School performance dashboard view for tablets only.

Figure 6. School report card view for mobile telephones only.

School, county, and sub-county officials could use a decision-support tool in their tablet application to maximize the utility of the data for equitable resource allocation. The DERP Team developed a number of template decision-support interfaces (e.g., Figure 7) and presented them to the EMIS Technical Committee. These decision-support interfaces provide many benefits. For example, the county and sub-county officials can see in real time which schools require intervention, can compare schools, and can devise specific strategies on how to improve performance.


Figure 7. Example of the decision-support tool visualizer.

In addition, county and sub-county officials could use the County Dashboard to view the performance of the schools in their county. Figure 8 shows a report that is generated for the County Dashboard card view, which county and sub-county officials displayed on their tablets. This report allowed the officials to access the information quickly to view data regarding the higher performing and lower performing schools, including those at risk that required immediate support or attention. This feature is critical because most county and sub-county offices are under-staffed and do not have enough resources; therefore, the officials need to prioritize their scarce time and transport allowances to the schools needing the highest amount of support.
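The prioritization logic the County Dashboard supports can be sketched as follows. This is a hypothetical illustration: the performance scores, threshold, and school names are invented, and the actual dashboard's scoring criteria would be defined by the MoEST.

```python
# Hypothetical sketch of County Dashboard prioritization: schools are ranked
# by a performance score, and those below a threshold are flagged as "at risk"
# so under-staffed county offices can target their scarce visit time.
# Scores, names, and the threshold are invented for illustration.

AT_RISK_THRESHOLD = 50.0

schools = [
    {"name": "School A", "score": 72.5},
    {"name": "School B", "score": 44.0},
    {"name": "School C", "score": 61.0},
    {"name": "School D", "score": 38.5},
]

ranked = sorted(schools, key=lambda s: s["score"], reverse=True)
at_risk = [s["name"] for s in ranked if s["score"] < AT_RISK_THRESHOLD]

print("Highest performing:", ranked[0]["name"])
print("At risk (visit first):", at_risk)
```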


Figure 8. The County Dashboard card view.

The purpose of this study was to test whether these feedback tools aided schools and county and sub-county officials in making informed decisions and whether this information enhanced transparency and accountability in the system. Brief discussions of the findings and recommendations are presented in Sections 4.4.1 through 4.4.3 of this report. A more detailed discussion of the features and functionality of these reports is provided in Annex C of this report.

4.4.1 Successes Related to the Use of the Report Cards

Most of the Head Teachers reported that the report cards were useful. The general belief among the Head Teachers was that the report cards are easy to understand and support meaningful conclusions, compared with the raw EMIS data that they are asked to provide through the EMIS survey. According to the Head Teachers during the training sessions, the report cards offer the following benefits:

• Identify gaps in school needs, which is helpful information to know when soliciting for funds and other forms of assistance from donors and others.

• Help track school performance over the years with the aim of ensuring continuous improvement. Some areas of focus include infrastructure, student performance, sanitation, teacher performance, and school attendance by gender.

• Provide the ability to share graphs with different government agencies such as the TSC and the MoEST so they can ensure that there is equitable distribution of resources, both human and capital.


• Conduct comparisons to determine which classes are disadvantaged and where to focus efforts for improvement.

• Compute school completion rates for different school years and examine possible reasons for the growth or regression of student numbers in different classes.

• Complement the School Improvement Plan and provide an accurate picture of areas in need of improvements.
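The completion-rate tracking the Head Teachers describe could be computed along these lines. This is a simplified sketch with invented cohort figures; the actual EMIS calculation may define the cohort differently.

```python
# Simplified completion-rate sketch: pupils completing the final standard as a
# share of the cohort that entered Standard 1. Figures are hypothetical.

cohort_entered_std1 = 120  # pupils who entered Standard 1 in the cohort year
completed_std8 = 90        # pupils from that cohort who completed Standard 8

completion_rate = 100 * completed_std8 / cohort_entered_std1
print(f"Completion rate: {completion_rate:.1f}%")
```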

The Head Teachers participated in a lengthy discussion about the current performance of schools based on the report cards. Some Head Teachers were already identifying areas of weakness in their schools, whereas others realized that their schools, sub-counties, and counties were faring better than anticipated, particularly in specific areas. For example, Head Teachers in Isiolo County noted that their county had a fairly good number of primary teachers, with an average teacher-to-student ratio of 1:37 compared to the recommended ratio of 1:40. Other areas that generated lengthy discussions focused on the numbers of teachers, toilets, and textbooks in schools, as well as high repetition rates, particularly for Standard 1 students. The Head Teachers also participated in a lengthy discussion about how to improve the equity of resources across schools in the county.

4.4.2 Challenges Related to the Use of Report Cards

The Head Teachers also identified and discussed many challenges regarding the telephone report cards in Isiolo County and the tablet report cards in Mombasa County. These challenge areas are discussed in the remainder of this subsection.

Mobile Telephone Report Cards (Isiolo County)

• Four schools could not access report cards on their telephones; additional support to fix the bugs required an add-on to the service agreement with Mobenzi.

• Approximately five Head Teachers had telephones whose screens and casings had broken while in their possession, although the devices remained functional. One telephone was seriously damaged; it still worked, but the damage made it quite difficult to view the report cards.

• Some Head Teachers believed that it would be difficult to train members of the boards of management on how to use the report cards because a majority of the members had limited literacy skills.

• Some Head Teachers, especially in Garbatulla and Isiolo Sub-counties, did not give mobile telephones to their Deputies, even just for training purposes. Almost one-fourth of the Deputies who attended the training sessions in the two sub-counties did not have mobile telephones.

Tablet Report Cards (Mombasa County)

• Despite the presentation of the objectives and goals underlying the training, a few Head Teachers could not see the relevance of the report card training. This issue was demonstrated by a few Head Teachers who submitted questions to the DERP Team that included, “Why is this necessary or important?”


• The Head Teachers reported some confusion regarding whether to capture numbers of stored textbooks in the schools or the actual textbooks in use. The participants disagreed about this issue; therefore, a solution is needed to avoid a disparity in data collected from schools regarding textbooks.

• There was a question about students who joined Form 1 from specific schools and how to capture that information on the report cards. This question spurred considerable debate that ended without a conclusion, because some Head Teachers found it reasonable and feasible to track the students, whereas others found it a difficult task and therefore difficult to report on.

• Lastly, there were some technical glitches with the 2014 EMIS data that were incorporated into the report cards. Some 2014 school data were missing from the report cards, and some Head Teachers did not agree with the accuracy of the data that were presented for their schools. The Head Teachers claimed that they had fed different information to the 2014 EMIS survey; therefore, the information that the tablets and report cards portrayed was contrary to the actual situation in their schools.

4.4.3 Recommendations and Suggestions for Report Card Improvement

The concept of school report cards was new to most Head Teachers; therefore, the sessions prompted considerable positive discussion about the accuracy of the information submitted and what the Head Teachers wanted from the report cards now and in the future (if adopted). The Head Teachers provided many suggestions about a variety of topics, including enrollment, examination results, textbooks, classrooms, and toilets. These suggestions are discussed in the remainder of this subsection.

Enrollment

• Graphics should be provided that show enrollment per class in comparison with previous years. Some Head Teachers wanted information for more years and not just 2014 and 2015.

• Graphics were also requested that showed enrollment and characteristics of students with special needs.

• The reports should capture data separately for boys and girls that include the numbers of repeaters, transfers in and out of school, drop-out rates, and completion rates.

• Graphics should be provided that list the numbers of teachers and their qualifications.

• Graphics were also requested that showed the ratio of male to female teachers. The Head Teachers believed that a single aggregate figure could be misleading because it might show enough teachers overall while one gender was missing or in excess.

Examination Results

• Head Teachers want to see more analytics about students’ performance on examinations for multiple years, as well as mean scores per subject for both boys and girls so that they can obtain a clear picture of performance for both genders.

Textbooks

• Textbook ratios should be included and broken down by specific subject (e.g., reading, mathematics, science) and class for more clarity.


• The graphics should distinguish supplementary books from the other types of books.

• The graphics should also show comparisons between worn out books and those that are in good condition.

Classrooms

• The reports should also show the specific conditions of classrooms as captured on the EMIS form: permanent, temporary, not in use, and classes without a classroom.

• The report should also provide ratios of students to classrooms.

• Other amenities within the classrooms (e.g., desks) should also be presented in the report cards. The Head Teachers believed that information about the classrooms alone is incomplete because a classroom without desks remains a big challenge.

Toilets

• Some Head Teachers believed that the report cards were focusing too much on students; therefore, they requested that the report cards also capture ratios of teachers to toilets because this is also a challenge in some schools.

Some of the Other Suggestions and Recommendations Raised

• The report cards should have provision for a small comment section so teachers can include additional information to support data and performance indicators that may warrant further clarification.

• The report cards should have an option so the information captured on the tablets and telephones can be printed. Relying exclusively on digital displays limited their sharing with external school stakeholders, particularly because Head Teachers are accustomed to displaying school performance information on the walls of their offices and buildings.

• The report cards should not be limited only to the EMIS survey; they should also provide other important information, such as performance on examinations broken down by gender, grade, and subject.

• The need to work together and share equipment must be emphasized to the Head Teachers so that the Deputy Head Teachers can also be empowered to fully experience the benefits and challenges of the technology.

4.4.4 Overall Conclusions

There is no question that the report cards added value for the majority of the respondents who participated in the feedback sessions and qualitative assessments. Nevertheless, for these feedback tools to gain a foothold in the routine work practices of the county, sub-county, and school officials, concerted efforts must be made to support the use of the report cards at each level. For example, such efforts could include ensuring that the content and format respond to the decision-making needs of the end user and allow for the level of transparency required for horizontal (school to community) and vertical (school to sub-county and county) accountability within the system. This effort should be further supported by a change management strategy that articulates how schools, sub-counties, and counties are expected to share and use these feedback tools as part of their routine work practices. Lastly, the MoEST should consider investing in multi-channel strategies for sharing school performance data, such as Web-based and paper-based report cards and posters, to maximize reach, use, and relevance to different stakeholders within the system.

5. System Requirements

The system requirements are divided into two overarching capacities needed to support a school-based information system at scale: (1) the technical equipment and infrastructure and (2) the human resources.

5.1 Data Hosting Technical Requirements

For a school-based information system, the DERP Team recommends the following infrastructure for systems support at scale:

• A physical host server that is capable of hosting a number of guest operating systems. The guest operating systems will require a minimum of 64 gigabytes (GB) of RAM (random access memory) and 12 virtual central processing units (vCPUs).

• Host Operating System (Microsoft Windows Server 2008 R2 Datacenter Edition) with Microsoft Hyper V installed.

• The appropriate firewall software.

• Sufficient storage capacity (minimum of 1 terabyte [TB] recommended).

• A public static Internet Protocol (IP) address, with ports 80/443 routed to terminate on a specified virtual machine instance.

• Internet availability (break out) from the guest operating systems. The recommendation is to maintain data on a cloud-based server that is managed by an external vendor such as Amazon Cloud Computing Services. As an alternative, the MoEST could consider a hybrid approach in which a third party still hosts the platforms, but the EMIS data are pushed to a Ministry-hosted local server with the previously mentioned infrastructure.
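As a rough illustration, a procurement checklist could verify a candidate host against the minimums listed above (64 GB of RAM, 12 CPUs, 1 TB of storage). The inventory record in this sketch is hypothetical, not a description of any actual server.

```python
# Sketch: validate a candidate host against the minimum specifications listed
# above. The "host" record is a hypothetical inventory entry for illustration.

MINIMUMS = {"ram_gb": 64, "cpus": 12, "storage_tb": 1}

host = {"ram_gb": 128, "cpus": 16, "storage_tb": 2}

# Collect any attribute where the candidate falls below the minimum.
shortfalls = {k: (host[k], v) for k, v in MINIMUMS.items() if host[k] < v}
print("Host meets minimums" if not shortfalls else f"Shortfalls: {shortfalls}")
```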

5.2 Centralized Human Resources Capacity and Skills Requirements for System Administration

At the central level, the MoEST will need to hire or outsource the following key system administrative functions and skills:

• Database and network administrators for relational and non-relational database architecture and maintenance of databases

• Software engineering skills, including, but not limited to, PHP, Java, and others for making modular updates to the form fields and dashboards and for developing application programming interfaces to allow for data connectivity and systems integration

• A cybersecurity specialist to manage firewall, virus, and malware software, including upgrades, pushes, and patches

• Advanced analytics skills for querying and reporting on large-scale data sets.

Page 43: EdData II Data for Education Research and …...EdData II Data for Education Research and Programming (DERP) in Africa Using Data for Accountability and Transparency in Schools Big

DERP in Africa—Big Data Report 35

5.3 Decentralized Human Resources Capacity and Skills Requirements for System Support and Maintenance

Every county (and preferably every sub-county) will need to have a trained system-support EMIS Officer in place and available to troubleshoot potential bugs and address back-end system configuration issues. Such issues could include devices being unable to connect to the Internet, applications not submitting data, and the re-registration and population of school identification numbers. These dedicated EMIS Officers would have more functional responsibility for system support than just entering, monitoring, or reporting data. In addition to serving in the first-responder Help Desk function, the EMIS Officers would assist the County and Sub-county Quality Assurance Standards Officers, the District Education Officers, and school communities with accessing the feedback reports and dashboards and with making meaningful use of the data. The EMIS Officers do not need to be Information Technology professionals or data scientists; instead, they should be educated, technology-savvy individuals who are highly familiar and comfortable with managing Information and Communication Technology (ICT) applications and supporting system end users. The EMIS Officers could also be mobilized to help validate the data, complementing the work of the County and Sub-county Quality Assurance Standards Officers and District Education Officers by flagging schools that are at a higher risk for audits and validation.
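One way such risk flagging might work, purely as an illustration, is to screen for implausible year-over-year enrollment changes. The threshold, school names, and figures below are invented; the actual risk criteria would be set by the MoEST.

```python
# Illustrative risk-flagging sketch: schools whose reported enrollment changes
# by more than 25% year over year are flagged for validation visits.
# The threshold and all figures are hypothetical.

JUMP_THRESHOLD = 0.25

reports = [
    {"school": "School A", "enrollment_2014": 400, "enrollment_2015": 420},
    {"school": "School B", "enrollment_2014": 300, "enrollment_2015": 460},
    {"school": "School C", "enrollment_2014": 250, "enrollment_2015": 255},
]

flagged = [
    r["school"]
    for r in reports
    if abs(r["enrollment_2015"] - r["enrollment_2014"]) / r["enrollment_2014"]
    > JUMP_THRESHOLD
]
print("Flag for audit:", flagged)
```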

5.4 License Requirements

5.4.1 Mobenzi (Telephone) Application

Mobenzi is an enterprise-grade solution that is used in more than 40 countries. The Mobenzi platforms are provided as "software as a service," priced on a monthly or annual per-device, per-user basis. At scale (more than 1,000 users), the cost for total platform usage would decrease to KES 6,200 per device (school) per year. At the national scale, Mobenzi would be open to further price negotiation based on volume.

Additional functionality (e.g., additional forms, reports, content, functionality) could be deployed without an increase in usage fees (however, additional once-off configuration fees would apply). From this perspective, the more the software is used, the greater the return on investment.
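At the quoted at-scale rate, the annual licensing cost scales linearly with the number of schools, as the quick sketch below shows. The deployment size used here is hypothetical, not an actual count of Kenyan schools.

```python
# Annual Mobenzi licensing cost at the quoted at-scale rate of KES 6,200 per
# device (school) per year. The number of schools is hypothetical.

RATE_KES_PER_SCHOOL_PER_YEAR = 6_200

schools = 5_000  # hypothetical deployment size
annual_cost_kes = schools * RATE_KES_PER_SCHOOL_PER_YEAR
print(f"Annual licensing cost for {schools:,} schools: KES {annual_cost_kes:,}")
```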

If the MoEST so desires, it can engage the Mobenzi Team to replicate the Mobenzi deployment at scale. If so, the recommendation is for Mobenzi to engage with the MoEST via a local partner on the ground in Nairobi. Mobenzi would work with the local partner to build its capacity and would serve primarily as the technology service provider and second-line support. As has been done with Digital Divide Data, this partner would assist with requirements elicitation, facilitation of roll-outs, training, first-line support, and troubleshooting throughout the duration of the project.

If the MoEST is to host Mobenzi on the local servers in Nairobi, then the following third-party licenses are required to operate the Mobenzi system locally and host the solution in a dedicated internal environment:

• Microsoft Hyper V

• Microsoft SQL Server 2008 R2 (Standard Edition)

• Microsoft Windows Server 2008 R2 Datacenter Edition (for virtualization host)

• Microsoft Windows Server 2008 R2 Standard


• Secure Sockets Layer (SSL) certificates

• New Relic server monitoring (recommended).

5.4.2 IBM School Census Hub Application

IBM Corp.'s School Census Hub application does not require a license to deploy. However, if the MoEST desires to deploy this system at scale, then the DERP Team recommends engaging the services of the IBM Corp. Research Laboratory in Kenya to provide systems support for the initial deployment and rollout.

5.5 Supervision and Validation System Support Needs

The MoEST will need to hire and train technical support staff and supervisors at the sub-county level. These supervisors should be based in the sub-county offices because this will ensure prompt responses to issues raised and make it easier for them to access the schools. The number of technical issues should gradually decline over time as more Head Teachers learn how to use and troubleshoot the system. The main role of the Sub-county Supervisor will be to monitor the use of the devices and provide support in each sub-county. These individuals would also serve as technical support for the teachers in the field.

If the validation process is to be successful, then the MoEST must provide some capacity-building opportunities in each sub-county for the District Education Officer and Quality Assurance Standards Officer. In addition, each of these officers must have a tablet with the validation tool installed. The tablets require data bundles so that the officers can update or submit the data. If an EMIS Officer is hired for each county and sub-county, then this person could help implement the validation activities. Furthermore, the Quality Assurance Standards Officers and District Education Officers can rely on the MoEST–hired EMIS Officers based at their sub-counties for technical support, including assistance with any technical challenges they might experience during data validation.

Equally important is the hiring or addition of another member to the Validation Team. Adding a team member would make the process more efficient; during the study, a majority of county and sub-county officials were overwhelmed by the task of validating the 20 schools assigned to them and required additional assistance from the DERP Field Supervisors to complete the validation effort. Alternatively, the MoEST may not need to hire more staff; instead, the Ministry could train the Data Clerks in the counties to assist with data validation. Additionally, the County and Sub-county Officers in arid and semi-arid (rural and remote) areas require transport during school visits to accommodate mobile schools; a vehicle should be provided for their use.

6. Strengths, Weaknesses, Opportunities, and Threats Analysis

To assess the likely uptake of a school-based information system at scale, the DERP Team conducted a strengths, weaknesses, opportunities, and threats (SWOT) analysis. The SWOT analysis is summarized in Table 12 and described further in the remainder of this section of the report.

Page 45: EdData II Data for Education Research and …...EdData II Data for Education Research and Programming (DERP) in Africa Using Data for Accountability and Transparency in Schools Big

DERP in Africa—Big Data Report 37

Table 12. Summary of the SWOT Analysis

Strengths

• Political will and leadership
• Familiarity with the EMIS questionnaire and/or protocol
• DERP demonstration of school-based information systems
• TSC and Kenya National Examinations Council demonstrations of distributed data entry systems
• Kenya as an ICT hub and talent base for East Africa generally

Weaknesses

• Limited financing for recurrent costs
• Limited personnel (manpower) at the county and sub-county education offices
• Limited use of existing data that are being captured by the EMIS form
• Unclear roles and responsibilities of actors
• Poor network coverage in some areas
• Weak boards of management and poor relationships with schools
• Weak management capacity of County and Sub-county Officers

Opportunities

• Medium Term Plan II and EMIS Roadmap
• Leveraging donor initiatives
• Multi-modal and channel platforms
• SAGAs' plans and investment strategies
• Board of management expectations and demands

Threats

• Tethering of FP capitation grants
• Administrative burden to Head Teachers
• Norms and beliefs about data accuracy
• Entrenched legacy systems and siloes
• Expectations of quality assurance and validation roles
• Perceived uselessness of the data
• Accountability threat to Head Teachers

6.1 Strengths and Opportunities

There are many positive developments within the education sector that the MoEST can leverage as both strengths and opportunities moving forward. The primary strength is the existing leadership and political will that can serve as a driving force behind the uptake of the system. Broadly, beyond the education sector itself, there is strong political will at the national level supporting innovation in the management of public affairs, including the provision of educational services, as demonstrated in the Manifesto of the Jubilee Government. SAGAs such as the Kenya National Examinations Council and the TSC have demonstrated the efficacy of an online, distributed data entry system. Tethering the FP capitation funds to the timeliness and accuracy of the EMIS enrollment data returns will greatly increase demand and reporting compliance. However, this policy will also continue to motivate schools and counties to misreport or inflate figures, thus reinforcing the need to ensure effective validation and auditing of the data.


Moreover, the MoEST can take advantage of many opportunities to leverage resources and create momentum for a digital EMIS. For instance, the MoEST could draw on the Medium-Term Plan II and the EMIS Roadmap, both of which call for significant budget and resources to be mobilized for the EMIS. The EMIS Roadmap, in particular, calls for dedicated EMIS Officers at the county and sub-county offices; these officers could be invaluable resources to schools and counties to ensure the system's uptake and use locally. In addition, there are a variety of existing donor initiatives that either directly or indirectly support distributed information systems. These initiatives include World Bank–funded activities, the Global Partnership for Education, and UNICEF–supported EMIS initiatives; the national tablet program implemented under USAID/Kenya's Tusome Program; the Bill & Melinda Gates Foundation–funded water and sanitation program implemented by CARE; and digital EMIS initiatives supported by Aga Khan.

Lastly, Kenya itself has a strong ICT backbone that serves as an Information Technology hub for East Africa. Not only does Kenya have a strong infrastructure to support an online distributed system, but it also has the talent base and private-sector capacity to support this, should the MoEST look externally for help with delivering this service. Kenya has also proven to be a leading innovator in this field with the advent of M-PESA (the Mobile Money Transfer Service in Kenya) and its hosting of leading ICT firms in Africa.

6.2 Weaknesses and Threats

Against the backdrop of strengths and opportunities, there are strong prevailing headwinds that could challenge the successful implementation of the EMIS. Foremost, the MoEST must confront the existing institutional barriers within the system. These barriers include very thin capacity at the county and sub-county offices in terms of personnel, management control, and resources to effectively validate the data and enforce consequences for non-reporting, reporting delays, and misreporting. Related to these barriers is the ongoing administrative burden that Head Teachers and Deputies experience in terms of their reporting requirements. If the MoEST deploys a school-based EMIS, then the Ministry must do everything it can to reduce the redundancy in administrative data reporting imposed on schools. Another corollary factor is the unclear or ambiguous roles and responsibilities of actors at each level of the system. Head Teachers and Deputies do not have clearly defined roles or expectations specific to their EMIS responsibilities. Likewise, the District Education Officers and the County and Sub-county Quality Assurance Standards Officers do not have clearly defined responsibilities and duties regarding the validation and double-checking of the data.

Another major threat to the system is the potential difficulty in getting officials from across the MoEST and SAGAs to rely on a shared, integrated system for their school information needs. How the MoEST and the SAGAs come together to form a collaborative solution will go a long way towards reducing administrative reporting burdens of the Head Teacher and enabling a truly integrated system. Lastly, many Head Teachers may experience a real or perceived threat from the increased accountability and transparency of their reported enrollment, teachers, and textbooks and other resources. How the MoEST and stakeholders manage this issue will greatly impact the uptake of the system by schools and counties. As the FP becomes linked to the enrollment figures that have been reported, a weak or ineffectual validation system will only lead to increased abuses and incidences of fraud, thereby undermining the utility and effectiveness of the school-based EMIS.


7. Ways Forward

This initiative was intended to apply operations research methods to help the MoEST understand the systemic requirements and capacities needed to implement a fully modernized, digital, and integrated EMIS at scale. As the MoEST considers its options moving forward, the observations and findings from this effort provide several actionable next steps, each of which is discussed in the remainder of this report.

7.1 Articulate the Vision and Outcomes of an Integrated Digital EMIS in the Broader Strategic Planning Framework

An important first step in making this system a reality is to articulate the vision and goals of a digital school-based EMIS within the framework of a formal policy and planning strategy. Formalizing the vision and goals within this framework will allow for appropriate stakeholder consultations and broad-base consensus for moving the system forward, with goals and outcomes that all stakeholders can support. The expected outcomes that should clearly be articulated in a formal strategic document include, but are not limited to, the following:

• Reduced administrative burden on Head Teachers and County and Sub-county Officers. Having a "one-stop shop" for their data reporting and information needs would lessen the time and manpower required to collect, administer, and manage the various data systems. It would lead to a more productive and efficient information management system that serves all key MoEST stakeholders and agencies, and it would free the Head Teacher to focus on the more important work of instructional leadership and school-community engagement.

• Increased capability to use and draw insight from integrated and connected data systems. Current data systems do not allow for richer insight into how various aspects of the education system influence each other and impact learning outcomes. Also, the MoEST has not tapped into its potential capacity to conduct the type of policy research and evaluation that would provide a deeper understanding of the impacts of various programs spanning school health and nutrition, teacher management, and early childhood development. Connected data systems would allow for this type of policy and program evaluation, which has heretofore been the sole domain of private and academic efforts.

• Reduced silos of decision-making and management systems across the MoEST and SAGAs and within the county and sub-county offices. Breaking down these siloes will enable the counties and sub-counties to assume the mantle for decentralized service delivery and education management envisioned in their mandate. Likewise, schools and communities will have the data readily available to make better management, personnel, and administrative decisions about their school programs, teacher workforce, and school priority plans and budgets.

• Increased transparency, accountability, and equity regarding how school inputs are allocated, managed, and used at each level of the system, and particularly in schools. The current lack of transparency speaks to the unspecified degree of leakage in the system caused by the under-reporting of enrollment, textbooks, teachers, and other key inputs that drive education financing. Improved data accuracy also addresses the under-funded schools and communities that are struggling because of the inequitable allocation of these resources.

7.2 Scaffold Ministry Capacity to Scale and Sustain a School Information System at Scale

The DERP Team’s recommendation is to set a strong foundation within the MoEST, the counties, and the sub-counties before a school-based digital EMIS is deployed. This effort should involve establishing a centralized executive ICT office reporting to the Cabinet Secretary, developing comprehensive Terms of Reference for multi-modal data collection platforms and data systems integration, and designing and implementing a behavior change and communications strategy to encourage system adoption and use.

7.2.1 Establish a Centralized Information Communications Technology Office as an Education Sector-Wide Service

A school-based digital information system has the potential to serve the entire sector’s information needs, beyond what is contained in the one-page census questionnaire. The current system may satisfy the basic planning needs of the Policy and Planning Office and the minimum reporting requirements of international stakeholders, but it does not otherwise serve the information needs of the entire MoEST, much less the entire sector. If the MoEST is to invest in a digital platform, it should do everything it can to maximize the benefits of that system. Organizationally, this effort will likely require a dedicated executive ICT office that has the mandate and authority to work across the MoEST’s departments (i.e., Basic, Vocational and Technical Training, and Higher Education) and with the SAGAs. The centralized ICT office will help to address perhaps the most difficult challenge: breaking down the institutional barriers and entrenched legacy systems that dominate the institutional landscape. In addition to building institutional–political bridges, the centralized ICT office would serve as the centralized information repository and data warehouse of the MoEST, overseeing the rollout of the digital school information system platform and implementing the change management strategies needed to ensure successful uptake of the system.

7.2.2 Roll Out a School Information System with Multi-modal Platforms

A digital school information system does not require a one-size-fits-all approach. The common denominator for any system is that the Head Teacher or a delegate submits the data electronically over a network. Electronic submission could occur by telephone, tablet, laptop, or desktop computer, or some combination, as long as an Internet connection is available. The data entry system therefore does not need to be a universal platform, provided the data captured are standardized and feed into a single centralized database. The distributed data entry system could also allow for centralized data entry where network coverage is lacking in rural or remote areas, since not all schools will have the coverage required. Schools do not need identical tablets, tablet applications, or desktop applications, as long as all of the systems feed into the same database.

In this scenario, some schools will have access to tablets, some may rely on laptop or desktop computers, and others may have to travel to a central location (e.g., a cybercafé, district education office) to enter their data. Some schools may even have to continue relying on paper forms, if that is the most efficient means to transmit the data due to their environmental
circumstances. In addition, expanding the electronic form to account for the vast majority of the MoEST’s and SAGAs’ information needs will require, at a minimum, a smartphone, tablet, or laptop computer; the data entry burden of populating the fields would be too onerous on a feature telephone or a basic hand-held device. The recommendation is to survey the school network and infrastructure, and then plan a multi-modal rollout that differentiates between schools’ varying levels of Internet access. The MoEST should take into account existing programs, such as the Digital Literacy Program, that are planning to equip or have already equipped schools with devices.
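One way to make the multi-modal principle concrete: whatever device captures the data, each modality serializes to the same standardized record before it reaches the central database. The sketch below is illustrative only; the field names and record shape are our assumptions, not a proposed MoEST schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SchoolReturn:
    """A standardized census return that every entry modality (tablet,
    telephone, desktop, or centralized paper entry) would produce.
    Field names here are illustrative, not a proposed MoEST schema."""
    school_id: str
    year: int
    enrollment: int
    textbooks: int
    teachers: int
    entry_mode: str  # e.g., "tablet", "telephone", "desktop", "paper"

# The same payload reaches the central database regardless of device.
ret = SchoolReturn("SCH-001", 2015, 412, 1650, 11, "tablet")
print(json.dumps(asdict(ret)))
```

Because every modality emits the same record, the central database needs only one ingestion path, and device-specific applications remain free to differ in their user interfaces.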

Likewise, the technical solution and architecture should be designed for systems integration across the various data entry platforms. Before rolling out the system to the schools, the MoEST should equip the centralized ICT office with the necessary means to manage such a system, including hiring staff and procuring the servers and network connectivity required (the systemic requirements of such a system were previously discussed in Section 5 of this report). Alternatively, the MoEST could outsource its server and data hosting requirements to highly secured cloud services, such as those provided by Amazon and other providers, for a monthly fee.

7.2.3 Develop and Implement a Change Management and Behavior Change Plan

The change management strategies are designed to align the behaviors of the system’s individual end users, who include Head Teachers in schools, the County and Sub-county Officers, and central MoEST officials. These strategies include a communications plan that explains not only what individuals are expected to do, but also why, and what positive impact their actions will have on the system. Other strategies may include using incentives (e.g., recognition, small rewards) for early adopters, incorporating reports and tools into routine planning, implementing monitoring and support practices, establishing feedback mechanisms to key education stakeholders, and mobilizing a Help Desk and First-Responder Team to troubleshoot problems. These change management strategies should be incorporated into the service provider Terms of Reference and into the mandate of the centralized ICT office.

7.3 Establish Expectations and Monitor Against Those Expectations in Relation to Data Reporting and Quality Assurance

There are three distinct but inter-related elements to this recommendation. The first element is that Head Teachers and Deputies should have clearly articulated expectations and assigned roles for school record keeping and reporting. The second element is that those expectations should be monitored through consistent, periodic, and routine double checks to validate the accuracy of the reports and the quality of the records. The last element is that the consequences for misreporting or dereliction of validation duties are understood by all and enforced by the MoEST.

7.3.1 Standardize School Record Keeping and Management

As this study shows, school record keeping is inconsistent at best. School records are the wellspring of the education sector’s data: if the data from these records are of poor quality, then the entire system is affected. Standardized administrative forms and templates should serve as the basis for all schools’ records, particularly with regard to enrollment, teacher management, textbooks, examinations, facilities, and health and nutrition indicators. The study also indicates that having good paper forms may lead to reduced errors in digital data entry. Strengthening the quality and consistency of school records management will reinforce data quality across the sector.

7.3.2 Clarify and Articulate Roles and Responsibilities for Data Reporting and Validation

This study also illuminated the differences in how key stakeholders and actors perceived their roles and responsibilities for data reporting. At a minimum, Head Teachers and Deputy Head Teachers should have a common understanding of their responsibilities for data management and reporting. Likewise, it would be helpful for the County and Sub-county Officers to have a defined mandate on data accuracy and quality. The data validation and double checks do not require significant resources: a minimal amount of funding is needed to visit a random selection of schools across the county to perform the necessary audit function. Through the use of the tablets, these site visits can be monitored and tracked to ensure that the County and Sub-county Officers are performing their validation tasks. The job descriptions and EMIS protocols could articulate the expectations for the County and Sub-county Officers to ensure data quality, as well as the consequences for not meeting those expectations.
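The audit function described above amounts to drawing a small random sample of schools per county each validation cycle. A minimal sketch, using hypothetical school identifiers and a sample size of 20 (the size used by the validation survey in Annex B); the seeding approach is our suggestion for making each cycle's draw reproducible and auditable:

```python
import random

def select_audit_sample(school_ids, n=20, seed=None):
    """Randomly select n schools for a validation visit. Passing a seed
    makes the draw reproducible, so the selection itself can be audited."""
    rng = random.Random(seed)
    return sorted(rng.sample(school_ids, n))

# Hypothetical county register of 120 schools
county_register = [f"SCH-{i:03d}" for i in range(1, 121)]
visits = select_audit_sample(county_register, n=20, seed=2016)
print(len(visits))  # 20 schools to visit this cycle
```

Publishing the seed after each cycle (rather than before) would keep the selection unpredictable to schools while still allowing independent verification of the draw.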

7.3.3 Enforce Consequences for Misreporting and/or Not Performing Validation Tasks

Willfully misreporting school data is tantamount to fraud, because funding is tied to enrollment and to the numbers of teachers and textbooks through the FP funding formula. The Basic Education Act provides a legal basis for sanctioning willful misreporting of data, but without a system for double checking and validating the data, the schools and the counties will not perceive there to be any negative consequences. The goals for improving data quality and use must be shared across schools and county and sub-county offices. Likewise, there should be consequences for willful misreporting and/or dereliction of validation duties. The MoEST must enact policies to ensure that the consequences can be enforced within the boundaries of Civil Service law and implemented by both the Ministry and the TSC.

7.3.4 Tether the Free Primary Capitation Grants to the Timeliness and Accuracy of the Data Reported

The MoEST’s policy formally links the reported enrollment data with the FP obligation. This demand driver will ensure that Head Teachers are sufficiently motivated to comply with the reporting requirements. However, without the previously mentioned data quality enforcement and validation conditions in place, leakage and inefficiency will continue unabated. Successful school-based information systems elsewhere have shown a high degree of compliance when capitation and school discretionary funds are tied directly to the reporting requirements. Moreover, the MoEST may capitalize on this extrinsic motivating factor by requiring Head Teachers to determine their own means for ensuring that data are entered in a timely and accurate manner. This possibility raises the prospect of a multi-modal system, in which schools could rely on a number of devices or applications, as was discussed in Section 7.2.2 of this report.

7.4 Commission Rigorous Cost-Effectiveness and Cost–Benefit Analyses

The vision of a distributed, school-based EMIS requires significant investment and organizational change, the likes of which will have transformative effects on the education system. Such an endeavor should not be taken lightly because it will likely face significant political and institutional opposition. Although this study attempted to identify the systemic and capacity challenges to implementing such a system at scale, further analytical work is recommended to demonstrate the return on investment. Such an analysis should compare the outcomes of the existing system with those of a school information system over a 10- to 20-year time horizon. The analysis should take into account the potential gains in data accuracy, while attempting to quantify the value of the other outcomes previously described, either in monetary terms or in other outcome indicators. Likewise, this analysis should take into consideration the full costs and potential savings associated with the re-organization and the functions that the MoEST, the counties, and the sub-counties would assume as part of their data management and validation tasks. Although the MoEST is ultimately challenged with all aspects of obtaining high-quality and accurate data, the greatest threat to a modern system lies in the institutional and political barriers that the MoEST must overcome. A rigorous policy analysis that weighs the costs and benefits against the status quo could serve as another powerful policy advocacy tool.


Annex A. Agenda for the Policy Dialogue Workshop

Purpose. The purpose of the Policy Dialogue Workshop will be to share collective experiences of Education Management Information Systems (EMIS) and build consensus on ways forward for a future school management information system for all of Kenya.

Duration. The Policy Dialogue Workshop will last two days; the dates are yet to be determined.

Location. The Policy Dialogue Workshop will be held in Naivasha at a venue yet to be determined.

Participants. The possible participants in the Policy Dialogue Workshop are presented in Table A-1.

Table A-1. Participants in the Policy Dialogue Workshop

1. Ministry of Education, Science and Technology/Policy and Planning: 3 representatives (Polycarp Otieno, Samuel Nthenge, and Darius Mukaga)

2. Teachers Service Commission—Kenya (TSC): 1 representative (to be determined)

3. Kenya National Examinations Council (KNEC): 1 representative (to be determined)

4. United Nations Children’s Fund (UNICEF): 2 representatives (Daniel Baheta and Mamy Rakotomalala)

5. World Bank and Global Partnership for Education: 1 representative (to be determined)

6. Digital Divide Data: 1 representative (Kenneth Mutuma)

7. IBM Corp. Research Laboratory, Kenya: 2 representatives (Charity Wayua and Kommy Weldemerian)

8. Water, Sanitation, and Hygiene (WASH); CARE; and Georgetown University: 1 representative (Alex Wendo)

9. Aga Khan: 1 representative (to be determined)

10. Education Standard and Quality Assurance Council: 1 representative (to be determined)

11. Mombasa and Isiolo County Education Offices: 2 representatives (County Directors of Education)

12. USAID/Kenya Tusome: 2 representatives (Ben Piper and Onesmus Kiminza)

13. Data for Education Research and Planning (DERP) in Africa: 3 representatives (Andrew Riechi, Mitchell Rakusin, and Luis Crouch)

14. U.S. Agency for International Development (USAID): 1 representative (Robert “Wick” Powers)

Total: 22 individuals

Summary Agenda

Day 1. Review the EMIS pilot findings and share EMIS experiences and the EMIS Roadmap

Day 2. Discuss findings, implications and ways forward in relation to the digital school EMIS pilot

Expanded Agenda

The expanded agenda for the Policy Dialogue Workshop is presented as Table A-2.

Table A-2. Expanded Agenda for the Policy Dialogue Workshop

Time Topic Facilitator/Presenter

Day 1. Review the EMIS pilot findings and share EMIS experiences and the EMIS Roadmap

8:00–9:00 a.m. Arrival and registration

Introductions

Opening remarks Darius Mugaka

Official opening Cabinet Secretary or Permanent Secretary or delegate

9:00–11:00 a.m. Briefing of existing information systems and other related initiatives (15 minutes each) as follows:

• MoEST/EMIS national census and report cards (presenter: MoEST/EMIS)

• TSC electronic Teacher Management Information System (presenter: TSC)

• KNEC registration portal (presenter: KNEC)

• Aga Khan EMIS assistance (presenter: Aga Khan)

• USAID/Kenya Tusome Tangerine® Tutor (presenter: Tusome)

• World Bank and Global Partnership for Education initiative (presenter: World Bank)


11:00–11:30 a.m. Tea break

1:00–2:00 p.m. Lunch break

2:00–4:00 p.m. Overview of the DERP pilot EMIS activity (presenter: Andrew Riechi)

Presentation of pilot EMIS applications and process (presenters: IBM Corp. Research Laboratory, Kenya, and Digital Divide Data)

Presentation and discussion of findings (presenter: Rakusin)

4:00–4:30 p.m. Tea break

4:30–5:30 p.m. Set the stage for Day 2 of the workshop as follows:

• Presentation of EMIS Roadmap (presenter: MoEST/EMIS)

• Day 1 closing remarks (presenter: to be determined, RTI International or USAID)

Day 2. Discuss findings and implications from the DERP digital school EMIS Study

8:00–8:30 a.m. Arrival

8:30–9:00 a.m. Summarize the discussions from Day 1 To be determined

9:00–11:00 a.m. Implications from the DERP Study

• Presentation and discussion of the strengths, weaknesses, opportunities, and threats analysis (presenter: to be determined; plenary)

• Presentation and discussion of the ways forward (presenter: MoEST/to be determined; plenary)

• Presentation and discussion of the design implications (presenter: Rakusin; plenary)

11:00–11:30 a.m. Tea break

11:30 a.m.–1:00 p.m. Design implications of a future Kenyan digital school EMIS

• Break-out groups will recommend core digital EMIS features regarding data integration, platform modalities, management and administration, validation protocols, and behavior change strategies

Break-out groups

1:00–2:00 p.m. Lunch break

2:00–3:00 p.m. Presentations and report-outs


3:00–4:00 p.m. Discussion and consensus building to recommend features of a future school digital information system

Plenary

4:00–4:30 p.m. • Wrap up the discussion and describe the next steps

• Closing remarks

To be determined


Annex B. Detailed Validation Data Findings

Validation Results

The validation survey randomly sampled 20 schools in each county to determine the extent of reporting errors. District Education Officers; County and Sub-county Quality Assurance Standards Officers; members of the Data for Education Research and Planning (DERP) in Africa Team; and staff hired to support this project visited the schools in January and February 2016 to double check the school records against select fields in the form, including enrollment, teachers, classrooms, toilets, school revenue, and textbooks. The following subsections discuss the validated enrollment, textbook, and revenue data. The DERP Validation Teams relied on prior-year school records and followed the same protocol used by the Kenya Ministry of Education, Science, and Technology (MoEST)/Education Management Information System (EMIS) during the 2014 and 2015 validation exercises.

EMIS Returns and Validation Results for Enrollment and Textbooks

Enrollment and textbook data form the basis of much of the school financing through the Free Primary (FP) school capitation grant. Tables B-1 through B-3 of this annex detail a set of indicators that provide a full picture of the degree of error for enrollment, textbooks, and revenues reported via paper and electronic forms in November and December 2015. The error rates are based on comparing the EMIS form submissions with the external validation conducted in January and February 2016 by County and Sub-county Officers and DERP Team field staff.

The first indicator at the top of each table presents the number of schools whose EMIS data were inaccurate by more than ±10% relative to the validated data. For enrollment, four schools in Mombasa County over- or under-reported by more than ±10% in their electronic submissions, whereas five schools had enrollment reporting errors greater than ±10% in their paper submissions. In Isiolo County, three schools reported errors greater than ±10% in their telephone submissions, and only two schools reported enrollment figures that deviated by more than ±10% from the validated figures in their paper submissions. Based on the lot quality assurance sampling (LQAS) methodology, the study finds that Mombasa County as a whole generally under-performed, or did not meet expectations, in terms of its schools accurately reporting their enrollment data.
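The LQAS judgment can be expressed as a simple decision rule: count the schools in the sampled lot (county) whose reporting error exceeds the ±10% tolerance and compare that count against a pre-set decision threshold. The threshold below is illustrative only; the report does not state the exact decision value used by the study.

```python
def lqas_lot_passes(discrepancies, tolerance=0.10, max_failures=3):
    """Return True if the sampled lot (county) meets expectations.

    discrepancies: per-school reporting errors as fractions of the
    validated figure (e.g., -0.12 for 12% under-reporting).
    max_failures: illustrative decision threshold; an assumption,
    not a value taken from the study.
    """
    failures = sum(1 for d in discrepancies if abs(d) > tolerance)
    return failures <= max_failures

# Example: 4 of 20 sampled schools exceed +/-10% error (as Mombasa's
# tablet returns did); the per-school values here are hypothetical.
sample = [0.02, -0.15, 0.11, -0.86, 0.01] + [0.0] * 14 + [0.12]
print(lqas_lot_passes(sample))  # -> False: the lot did not meet expectations
```

Under LQAS, the sample size and decision threshold are chosen in advance to bound the risk of wrongly passing a poorly performing lot, which is what allows a judgment about a whole county from only 20 schools.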

The discrepancy range indicators show the degree of variability of the error between schools in the same county. Some schools under-reported their enrollment, textbook, and revenue data, whereas others over-reported these data. The minimum of the discrepancy range shows the largest under-reporting error among all the schools. In Mombasa County, the largest instance of under-reported enrollment was 85.9% by tablet and 66.3% by paper. In contrast, the largest over-reporting error for enrollment in Mombasa County was 10.9% by tablet and 13.7% by paper. The standard deviation of the discrepancies across all schools shows that for enrollment data, tablets had a wider variation of error than paper submissions.

Lastly, the gross, absolute, and net error figures show the total over-reported error and the total under-reported error in raw terms. In Mombasa County, total over-reported enrollment was 311 students by tablet and 488 students by paper. The total under-reported
enrollment in Mombasa County was −692 students by tablet and −719 students by paper. These findings show that the absolute error (total combined under-reported plus over-reported enrollment) amounts to 1,003 students (6% of validated enrollment total) by tablet and 1,207 students (8% of validated enrollment total) by paper. In Mombasa and Isiolo Counties, the differences between the two forms of submission are not significant enough to conclude that one is superior to the other in terms of reporting enrollment data.
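Concretely, the gross, absolute, and net error indicators used throughout this annex can be computed from per-school reported-versus-validated pairs. The sketch below uses hypothetical per-school figures, not the study's raw data:

```python
def error_indicators(pairs):
    """Compute gross over-/under-reporting and the absolute and net error
    from (reported, validated) enrollment pairs, one pair per school."""
    deltas = [reported - validated for reported, validated in pairs]
    over = sum(d for d in deltas if d > 0)    # gross over-reporting
    under = sum(d for d in deltas if d < 0)   # gross under-reporting (negative)
    return {
        "over": over,
        "under": under,
        "absolute": over - under,  # combined magnitude; errors do not cancel
        "net": over + under,       # signed total; errors partially cancel
    }

# Hypothetical pairs for three schools
print(error_indicators([(410, 400), (380, 400), (395, 400)]))
```

The distinction matters for interpreting the tables: a small net error can mask large offsetting over- and under-reporting, which is why the absolute error is reported alongside it.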

Table B-1. Enrollment Validation Results for Electronic and Paper Submissions for Mombasa and Isiolo Counties

Mombasa County Tablet Paper

Enrollment discrepancy > ±10% (number of schools out of 20) 4 out of 20 5 out of 20

Enrollment discrepancy range (minimum percentage of validated total) −85.9% −66.3%

Enrollment discrepancy range (maximum percentage of validated total) 10.9% 13.7%

Enrollment discrepancy range (standard deviation) 20.5% 16.6%

Gross enrollment over-reporting (percentage of validated total) 311 (2%) 488 (3%)

Gross enrollment under-reporting (percentage of validated total) −692 (−4%) −719 (−5%)

Absolute enrollment error (percentage of validated total) 1,003 (6%) 1,207 (8%)

Net enrollment error (sum of over- and under-reported) −381 (−2.5%) −231 (−1%)

Isiolo County Telephone Paper

Enrollment discrepancy > ±10% (number of schools out of 20) 3 out of 20 2 out of 20

Enrollment discrepancy range (minimum percentage of validated total) −32.91% −32.91%

Enrollment discrepancy range (maximum percentage of validated total) 38.55% 38.55%

Enrollment discrepancy standard deviation 13.6% 12.2%

Gross enrollment over-reporting (percentage of validated total) 209 (2%) 172 (2%)

Gross enrollment under-reporting (percentage of validated total) −578 (−6%) −276 (−3%)

Net enrollment error (sum of over- and under-reported) −369 −104

Net enrollment error (percentage of validated total) −3.52% −1%


As shown in Table B-2, the discrepancies in the reported textbook figures were far more pronounced, though almost entirely on the paper side. In both Mombasa and Isiolo Counties, schools electronically reported their textbook data with a very high degree of accuracy: none of the schools reporting by tablet in Mombasa County had greater than ±10% error, and only two of the schools reporting by telephone in Isiolo County did. However, the vast majority of these same schools reporting by paper had greater than ±10% error: 18 out of 20 schools in Mombasa County and 19 out of 20 schools in Isiolo County.

The textbook data errors recorded by paper were due to significant over-reporting of available textbooks. The 20 schools that were double checked in Mombasa County recorded more than 30,000 more textbooks than were actually validated; the 20 schools double checked in Isiolo County recorded almost 21,000 more textbooks than were actually validated. The variance in error between paper and electronic data could be caused by misreporting by schools (whether unintentional or intentional), data entry errors made by data entry clerks when entering the paper forms into the national database, or data query errors in retrieving the data from the database by the MoEST analysts. The DERP Team has requested confirmation from the MoEST regarding the accuracy of the data in its database.

Table B-2. Textbook Validation Results for Electronic and Paper Submissions for Mombasa and Isiolo Counties

Mombasa County Tablet Paper

Textbook discrepancy > ±10% (number of schools out of 20) 0 out of 20 18 out of 20

Textbook discrepancy range (minimum percentage of validated total) −7.52% 500%

Textbook discrepancy range (maximum percentage of validated total) 2.10% 2,900%

Textbook discrepancy standard deviation 1.80% 167.3%

Gross textbook over-reporting (percentage of validated total) 67 (0.2%) 30,136 (76.7%)

Gross textbook under-reporting (percentage of validated total) −116 (−0.3%) 0 (0.0%)

Net textbook error (sum of over- and under-reported) −49 (0%) 30,136 (77%)

Isiolo County Telephone Paper

Textbook discrepancy > ±10% (number of schools out of 20) 2 out of 20 19 out of 20

Textbook discrepancy range (minimum percentage of validated total) −4.3% 22.5%

Textbook discrepancy range (maximum percentage of validated total) 158% 401%

Textbook discrepancy standard deviation 39.4% 99.2%

Gross textbook over-reporting (percentage of validated total) 740 (3.4%) 20,900 (96.9%)

Gross textbook under-reporting (percentage of validated total) −85 (−0.4%) 0 (0%)


Net textbook error (sum of over- and under-reported) 655 20,900

Net textbook error (percentage of validated total) 3.04% 96.93%

The information in Tables B-1 and B-2 is important because reduced errors in over-reporting enrollment and textbooks have a direct impact on the efficiency of the FP capitation grant to schools. Moreover, reduced under-reporting of enrollment and textbooks will lead to a more equitable allocation of resources for those schools that are not receiving the funds and textbooks needed to meet their needs. Table B-3 outlines the potential impact in terms of the efficiency and equity gains realized from reduced over- and under-reporting, respectively. These numbers are derived by multiplying each student error by 1,356 Kenyan shillings (KES), the base multiplier for the capitation grant, and each textbook error by KES 200, the standard per capita textbook allocation earmarked within the school’s overall capitation grant.

Table B-3. Efficiency and Equity Gains from Reduced Errors in Enrollment and Textbook Reporting (Note: Currency Is Presented in KES)

Efficiency Gains Mombasa County

Isiolo County

FP efficiency gains from reduced over-reporting (KES) 240,012 −50,172

FP equity from reduced under-reporting (KES) 36,612 −409,512

Textbook efficiency gains from reduced over-reporting (KES) 6,013,800 4,032,000

Textbook equity gains from reduced under-reporting (KES) −23,200 −17,000

Total Gains Mombasa County

Isiolo County

Enrollment gross (KES) 276,624 −459,684

Textbook gross (KES) 5,990,600 4,015,000

Enrollment gains as a proportion to total validated FP 1.3% −3.2%

Textbook gains as a proportion to total validated textbooks 76.1% 93.1%

Total resource (enrollment plus textbooks) gains 21.7% 19.2%

The efficiency and equity gains were calculated based on the differences in total under- and over-reporting between the electronic and paper forms, and then multiplied by the cost per student (KES 1,356) and per textbook (KES 200). These figures are based on the MoEST’s policy for allocating FP capitation funds. Based on these figures, the MoEST might realize enormous gains because of the reduction of over-reporting of textbooks, which could improve efficiency regarding how 76 KES out of 100 KES are spent on textbooks in Mombasa County and how 93 KES out of 100 KES are spent on textbooks in Isiolo County. The combined gains
for the two counties total KES 9,822,540, or 20.3% of the total validated FP and textbook allocation for the schools.
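The arithmetic behind Table B-3 can be reproduced directly from the validation totals in Tables B-1 and B-2, using the per-student (KES 1,356) and per-textbook (KES 200) multipliers stated above. The sketch below is illustrative; the function and variable names are our own.

```python
# Reproduce the Table B-3 gains from the Table B-1/B-2 validation totals.
# A gain is the reduction in error magnitude from moving paper reporting
# to electronic reporting, valued at the relevant unit cost.

KES_PER_STUDENT = 1356   # FP capitation base multiplier
KES_PER_TEXTBOOK = 200   # per-textbook earmark within the capitation grant

def gain(paper_error, electronic_error, unit_cost):
    """KES gained (positive) or lost (negative) by switching from paper
    to electronic reporting for one error category."""
    return (abs(paper_error) - abs(electronic_error)) * unit_cost

# Validation totals (students / textbooks) from Tables B-1 and B-2
mombasa = (
    gain(488, 311, KES_PER_STUDENT)       # enrollment over-reporting: 240,012
    + gain(-719, -692, KES_PER_STUDENT)   # enrollment under-reporting: 36,612
    + gain(30136, 67, KES_PER_TEXTBOOK)   # textbook over-reporting: 6,013,800
    + gain(0, -116, KES_PER_TEXTBOOK)     # textbook under-reporting: -23,200
)
isiolo = (
    gain(172, 209, KES_PER_STUDENT)       # -50,172
    + gain(-276, -578, KES_PER_STUDENT)   # -409,512
    + gain(20900, 740, KES_PER_TEXTBOOK)  # 4,032,000
    + gain(0, -85, KES_PER_TEXTBOOK)      # -17,000
)

print(mombasa + isiolo)  # combined gains for the two counties: 9,822,540
```

Running this reproduces the KES 9,822,540 combined total reported above, confirming that the Table B-3 figures follow mechanically from the validation totals and the two unit costs.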

EMIS Returns and Validation Results for School Revenues

Table B-4 details the discrepancies, or variances, observed between the validated resources on record and the total figures reported by schools. In Mombasa County, schools under-reported their income by more than KES 20 million by tablet and KES 25 million by paper, with a far greater number of schools exceeding a ±10% error in reporting accuracy. Also, in Mombasa County, the differences between what was recorded by tablet and by paper appear to be significant. In Isiolo County, by contrast, there is little difference in under-reported income between telephone and paper (approximately KES 9 million for each), and far fewer schools exceeded a ±10% error, indicating that overall, Isiolo County is faring much better in terms of revenue data accuracy. Nevertheless, the total revenue error reported in Isiolo County (31% by telephone and 31% by paper) is substantial, though significantly less than the error reported in Mombasa County (38% by tablet and 50% by paper). Note: Sections 4.2.2 and 4.2.3 in the main body of the report discuss the cost-effectiveness implications of these findings in greater detail.

Table B-4. School Revenue Validation Results for Electronic and Paper Submissions for Mombasa and Isiolo Counties

Mombasa County Tablet Paper

Revenue discrepancy > ±10% (number of schools out of 20) 8 out of 20 7 out of 20

Revenue discrepancy range (minimum percentage of validated total) −90.00% −100.00%

Revenue discrepancy range (maximum percentage of validated total) 32.17% 4.22%

Revenue discrepancy standard deviation 36.9% 286.3%

Gross revenue over-reporting (percentage of validated total) 986,862 (2%) 25,762 (0%)

Gross revenue under-reporting (percentage of validated total) −20,772,007 (−40%) −25,836,144 (−50%)

Net revenue error (sum of over- and under-reported) −19,785,145 −25,810,382

Net revenue error (percentage of validated total) −38% −50%

Isiolo County Telephone Paper

Revenue discrepancy > ±10% (number of schools out of 20) 2 out of 20 1 out of 20

Revenue discrepancy range (minimum percentage of validated total) −79.87% −79.87%

Revenue discrepancy range (maximum percentage of validated total) 0% 0%

Revenue discrepancy standard deviation 20.0% 20.0%

Gross revenue over-reporting (percentage of validated total) 0 (0%) 2 (0%)


Gross revenue under-reporting (percentage of validated total) −9,055,600 (−32%) −8,953,771 (−31%)

Net revenue error (sum of over- and under-reported) −9,055,600 −8,953,769

Net revenue error (percentage of validated total) −31% −31%
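The summary statistics in Table B-4 can be derived mechanically from paired reported-versus-validated revenue figures per school. The sketch below illustrates the arithmetic with made-up figures for three hypothetical schools; the function name and the input data are assumptions for illustration, not the study's actual data or code.

```python
# Illustrative derivation of Table B-4 style discrepancy metrics from
# paired reported vs. validated per-school revenue totals (KES).
# All figures below are invented for illustration only.

def discrepancy_metrics(reported, validated, threshold=0.10):
    """Summarize revenue reporting errors across a set of schools."""
    diffs = [r - v for r, v in zip(reported, validated)]
    # Signed error of each school, as a fraction of its validated total
    pct = [d / v for d, v in zip(diffs, validated)]
    over = sum(d for d in diffs if d > 0)    # gross over-reporting
    under = sum(d for d in diffs if d < 0)   # gross under-reporting (negative)
    total_validated = sum(validated)
    return {
        "schools_beyond_threshold": sum(1 for p in pct if abs(p) > threshold),
        "gross_over_reporting": over,
        "gross_under_reporting": under,
        "net_error": over + under,           # sum of over- and under-reported
        "net_error_pct": (over + under) / total_validated,
    }

# Three hypothetical schools: one accurate, one under-, one over-reporting.
m = discrepancy_metrics(
    reported=[100_000, 50_000, 130_000],
    validated=[100_000, 90_000, 120_000],
)
print(m["schools_beyond_threshold"])  # 1 (only the under-reporter exceeds ±10%)
print(m["net_error"])                 # -30000
```

The net error can be smaller in magnitude than the gross under-reporting because over- and under-reporting partially cancel, which is why Table B-4 reports both.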


Annex C. Design Features and Functionality of the Digital Education Management Information System Applications

Software Design Features and Functionality of the IBM Corp. Research Laboratory, Kenya’s School Census Hub

The School Census Hub, a suite of tablet-enabled applications, was used to facilitate the collection, validation, and visualization of data in schools. The School Census Hub allows for the capture and reporting of static and dynamic information in schools for improved accountability and transparency.

Data Collection Tool

The four key modules of the School Census Hub data collection functionality are as follows:

• Data Collection Module—Supports data collection by Head Teachers, as well as the approval or rejection of the submitted data by District Education Officers as a data validation check

• Data Verification Module—Supports Education Management Information System (EMIS) data verification efforts, whereby District Quality Assurance Standards Officers visit schools to verify whether the data submitted by the Head Teachers accurately represent the actual situation in the schools

• Pictures Module—Captures image data by taking pictures of school facilities such as classrooms, washrooms, and other relevant school areas

• Reports Module—Provides school and county report cards and dashboards that visualize the data and indicators.

Data Collection Process

The Head Teachers, Deputy Head Teachers, and County and Sub-county Officers were the main users of the application. The Head Teachers and Deputy Head Teachers entered the EMIS data into the application (Figure C-1) and then submitted the information. Each school had one user account that was shared by the Head Teacher and Deputy Head Teacher.


Figure C-1. A screen capture of EMIS data in the School Census Hub application.

The application provided basic validation to the Head Teachers and Deputy Head Teachers as they entered the data. For example, if the school’s gender was set to boys but enrollment data for girls were entered, a validation message would pop up in the application (Figure C-2) asking the Head Teacher or Deputy Head Teacher to correct the information.
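The cross-field check described above can be sketched as a small validation routine. This is a hedged illustration of the kind of rule involved; the field names (`school_gender`, `girls_enrolled`) and function are hypothetical and are not taken from the School Census Hub source.

```python
# Hypothetical sketch of an entry-time cross-field validation rule,
# illustrating the boys-only vs. girls-enrollment check described above.
# Field names and messages are assumptions, not the actual application code.

def validate_enrollment(record):
    """Return a list of validation messages for one enrollment record."""
    messages = []
    if record.get("school_gender") == "boys" and record.get("girls_enrolled", 0) > 0:
        messages.append(
            "School gender is set to boys, but enrollment data for girls "
            "were entered; please correct the gender setting or the counts."
        )
    if record.get("boys_enrolled", 0) < 0 or record.get("girls_enrolled", 0) < 0:
        messages.append("Enrollment counts cannot be negative.")
    return messages

# A record that trips the cross-field rule:
print(validate_enrollment({"school_gender": "boys", "girls_enrolled": 12}))
```

In the tablet application these messages were surfaced as pop-ups at data-entry time, so errors were caught before submission rather than during downstream cleaning.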


Figure C-2. A screen capture of the validation message that pops up in the application when an error occurs.

The County Director of Education, the Deputy County Director of Education, four District Education Officers, and four District Quality Assurance Standards Officers reviewed the submitted data and either approved or returned the data (Figure C-3) if errors or inconsistencies were discovered. Each of these county officials had a user account to access the approval process for the schools under their respective jurisdictions.

Once data from a school are either approved or returned by a county official, the school’s application receives a notification. When returning submitted data to a school, county officials can enter a reason for the action, which informs the Head Teacher and/or Deputy Head Teacher about which section needs to be corrected before the data are resubmitted.
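The submit/approve/return cycle described above amounts to a small state machine. The sketch below is an assumption-laden illustration of that workflow; the state names, methods, and notification format are invented for clarity and do not reflect the actual application internals.

```python
# Hedged sketch of the submission workflow described in the text:
# a school submits data, a county official approves or returns it with a
# reason, and the school is notified. All names here are illustrative.

class SchoolSubmission:
    def __init__(self):
        self.state = "draft"
        self.return_reason = None
        self.notifications = []   # messages delivered to the school's app

    def submit(self):
        assert self.state in ("draft", "returned"), "can only submit new or returned data"
        self.state = "submitted"

    def approve(self):
        assert self.state == "submitted"
        self.state = "approved"
        self.notifications.append("approved")

    def return_to_school(self, reason):
        # The reason tells the Head Teacher which section to correct.
        assert self.state == "submitted"
        self.state = "returned"
        self.return_reason = reason
        self.notifications.append(f"returned: {reason}")

s = SchoolSubmission()
s.submit()
s.return_to_school("Section B teacher counts inconsistent")
s.submit()    # resubmission after correction
s.approve()
print(s.notifications)
```

The key design point is that a returned submission re-enters the same cycle, so the county-level review acts as an iterative quality gate rather than a one-shot filter.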


Figure C-3. After reviewing the data, county officials can approve or return the submitted data by clicking on the respective green button at the bottom of the list.

Report Cards

Report cards are a decision-support tool that provides visual representations of indicators to help users make meaningful use of their EMIS data. The report card analytics present detailed visualizations aimed at giving users a deeper understanding across the different indicators (i.e., students, performance, teachers, resources).

Figure C-4 shows the six main views, which are adapted for each school based on its aggregated indicators. Each view has detailed sub-sections that represent the different data


collected using the School Census Hub. Users can navigate between these views interactively. Each view is described as follows:

• Section A is used to view detailed analytics about a primary section, which includes enrollment, new students, and repeaters.

• Section B is employed to view detailed analytics about the number of teachers based on specific qualifications.

• Section C provides detailed analytics about examinations.

• Section D provides detailed analytics about textbooks.

• Section E is used to view detailed analytics about classrooms.

• Section F provides detailed analytics about other metrics that affect the decision-making process.

Figure C-4. The six main views of the School Report Card screen.

Decision Support

The decision-support module allows users to compare student numbers against the most important resources that directly affect performance and learning by interacting with the embedded decision-support application. These ratios can be used as a standardized measure of a school’s performance in these specific areas, regardless of school size. For example, District Education Officers or the Ministry of Education, Science, and Technology (MoEST) can use this module to identify the areas in which a school needs the most assistance. A decrease in one of these ratios over subsequent years implies improvement of the school in the area measured by the ratio. Clicking the Ratios button (Figure C-5) launches the decision-support feature.
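The ratios in question are simple students-per-resource quotients, which is what makes them comparable across schools of different sizes. A minimal sketch, assuming illustrative indicator names (the actual module may compute additional or differently named ratios):

```python
# Illustrative computation of the student-to-resource ratios the
# decision-support module displays. Indicator names and the example
# figures are assumptions, not the module's actual definitions.

def school_ratios(students, teachers, classrooms, toilets):
    """Ratios normalize by school size; a decrease over subsequent
    years indicates improvement in the area measured by the ratio."""
    return {
        "students_per_teacher": students / teachers,
        "students_per_classroom": students / classrooms,
        "students_per_toilet": students / toilets,
    }

# A hypothetical school of 480 students:
r = school_ratios(students=480, teachers=12, classrooms=10, toilets=8)
print(r["students_per_teacher"])   # 40.0
print(r["students_per_toilet"])    # 60.0
```

Because each ratio is a per-student measure of resource pressure, MoEST can rank schools on any single ratio to target assistance without adjusting for enrollment size.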


Figure C-5. Decision support on the Reports screen for a specific school.

Features, Functionality, and Screen Captures of the Mobenzi Mobile Telephone Application

Mobenzi Gateway

The EMIS survey is programmed in the Mobenzi gateway on computers, and a link is sent to the mobile telephones for download via short message service (SMS). The link can also be entered manually on the devices. Installing the application on mobile telephones requires an Internet connection; therefore, the telephones can be loaded with data bundles, or wireless networking can be used if available. Downloading the application onto a telephone requires as little as 5 megabytes (MB) of data, but the data needed for updating and uploading surveys will depend on the length of the questionnaire and the number of surveys submitted.

Figures C-6 and C-7 show the Mobenzi gateway interface on the mobile telephones.


Figure C-6. The Mobenzi gateway.

When the user clicks on the Mobenzi gateway, the application takes the user to the Configuration page, which checks for Internet availability, downloads any new changes made to the survey, and then allows the user to proceed to the Home page. If there is no Internet connection, an error message pops up on the Configuration page to inform the user that the system has not downloaded any new changes; the user will then need to check his or her connection or network coverage. The Home page contains the EMIS Questionnaire, Report Card, and Help icons (Figure C-7), which are each described as follows:

• Questionnaire: By clicking on this icon, users can open the different sections of the questionnaire.

• Report Card: By clicking on this icon, users can view visually informative graphs generated from the EMIS data.

• Help: By clicking on this icon, users can view solutions to some of the system problems that they may encounter while using the application or device.


Figure C-7. The Mobenzi Gateway dashboard that displays Questionnaire, Report Card, and Help icons.

Completing the Survey

The user opens the survey by clicking on the Questionnaire icon. The survey is divided into six sections that mirror the sections of the paper survey; these sections can be completed in any order.

As a Head Teacher completes each field, his or her response is saved locally on the device, and the section status is displayed as “in progress” if the Head Teacher closes the questionnaire. At the end of each section, the Head Teacher clicks on the Save & Continue button to save his or her responses (Figure C-8). At this point, the responses in that section are validated; they can only be saved after all fields have passed validation. The section status will then display as “completed.” Once the section is validated and saved, it begins to synchronize with the backend.
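The per-section flow described above (validate on save, mark the section completed, then queue it for backend synchronization) can be sketched as follows. The function and status names are illustrative assumptions, not the actual Mobenzi implementation.

```python
# Hedged sketch of the Save & Continue flow described in the text:
# a section is only marked "completed" and queued for backend sync
# once every required field passes validation. Names are illustrative.

def save_section(section, answers, required_fields, sync_queue):
    """Validate one section; queue it for synchronization only if complete."""
    missing = [f for f in required_fields if answers.get(f) in (None, "")]
    if missing:
        # Partial responses stay saved locally; the section remains open.
        return {"status": "in progress", "missing": missing}
    sync_queue.append((section, answers))   # begins backend synchronization
    return {"status": "completed", "missing": []}

queue = []
# First attempt: a required field is still blank, so the section stays open.
first = save_section("enrollment", {"boys": 120, "girls": None},
                     ["boys", "girls"], queue)
# Second attempt: all fields filled, so the section completes and syncs.
second = save_section("enrollment", {"boys": 120, "girls": 110},
                      ["boys", "girls"], queue)
print(first["status"], second["status"])
```

Queuing completed sections locally and synchronizing opportunistically is what lets the survey tolerate the intermittent connectivity described for the pilot counties.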


Figure C-8. The Save & Continue button is used on the EMIS Questionnaire to save a Head Teacher’s responses.

Once all sections have been completed, the Head Teacher is prompted to sign off and submit the questionnaire (Figure C-9). Even after submission, the Head Teacher can still modify previously captured responses and re-submit the survey.

Figure C-9. The Sign-off & Submit button is used on the EMIS Questionnaire to submit the data.

The questionnaire operates in three modes: Training, Practice, and Live. Devices initially operate in Training mode. Operating in “Training” mode means that the data entered will be removed once the survey is switched to “Practice” mode, and the same will occur when switching to


“Live” mode, which allows for actual data entry. The division into the different modes ensures that the correct data are used for the final analysis.

In each mode, a particular message pops up for the Head Teacher. These messages are as follows:

• Training mode: “The questionnaire is in Training mode. Any data the user enters now will be discarded after training is complete.”

• Practice mode: “The questionnaire is in Practice mode. Data that the user captures and submits now will only be used by MoEST once the survey is switched to live.”

• Live mode: “The questionnaire is now Live. The data that the user captures and submits will be used by MoEST for official reporting.”
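The mode progression above can be sketched as a one-way sequence in which promoting the device to the next mode discards whatever was entered in the previous one, so only Live-mode submissions reach the final analysis. The class and method names below are illustrative assumptions, not the actual application code.

```python
# Hedged sketch of the Training -> Practice -> Live mode progression
# described above. Discarding prior-mode data on promotion follows the
# text; all names here are invented for illustration.

MODES = ["Training", "Practice", "Live"]

class Questionnaire:
    def __init__(self):
        self.mode = "Training"    # devices initially operate in Training mode
        self.responses = []

    def record(self, response):
        self.responses.append(response)

    def switch_mode(self, new_mode):
        # Only forward, one-step transitions are allowed; switching
        # discards data entered in the previous mode so that only
        # Live-mode data feeds the final analysis.
        assert MODES.index(new_mode) == MODES.index(self.mode) + 1
        self.responses.clear()
        self.mode = new_mode

q = Questionnaire()
q.record("training entry")
q.switch_mode("Practice")
q.record("practice entry")
q.switch_mode("Live")
q.record("live entry")
print(q.mode, q.responses)   # Live ['live entry']
```

Modeling the modes as an explicit ordered sequence makes it impossible for training or practice entries to leak into the official dataset, which is the property the pop-up messages communicate to Head Teachers.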

Report Cards

Figures C-10 and C-11 show how to access the report cards, as well as sample graphs from one of the schools in Isiolo County.

Figure C-10. The Mobenzi Gateway dashboard.

Figure C-11. The Report Card dashboard.

When the user clicks on the Report Card icon, the report cards open. The report cards are organized into sections, as depicted by the Report Card dashboard in Figure C-11. To view the report cards, Head Teachers must submit the EMIS data, or the data must be pulled from a different database and uploaded into the Mobenzi system. The Mobenzi Support Team preloaded the 2014 EMIS data to


generate reports, so that schools that were unable to submit data in 2015 could still view report cards.

Figures C-12, C-13, and C-14 show data on the Report Card dashboard based on specific sections selected.

Figure C-12. Student-to-teacher ratios for primary school (left) and for early childhood development education (ECDE; right).


Figure C-13. Student-to-toilet ratio for boys (left) and for girls (right).


Figure C-14. The school mean examination score is represented by the green line, and the dotted line shows the national average mean score. (Note: Scores from only Isiolo and Mombasa Counties were used to compose the average mean score for the pilot.)