Does information lead to more active citizenship? An Evaluation of the Impact of the Uwezo Initiative in Kenya

Evan S. Lieberman (Princeton), Daniel N. Posner (MIT), Lily L. Tsai (MIT)

We gratefully acknowledge Jessica Grody for serving as our Project Manager; Ruth Carlitz, Kelly Zhang, Angela Kiruri, and Richard Odhiambo for serving as field coordinators; and Jason Poulos and Yue Hou for additional research assistance. We also appreciate the steadfast cooperation of various Uwezo staff, including Sara Ruto, John Mugo, James Angoye, Joyce Kinyanjui, Conrad Watola, Amos Kaburu, Ezekiel Sikutwa, and Izel Kipruto.
EXECUTIVE SUMMARY ... 1
Introduction and theoretical motivation ... 4
Review of relevant literature ... 6
Information and accountability, especially in education reform ... 6
The Uwezo Initiative as treatment ... 7
Selection of villages and households ... 9
Context: Kenya ... 10
Research design: sampling and instrumentation ... 10
Post-treatment, matched village sample design ... 10
Selection of villages ... 12
Selection of households ... 13
Selection of respondents within households ... 15
Balance of covariates and tests for treatment/spillover ... 16
Research instruments ... 16
Outcomes we investigate ... 17
Interventions at home to help one's own children ... 18
General involvement in efforts to improve one's children's learning ... 18
Interventions at school ... 19
Civic participation and citizen action ... 20
Results ... 24
Qualitative results, including in "assessment only" villages ... 28
Why no treatment effect? ... 28
Assumptions in the information-accountability causal chain ... 31
Do these conditions hold in our study population? ... 34
Rongo and Kirinyaga within the Kenyan Context ... 46
Additional findings from studies of information dissemination ... 47
Conclusions and Implications ... 47
References ... 50
Appendix I: Findings from SMS and Radio Dissemination Study ... 52
Appendix II: Quotes from Qualitative Research ... 54
Appendix III: Covariate Tables and Figures ... 56
Appendix IV: Correlations of Actions in Past Three Months ... 59
Appendix V: Action Recall ... 61
Appendix VI: Parental Involvement in Education ... 63
Appendix VII: Spending Priorities ... 65
Appendix VIII: Health and Water ... 66
Descriptive Statistics: Health ... 66
Descriptive Statistics: Water ... 67
Appendix IX: Uwezo Assessment Overview ... 68
EXECUTIVE SUMMARY

The research team of Evan Lieberman (Princeton), Daniel Posner (MIT), and Lily Tsai (MIT), all professors of political science, was engaged by Twaweza on a three-year contract to conduct studies of the impact of the Uwezo initiative in Kenya and Tanzania. This report describes the results of our Kenya Phase I research, which focused on the Uwezo II assessment and launch. Our study investigated the effects of one aspect of the Uwezo initiative on patterns of active citizenship: the practice of individuals participating in decision-making forums and providing time and/or material resources to improve service delivery in education and other sectors. Specifically, we investigated the effects of the Uwezo literacy/numeracy assessment and the dissemination of a battery of Uwezo-produced instructional materials designed to promote children's learning and citizen participation. These interventions constitute the first two parts of Uwezo's three-pronged strategy for effecting change. The third, crucial part, which was just being initiated as we were concluding this study and which will continue over the next several years, involves the broad dissemination of the results of the assessments and the stimulation of a multifaceted national discussion about children's learning. Since the theory of change behind the Uwezo initiative requires all three of these facets to be in place before we would expect to see large-scale changes in citizen agency, this report should not be read as a full evaluation of the Uwezo initiative. Rather, it was designed as a focused investigation of the impact of the first stages of the broader Uwezo undertaking: how citizens respond to exposure to the types of information they were presumed to lack.
The rationale for studying the impact of just the first parts of the longer-term Uwezo program was that it would provide an opportunity to test whether exposure to information about student performance alone might have large effects on parents' behavior, even without the creation of the broader ecosystem of change that is the objective of the Uwezo initiative writ large. While a finding of no impact at this initial stage could thus be taken as an indication that the program is not working, it could also be interpreted as consistent with the Uwezo theory of change, which implies that information provision alone should not be sufficient to generate real behavioral change. We employed a multi-method research design, which allowed us to measure active citizenship and the factors likely to influence it at several levels, including through a household survey, direct observation, and semi-structured interviews with school and village leaders. By comparing levels of active citizenship within households and villages that received the Uwezo intervention with otherwise comparable units that did not, we were able to draw inferences about the average treatment effect of the Uwezo assessment and accompanying materials. We conducted our research during June-August 2011 in 34 villages in two Kenyan districts: Kirinyaga, in the Central Highlands, and Rongo, in Nyanza Province.
Our wide-ranging analyses suggested that the assessment and the distribution of informational materials were implemented relatively successfully but generated no evidence of a treatment effect: we found no statistically significant differences on any measure of active citizenship between the households and villages that did and did not receive the Uwezo assessment and accompanying materials. While we discuss several possible explanations for these findings in this report, we place greatest emphasis on the absence in our research sites of numerous conditions that must hold for the provision of information to be a concrete driver of political agency. To the extent that the villages we studied are representative of Kenya more generally, our findings suggest that only a very small share of the targeted population is likely to have been affected by the Uwezo interventions implemented thus far. Although the bulk of our research took place before the launch of the Uwezo information dissemination campaign designed to publicize the assessment results and to raise awareness about the state of education in Kenya more generally, we did conduct a pair of smaller-scale studies after that launch to investigate its impact. Neither dissemination effort accomplished its objectives. In the first study, which evaluated the impact of sending SMS messages via mobile phones, only a minority of the target audience received the message and, among those who did, the content had little impact. In the second, which involved the broadcast of a radio program discussing primary education, we identified only a single citizen who had actually heard the broadcast.
While these studies were extremely limited in scope, and may have been affected by idiosyncrasies of the particular sites in which we conducted them, they are nonetheless highly suggestive of the need to improve media penetration before it will even be possible to assess the impact of such messages on active citizenship and related outcomes. Our research suggests several implications for the design of future aspects of the Uwezo initiative. First, Uwezo may want to consider giving different assessments to children of different ages. Because pass rates among older children are very high with the current assessment, parents may be getting the message that learning quality is satisfactory, which may be the opposite of what Uwezo intends. Second, Uwezo may want to consider targeting areas where people are most likely to be affected by proposed information dissemination campaigns. Our research suggests that areas where there is prior evidence that schools have not shared exam results, and/or where educational attainment has been lowest, are the places where citizens are most likely to find the Uwezo information new and distressing. Third, Uwezo ought to plan and monitor the penetration of its communications and information dissemination campaigns more carefully to make sure they reach their intended audiences. Fourth, Uwezo ought to consider informational strategies that might elicit stronger responses from citizens. As its own theory of change suggests, possible strategies include the clear presentation of comparative data across geographic regions and/or the presentation of information about the relationship between levels of active citizenship and the quality of educational attainment. In other words, Uwezo should
consider the social, psychological, and strategic incentives for citizens to act and, in turn, tailor its informational campaigns in ways that play to those incentives. Finally, we urge Uwezo's program managers to look closely at the information-accountability causal chain suggested by our research, to reflect on which of the bottlenecks they believe may be most important in limiting the impact of their programming, and then to focus future programming efforts on reducing those obstacles to citizen activism.
Introduction and theoretical motivation

The identification of strategies to enable and empower citizens to hold governments accountable has become a central concern among scholars and practitioners of development over the last decade. Advocates of popular accountability frequently argue that one of the most stubborn, and yet potentially fixable, impediments to human development in democratic, low-resource settings is asymmetric access to information between governments and poor people (e.g. Sen 1999; Besley and Burgess 2000). Without ready access to comprehensible information about how their government is performing on their behalf, and perhaps also lacking knowledge about how to channel dissatisfaction productively, ordinary citizens have generally been unable to effect change through the "long route" of accountability (World Bank 2003: 6). If, however, citizens were able to gain access to better information about government performance and greater awareness of how to effect change, they should become more active agents in pressuring governments to improve service provision. Optimistically, interventions designed to provide such missing information could set off a self-reinforcing virtuous cycle: with greater information, citizens should be more likely to pressure relevant actors to improve the quality of service delivery, which in turn would lead to higher levels of educational attainment. More educated citizens would have a more sophisticated understanding of government performance, which would enable them to participate even more effectively in holding the government accountable for service delivery. In this report, we present the results of a study of such an intervention in the education sector: the Uwezo initiative in two districts of Kenya. The heart of the report is the analysis of a field study in which we compare levels of active citizenship among households that did and did not receive a randomized informational treatment.
Ultimately, we find that the information dissemination, which involved providing parents with information about their children's performance on math and literacy tests as well as information about how they might get more involved in improving education outcomes for their children, had no discernible impact on levels of active citizenship relative to the control group. While our empirical results are unambiguously negative, it is critical to place this study in the proper context. First, this Phase I study was not meant to be a thoroughgoing evaluation of the entire Uwezo initiative. Rather, it is a focused study of the impact of particular aspects of that initiative. We do not evaluate the impact of the interaction of various forms of information provision in the creation of an "ecosystem of change." Later evaluations, conducted after such an ecosystem has emerged, will be in a better position to assess the impact of the broader Uwezo initiative. That said, the Uwezo theory of change, like other theories that posit a link between information and greater government accountability, hinges on a set of assumptions about how individuals will respond to new information, and our study sought to
provide a direct and clean test of those propositions. In the case of the Uwezo initiative, the critical information includes objective diagnostic data on actual educational attainment (English and Kiswahili literacy and numeracy) and clear directions about how to influence children's education in private and in public. That the direct provision of such information was not associated with any detectable change in attitudes or behaviors calls into question a key piece of Uwezo's hypothesized causal chain, and thus demands serious reflection. We do not think the results should be discounted because of the limited temporal, programmatic, or geographic scope of the study. While we cannot rule out the possibility that the parents who did not respond to the information about their children's school performance might respond at a later date, after being exposed to the broader range of information and pressures generated by the later stages of the Uwezo intervention, we nonetheless believe that the Uwezo program would benefit substantially from closer attention to understanding why we found no effect. Specifically, we believe that the failure of information provision to generate its hypothesized effects in our evaluation is likely due to the absence, in the context we studied, of a series of necessary conditions that are implicit in the information-accountability causal chain. The implication is twofold: first, attempts to generate citizen activism through the provision of information must pay greater attention to the implicit assumptions that underlie the presumed linkage between greater information and citizen activism; second, Uwezo's next-stage activities should focus on overcoming the specific bottlenecks that we identify in this report as plausibly responsible for the weak effects of the informational intervention.
In the remainder of the report, we discuss the theoretical motivation for the research; describe the intervention we study; explain the research design we employ to evaluate its impact; discuss the results of our analyses of the resulting data; and conclude with reflections on why we believe we found little impact. We also identify some non-experimental correlates of active citizenship that may be useful for future research. To be clear from the outset: our focus is on the determinants of active citizenship, which we define as the practice of individuals participating in decision-making forums and voluntarily or quasi-voluntarily providing time and/or material resources to improve the provision of public goods. Although our research is built around an intervention in the education sector, we are not concerned with educational practice or policy per se. Our concern is with how the provision of information about government performance can lead to behavior by citizens that improves that performance. Children's education, while a critically important outcome in its own right, is merely the area of service provision about which information is provided in our study. And while the particular kind of citizen activism we study concerns improving children's learning, our broader interest lies in understanding the conditions under which citizens will expend
personal and collective resources to apply pressure to governments and local bureaucracies to improve the quality of the public services they provide.

Review of relevant literature

The notion that the quality of democracy depends on an informed public has been central to studies of democratization since early in the 20th century. In his seminal work of modernization theory, Lipset quotes Bryce as saying that education, "if it does not make men good citizens, makes it at least easier for them to become so" (1959: 79). To be sure, much of the proposed link between education and citizenship in this formulation involved value orientations, including norms of tolerance [and restraints] from adhering to "extremist and monistic doctrines," but the link was also made in terms of the ability to make informed decisions. More recently, scholars have focused more precisely on the role of information. Adsera, Boix, and Payne, for example, stress the importance of information for accountability, arguing that, with perfectly informed voters, "politicians' rent should disappear" (2003: 448). In cross-country analyses, they find that the free circulation of newspapers has a positive effect on the quality of democracy. While plausible, such studies have lacked micro-level evidence testing the specific proposition that an individual is more likely to be an active and engaged citizen when presented with more and better-quality information. A series of other studies have done just that, as we discuss below.

Information and accountability, especially in education reform

The most directly relevant scholarly works have asked whether information can help empower poor citizens in poor countries to enjoy better services by making governments more accountable (e.g. Banerjee et al. 2007; Pande 2011).
Along these lines, the World Bank recently published a landmark study, Making Schools Work: New Evidence on Accountability Reforms (Bruns, Filmer, and Patrinos 2011), which systematically reviews a wide range of studies estimating the effects of interventions to address service delivery failures in education. The authors highlight that, without information, citizens have little power to hold service providers accountable, and that information can be expected to affect learning outcomes through participation and voice, as well as by providing citizens with the ability to choose among alternatives and service providers with better tools for management (33-36). As scholars of political science, we are most interested in the effects of information on participation and voice, or what we call active citizenship. Specifically, we are concerned with assessing the conjecture that information will lead to closer monitoring of school performance and service delivery and more effective "lobby[ing of] governments for improved policies" (35-36). The impact evaluation studies reviewed in Making Schools Work offer mixed results. Across the three countries they examine (Pakistan, Liberia, and India), there is some variability in the degree to which target audiences absorb and process new information (74), and much of the impact appears to generate private rather than collective action (75-76). But the authors conclude that such research is only in its
infancy, and that many questions remain open: for example, on the most effective ways to design information programs, on the factors that enable or disable impacts on learning outcomes, and on how best to collect and disseminate information (75). One of the studies mentioned, conducted in Uttar Pradesh, India by Banerjee et al. (2007), concerns the relationship between information dissemination and citizen participation in local governing agencies. In a baseline survey, the authors found that very few households participated in local-level governance and that, among those that did, education was a low priority. They implemented a series of randomized controlled treatments across villages but found that none, including one involving the facilitation of small group meetings to discuss education and/or village-level report cards, had any effect on citizen involvement in public schools. Keefer and Khemani (2011) explored a related question by evaluating the effects of information disseminated via radio shows in Benin. They used a natural experiment built around within-commune variation in access to community radio programming to examine the effects of exposure to such programming. They found no difference in participation in the production of government inputs or in involvement in Parent-Teacher Association meetings. However, they did find greater private investment in children in households with greater radio exposure. Thus, studies conducted in different places and evaluating different interventions have generated largely negative results concerning the effects of information on citizenship in the education sector. As in all experimental field studies, negative findings could result from insufficiently strong or poorly executed treatments, or from particular contextual factors that attenuated the effects of the treatments in ways that might not hold elsewhere.
Moreover, research in other sectors has provided strong evidence of the potential power of bottom-up approaches. For instance, Bjorkman and Svensson (2009) found that child deaths declined in Ugandan communities that were provided with the information necessary for monitoring health services. Thus, given the strong theoretical basis for the information-accountability link, there is a great need for further research in this area.

The Uwezo Initiative as treatment

The Uwezo initiative is a large-scale, information-based intervention inspired by precisely the theoretical expectations that motivated the studies discussed above. Specifically, Uwezo, which means "capability" in Kiswahili, seeks to improve educational outcomes among children in Kenya, Uganda, and Tanzania in three linked steps: first, by providing parents with good-quality information about how much their children are (or are not) learning in school; second, by providing concrete suggestions about steps that parents might take to improve education outcomes for their children; and finally, by facilitating a broad public discussion of the state of education in the country. As noted, the present evaluation is of the
impact of the first two of these three steps. Uwezo's expectation is that these measures will empower citizens to hold their governments accountable for improving the quality of their children's education, and also equip them with the knowledge necessary to contribute themselves to improving their children's learning. The core of the part of the Uwezo initiative that we study is a series of informational treatments that provide parents with feedback about their children's performance (the "assessment") and guidelines for action (the "instruction materials"). Our research exploits the random implementation of these treatments across households and villages to estimate their effects on parents' willingness to take actions on behalf of improved educational outcomes, and on their degree of citizen activism more generally. Villages and households were selected for treatment as described in the next section.[1] Treated households received the following treatment during a 2-3 day period in February-March of 2011:

1. Assessment: An Uwezo volunteer administered tests of basic literacy, numeracy, and reading comprehension in both English and Kiswahili to every child in the household aged 6-16. Parents were presented with the results of these tests at the conclusion of the assessment and were told whether their children were performing at or below grade level. Since the assessment was designed to test Class 2 abilities (ages 6-7) but was administered to children aged 6-16, parents of older children who passed the assessment were told simply that their child had basic skills in reading and math, not that their child was performing at grade level.

2.
Instruction materials: An Uwezo volunteer presented assessed households (and, very rarely, also selected households where the assessment did not take place because children were not present or no children aged 6-16 resided in the household) with a battery of materials outlining strategies that parents might pursue to improve their child's learning, including:

a. A wall calendar with statements about the value of education;

b. A poster with a checklist of six strategies parents might adopt to improve their children's learning, presented in the form of questions: Do you teach your children new words? Do you narrate stories to your child? Do you make sure your child sees you reading? Do you ask your child to read to you? Do you talk to your child about the writing on different products? Do you insist that your child practice writing?

c. A sign-up sheet to become a "friend of education" and to receive periodic text messages from Uwezo on education themes;

d. A story in English, intended to be read by children;

e. A story in Kiswahili, intended to be read by children;

f. A citizen's flyer with nine recommendations about how to get involved in local and national efforts to improve educational outcomes: (1) asking your child to do the suggested exercises; (2) giving ten children these exercises to try; (3) helping ten children to read and write; (4) talking to others about how to improve education; (5) sharing your thoughts in the church, the mosque, on the radio, and in other networks; (6) joining Uwezo's Friends of Education group or sending an SMS to Uwezo; (7) making sure the school and government take up their responsibilities; (8) closely following school activities and attending school meetings; (9) following up to ensure that your child is benefiting from the Free Primary Education policy.

In addition, in each selected village, Uwezo volunteers visited the primary school that served the largest number of children from that village (usually there was just one primary school serving the village), met with the head teacher, asked a series of questions about the conditions of the school, evaluated the school, and left a poster with the head teacher. Uwezo volunteers also interviewed the village elder to gather information about the village.

[1] Throughout the paper, when we refer to "villages" we mean villages or urban areas, as the latter were also included in Uwezo's random sample. Our own evaluation, however, was limited to rural districts and included no urban locations.

Selection of villages and households

During the 2011 round of the Uwezo initiative, 124 districts were randomly selected (from a total of 158 districts nationwide) such that the number of districts selected in each province would be proportional to the province's population.
Thirty villages were then randomly selected for treatment from each district. In each selected village, twenty households were chosen to receive the assessment/instruction materials. To select these households, a designated Uwezo volunteer worked with the village elder to draw up a list of all the households in the village. The volunteer then divided the total number of households by twenty to generate a value n, and selected every nth household on the list. Five additional households were selected as alternates using a similar method. In total, 72,106 households were visited by Uwezo volunteers in 2011, and a total of 134,243 children were administered the literacy and numeracy tests.
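The household selection procedure described above amounts to systematic (every-nth) sampling from the elder's list. A minimal sketch, assuming a simple list of household identifiers; the alternate-selection rule below is our own illustrative guess, since the report does not spell out the exact method:

```python
def select_households(household_list, target=20, n_alternates=5):
    """Systematic sampling sketch: pick every nth household from the
    village elder's list, where n = total households // target.
    The alternate rule here is illustrative, not Uwezo's exact method."""
    n = max(1, len(household_list) // target)
    # Take every nth entry, starting with the nth, capped at the target count.
    selected = household_list[n - 1::n][:target]
    # Alternates drawn from the households not already selected (simplified).
    remaining = [h for h in household_list if h not in selected]
    alternates = remaining[:n_alternates]
    return selected, alternates
```

For a village of 100 households this yields n = 5, so the 5th, 10th, ..., 100th households on the list are chosen.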
Context: Kenya

Although the Uwezo initiative covers three East African countries, we focus our evaluation on just one: Kenya. A democratic, multi-ethnic country with a historically strong educational system and a fairly high literacy rate (87 percent, according to the 2011 UNDP Report), Kenya introduced universal free primary education in 2003. Since that time, primary school enrollment rates have risen dramatically, but, in the absence of a commensurate infusion of new funds, many observers believe that children's learning has suffered. Corruption, mismanagement, and a lack of resources have also undermined educational outcomes over the past two decades (Wrong 2009). Uwezo's initiative thus comes at a particularly opportune time. While the country's high baseline levels of literacy might bias against finding a strong effect of the intervention (because the theory of change underlying Uwezo's program depends on the assessments revealing deficiencies in children's math and reading skills, a point to which we return later in the report), this may be counterbalanced by the country's relatively high rates of baseline citizen activism, which should make Kenya especially fertile soil for the kind of social mobilization that the Uwezo initiative is designed to inspire. Moreover, as we discuss, we study the intervention in areas with both relatively low and relatively high overall levels of educational attainment.

Research design: sampling and instrumentation

Although the Uwezo initiative was developed independently of the evaluation discussed in this report, several aspects of its structure were extremely useful for the design of our own research into its impact. Most important in this respect was the attention Uwezo paid to the random sampling of villages and households for receipt of the assessment and accompanying instruction materials.
This random sampling of units made it possible for us to select untreated villages and households to serve as controls, and thereby to estimate treatment effects in a manner akin to a randomized controlled trial (RCT). In short, the study benefits from the inferential opportunities associated with random assignment of treatment without the artificiality sometimes associated with experimental studies in which scholarly researchers implement or manage the central intervention of interest.

Post-treatment, matched village sample design

A straightforward research design for assessing the impact of the Uwezo initiative would have been to randomly select a large number of households located in untreated districts (or in untreated villages within treated districts) and to compare average levels of citizen activism in these control households with average levels of citizen activism in households that had received the assessment and instruction
materials.[2] The drawback to such a strategy is that it would have necessitated the collection of data in a vast number of locations scattered around the country, which would have been logistically challenging and would have precluded the focused, time-intensive, qualitative research that we felt was important to undertake to supplement our experimental work. Hence, we selected two specific districts for intensive study and chose treated and untreated households in each of these districts from among the villages that had and had not received the Uwezo assessment/instruction materials.[3] We chose the two districts so as to offer variation in SES and baseline literacy rates: Kirinyaga, in Central Province, ranks in the top third of districts in advanced schooling completed (secondary or higher); Rongo, in Nyanza Province, ranks in the bottom third (based on analysis of 2009 census data). To ensure that untreated villages had not received materials or information from Uwezo during the first Uwezo assessment round in 2010 (and also that treated villages had received only a single dose of the Uwezo intervention), we selected both study districts from among the set of districts that had not been included in the 2010 assessment.[4]

[2] An alternative approach might have involved the deployment of a pre-treatment survey, which would have permitted a comparison of levels of citizen activism in treated households before and after the Uwezo intervention. If the pre-treatment survey were also administered in untreated households, then a difference-in-difference analysis could be undertaken by comparing the differences in the changes in citizen activism across treated and untreated households. Unfortunately, this was not possible given the specific timing of our team's engagement with Twaweza/Uwezo. In any case, such a design would have been undesirable due to the likely bias introduced by priming households, through the administration of the pre-treatment survey, to the importance of (and/or the research team's interest in) education outcomes. Such bias would likely have led to an underestimate of the true impact of the Uwezo intervention.

[3] In 2011, Kenya adopted a new system of devolved government in which counties replaced districts as the key units of sub-national administration below the province level. For ease of exposition, however, we refer to these units as districts. Rongo district became part of the broader Migori county, so adopting the county designation would create ambiguity about the boundaries of our research site, which corresponds to the boundaries of the old Rongo district. Kirinyaga district became Kirinyaga county, so there would be no loss of precision if we adopted the new label.

[4] Uwezo's sampling protocol calls for the progressive expansion of assessed districts in each assessment round, so that each round (after the first) contains a combination of districts that had and had not previously received the assessment/information materials.

To reduce the likelihood of information diffusion from the earlier Uwezo intervention and follow-up dissemination campaign, we also selected the districts so as to be as far from
Nairobi and other large population centers as possible and to maximize the distance from districts that had been included in the 2010 assessment round.

Selection of villages

Within Kirinyaga and Rongo, we selected six villages from among the thirty that had received the Uwezo assessment ("treated" villages). We selected these villages so as to be physically distant from one another and, where possible, located in different administrative divisions. To better tease out the relative impact of the assessment and the instruction materials, we supplemented our main study villages in each district with four additional villages in which Uwezo volunteers administered the assessment according to normal procedures but withheld the distribution of any of the instruction materials. While the main analyses described below focus only on the matched pairs of villages that received both the assessment and the instruction materials, we discuss the assessment-only villages in a separate section at the end of the report. From among the hundreds of untreated villages in each district, we then selected six villages to serve as controls. Each of these control villages was matched with one of the six treatment villages to create a set of corresponding pairs. Matching was accomplished using data from the 2009 Kenyan Population and Housing Census on a number of village-level characteristics that we hypothesized might influence the impact of the Uwezo intervention: population size, number of households in the village, number of people currently attending school, percentage of the population that had finished primary and secondary school, percentage of the population with radio and mobile phone service, and percentage of households with a mobile phone. Matches were also chosen so that each pair contained villages from the same electoral constituency and from adjacent, albeit different, sub-locations.
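Matching of this kind can be illustrated with a small sketch. The covariates are those named in the text, but the village records, the per-covariate scales, and the nearest-neighbor rule below are our own illustrative assumptions, not the authors' actual matching procedure or data.

```python
import math

# Hypothetical census records: each village is a dict of the covariates named
# in the text. Values are illustrative, not real data.
treated = {"pop": 800, "hhlds": 200, "at_school": 200,
           "pct_primary": 0.5, "pct_secondary": 0.2,
           "pct_radio": 0.9, "pct_mobile_service": 1.0, "pct_hh_mobile": 0.8}

candidates = [  # candidate control villages from the same constituency (assumed)
    {"name": "ctrl_1", "pop": 700, "hhlds": 200, "at_school": 200,
     "pct_primary": 0.5, "pct_secondary": 0.3,
     "pct_radio": 0.9, "pct_mobile_service": 0.8, "pct_hh_mobile": 0.8},
    {"name": "ctrl_2", "pop": 1400, "hhlds": 500, "at_school": 500,
     "pct_primary": 0.6, "pct_secondary": 0.1,
     "pct_radio": 0.8, "pct_mobile_service": 0.6, "pct_hh_mobile": 0.7},
]

# Rough scale for each covariate, to put counts and proportions on
# comparable footing (an assumption; in practice one might use std. devs.).
scales = {"pop": 500, "hhlds": 150, "at_school": 200,
          "pct_primary": 0.1, "pct_secondary": 0.1,
          "pct_radio": 0.1, "pct_mobile_service": 0.2, "pct_hh_mobile": 0.1}

def distance(a, b):
    """Standardized Euclidean distance across the matching covariates."""
    return math.sqrt(sum(((a[k] - b[k]) / scales[k]) ** 2 for k in scales))

# Nearest neighbor on the standardized distance serves as the match.
best_match = min(candidates, key=lambda c: distance(treated, c))
print(best_match["name"])  # ctrl_1
```

The scaling step matters: without it, the raw population counts would swamp the proportion-valued covariates (schooling and coverage rates) in the distance.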
Because we discovered that one of the treated villages in Kirinyaga contained only four treated households, we selected an additional treated village and matched pair in that district, for a total of seven village pairs in Kirinyaga and six in Rongo.5 Table 1 summarizes the characteristics of the matched pairs.

5 This very small number of treated households was due to the relatively small number of households in that village that contained appropriately aged children.
Table 1: Characteristics of Matched Village Pairs

District   Pair  Status   2009 pop  Hhlds  Pop at school  % only primary  % only 2ndary  % radio svc  % mobile svc  % hhlds w/ mobile
Kirinyaga  A     Treated  800       200    200            0.5             0.2            0.9          1.0           0.8
Kirinyaga  A     Control  700       200    200            0.5             0.3            0.9          0.8           0.8
Kirinyaga  B     Treated  200       100    100            0.6             0.2            0.9          0.7           0.6
Kirinyaga  B     Control  200       100    100            0.6             0.2            0.8          0.4           0.6
Kirinyaga  C     Treated  400       100    100            0.4             0.4            1.0          0.7           0.8
Kirinyaga  C     Control  400       100    100            0.5             0.2            0.9          0.9           0.7
Kirinyaga  D     Treated  1400      500    500            0.5             0.3            0.8          0.6           0.8
Kirinyaga  D     Control  1200      400    400            0.6             0.2            0.9          0.8           0.8
Kirinyaga  E     Treated  800       200    200            0.6             0.2            0.8          0.4           0.7
Kirinyaga  E     Control  700       200    200            0.6             0.1            0.8          0.5           0.6
Kirinyaga  F     Treated  900       300    200            0.6             0.2            0.8          0.5           0.5
Kirinyaga  F     Control  1000      300    300            0.6             0.1            0.8          0.7           0.7
Kirinyaga  F2    Treated  300       100    100            0.6             0.2            0.7          0.4           0.6
Kirinyaga  F2    Control  200       100    100            0.5             0.2            0.9          0.5           0.6
Rongo      G     Treated  200       50     100            0.5             0.2            0.9          0.4           0.5
Rongo      G     Control  200       50     100            0.5             0.1            0.8          0.4           0.4
Rongo      H     Treated  1200      200    700            0.3             0.4            0.9          0.8           0.7
Rongo      H     Control  700       100    300            0.6             0.1            0.9          0.4           0.8
Rongo      I     Treated  1300      200    700            0.4             0.3            0.4          0.3           0.7
Rongo      I     Control  1000      200    400            0.5             0.1            0.7          0.4           0.6
Rongo      J     Treated  900       200    400            0.6             0.1            0.6          0.4           0.4
Rongo      J     Control  800       200    400            0.5             0.1            0.6          0.5           0.4
Rongo      K     Treated  300       100    100            0.6             0.1            0.6          0.4           0.4
Rongo      K     Control  300       100    100            0.5             0.1            0.6          0.4           0.4
Rongo      L     Treated  300       100    200            0.5             0.2            0.8          0.5           0.6
Rongo      L     Control  400       100    200            0.5             0.1            0.7          0.4           0.6

NB: Village names are not provided and numbers are rounded to protect the identities of respondents in villages.

Selection of households

Our main interest in the study was to measure differences in citizen activism across households that did and did not receive the assessment and instruction materials ("treated" and "untreated" households, respectively). Since treated households could only be located in treated villages, and since our control villages contained (by definition) no treated households, this meant comparing households located in treated and untreated villages.
However, we were also interested in ascertaining whether the impact of the Uwezo intervention spilled over within treated villages from households that received the assessment and instruction materials to those that did not. Hence, we administered our household questionnaires (described below) in three different types of households: treated households in treated villages, untreated households in treated villages, and households in untreated villages. We selected these households in the following manner:
Treated households in treated villages: Uwezo's protocol had called for conducting assessments in twenty households in each treatment village. However, these twenty households were selected from the village household lists before the Uwezo volunteers had been able to confirm that the households did in fact contain school-aged children (and thus were suitable for the assessment). Approximately one third of the time, the pre-selected households did not contain children aged 6-16. Since the Uwezo protocol did not provide for the replacement of such households with new ones, the number of households in which the assessment was actually carried out was far below the target of twenty per village: treated villages contained an average of only twelve treated households. In order to maximize the number of treated households in our study, our protocol was to include all of them in our sample.

Untreated households in treated villages: We also sought to study households in treated villages that had not themselves received the assessment and accompanying instruction materials. To do this, we returned to the original household list developed during the assessment exercise and randomly sampled 15 households from among those that had not received the treatment. We also selected a set of replacement households, using the same procedures.

Households in untreated villages: In the control villages, we had no household lists to draw upon, so our field coordinators worked with village elders to develop them. From those lists, we randomly sampled 15 households, along with an additional set of replacements.

The final size of our sample, tabulated in terms of our three treatment conditions and village pairs, is presented in Table 2.6

6 Our target sample size was 540 households; for reasons described in the text, our final sample was slightly larger.
Analytical power is achieved in a number of ways, including through (a) large sample size, (b) strong treatments, (c) careful measurement, and (d) carefully controlled comparisons. Given the unprecedented nature of our study of the Uwezo intervention, we could only guess at the likely treatment effects. It was also not clear whether the effects would be observable at the village or household level. Under various assumptions, we found that our proposed sample of approximately 180 treated households, 180 untreated households in treated villages, and 180 control households would give us 80 percent power at the 95 percent confidence level. That said, we prioritized careful measurement and multi-method research over pure statistical power through a large sample size. Had we observed a substantively meaningful but not statistically significant difference between treated and untreated groups, we would have expanded the scope of the study to achieve greater statistical power. Because this was not the case, we did not increase the sample size.
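To give a sense of what a sample of this size can detect, the minimum detectable standardized effect for a two-arm comparison can be sketched with the standard normal approximation. This is a back-of-the-envelope illustration using only the Python standard library; the 180-per-arm figure comes from the text, but the formula is the textbook two-sample approximation, not the authors' own power calculation.

```python
from statistics import NormalDist

def min_detectable_effect(n_per_arm, alpha=0.05, power=0.80):
    """Minimum detectable standardized effect size (in standard-deviation
    units) for a two-sample comparison of means, using the normal
    approximation: delta = (z_{1-alpha/2} + z_{power}) * sqrt(2 / n)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    return (z_alpha + z_power) * (2 / n_per_arm) ** 0.5

# With roughly 180 households per arm, as in the study's target sample:
mde = min_detectable_effect(180)
print(round(mde, 2))  # 0.3
```

In other words, a design of this size is powered to detect effects of roughly 0.3 standard deviations or larger; smaller true effects could plausibly go undetected.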
Table 2: Summary of survey responses

                   Control villages   Treated villages
Pair               Control hhlds      Untreated hhlds   Treated hhlds   Total
Kirinyaga total    109                105               77              291
  A                16                 20                13              49
  B                15                 12                10              37
  C                15                 14                11              40
  D                17                 13                4               34
  E                16                 15                11              42
  F                15                 16                18              49
  F2               15                 15                10              40
Rongo total        91                 99                69              259
  G                15                 16                11              42
  H                16                 16                10              42
  I                14                 20                10              44
  J                15                 17                16              48
  K                16                 14                11              41
  L                15                 16                11              42
Total              200                204               146             550

Selection of respondents within households

To select respondents to interview within each sampled household, we employed the following protocol. Enumerators were instructed to greet the first adult they encountered when they approached the household. They were to briefly mention that they were conducting a survey on democracy and family behavior and ask whether there were children aged 6-16 living in that household. If the answer was no, the enumerator would not proceed with the interview. If the answer was yes, the enumerator would ask the adult whether he/she was the direct caregiver of a school-aged child living in the household. If the answer was again yes, the enumerator would continue with the survey. If the adult indicated that he/she was not a direct caregiver of a school-aged child, the enumerator would ask him/her to identify another adult living in the household who was. Once a direct caregiver was identified, the enumerator would interview that person. Enumerators were instructed to return two times to households in which it was not initially possible to conduct an interview, and to select a household from the replacement list when a suitable respondent could not be identified. During the interview, the enumerator would ask the respondent to list the names of every member of the household and to identify the children for whom they had direct responsibility. Questions later in the survey referred to decisions that the respondent had made on behalf of those specific children.
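The screening logic above can be sketched as a small decision function. The field names and household records are hypothetical, invented only to make the protocol's branches explicit; they do not correspond to the study's actual instruments.

```python
# A sketch of the enumerators' screening protocol described above, with
# hypothetical field names. Each household record notes whether children
# aged 6-16 live there and which adults are direct caregivers.

def select_respondent(household):
    """Return the adult to interview, or None if the household is ineligible."""
    if not household.get("has_children_6_to_16"):
        return None  # no school-aged children: do not proceed
    first_adult = household["first_adult_encountered"]
    if first_adult in household["direct_caregivers"]:
        return first_adult  # the first adult greeted is a caregiver
    # Otherwise the first adult identifies a caregiver living in the household.
    caregivers = household["direct_caregivers"]
    return caregivers[0] if caregivers else None

hh = {"has_children_6_to_16": True,
      "first_adult_encountered": "grandfather",
      "direct_caregivers": ["mother"]}
print(select_respondent(hh))  # mother
```

The replacement-list rule (move on after two failed return visits) would sit one level up, in the loop over sampled households, and is omitted here.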
Balance of covariates and tests for treatment/spillover

Our strategy for making meaningful comparisons between treated and untreated households rests on the assumption that, at least in aggregate, the respondents from our treatment villages are not markedly different from respondents in the control villages with which they are matched. Some confidence in this assumption comes from the balance in village characteristics summarized in Table 1. Further confidence comes from a comparison of covariates collected in the household surveys that we administered after the matched pairs had been selected (see Appendix III for tables and figures).

Research instruments

The various research instruments we employed were designed to gather four types of information: information about the extent of active citizenship; information about background characteristics of villages, households, and household members that might have affected levels of active citizenship; information about the extent to which the Uwezo treatment was actually received by the intended subjects (and by those not intended to receive the treatment); and information about the ways in which the treatment might have influenced relevant outcomes (the mechanisms). We gathered this information using a multi-method approach.7

Our core instrument was an extensive household survey. This document was translated into Swahili and Luo and administered by trained enumerators fluent in the appropriate local language. Interviews were conducted face-to-face with adult household members selected as described above. Households that had received the Uwezo assessment were told that we were following up on a survey conducted in March; households that had not received the assessment were told that their names had been selected at random. In neither instance did the enumerators specifically mention the Uwezo initiative while introducing the survey. The household survey took approximately 1.5 hours per household to administer.
In addition to the household survey, we developed and administered three instruments to gather other forms of data in our study villages. A school survey collected information about local school conditions, student and teacher performance, and levels of citizen involvement in school activities at the largest public primary school serving each study village. The school survey was completed by our district team leaders based on interviews with the head teacher.

7 Our research team in each district included one local Kenyan project coordinator and one American Ph.D. student, as well as 7-8 Kenyan enumerators. Wherever possible, we sought to harmonize our survey questions and data-gathering techniques with those employed in our work for Twaweza in Tanzania, so as to facilitate comparisons across settings.

We collected additional comprehensive information about each study village through a village background survey, which we administered in focus groups that
included the village elder and other village leaders. Typically, these focus groups were held on the first afternoon of a two-day survey enumeration, and thus served the additional function of introducing our research team to the village leadership. The survey collected information about the village's history, development, and demographic composition; its physical infrastructure; the quality of its associational life; its connections with and exposure to other villages and the country more generally; and the main social and economic activities that occupy its residents. Finally, because we were interested in capturing changes in associational activity taking place over time, we required our field coordinators to revisit each village several times during the course of our study period. To organize the collection of their observations on these visits, we developed a village observation survey, to be completed by our field coordinators on each trip back to each study village. In contrast to the other surveys, which involved formal interviews or focus groups, the village observation survey was based on simple observations and informal conversations with village residents.

Outcomes we investigate

Our evaluation looked at a broad spectrum of outcomes that we had theoretical reason to believe might be (or that the Uwezo theory of change suggested should be) affected by the Uwezo assessment and informational treatment. Some of these outcomes involved actions related directly to improving one's own children's learning. Others involved actions taken to improve educational outcomes for children in the community more broadly. Still others involved citizen activism in areas outside of the education sector altogether. Before discussing the particular outcomes, an important caveat: in order to measure these outcomes, we rely on what the survey respondents reported about their own behavior.
Self-reported behavior in response to a survey questionnaire is a very common form of measurement, but it has a number of drawbacks. Individuals who are asked to recall what they did may remember facts or events incorrectly, or they may answer questions in ways that they believe are more socially desirable or pleasing to the interviewer. We pursued a number of strategies to minimize these sources of error. For example, to help respondents recall past behavior and events, we developed questions that would jog their memory. Thus, when we asked about actions taken in the last three months, that is, since the Uwezo assessment (which we could not mention directly without priming respondents to our interest in its effects), we phrased the question in terms of "since the start of the rainy season," which had begun three months earlier. In order to minimize desirability bias, we wrote the introduction to the survey to minimize the possibility that respondents would be primed to think about education or the Uwezo assessment, or that they would think that the researchers and
interviewers were principally interested in education (and thus might be assumed to support efforts to improve children's learning outcomes). For similar reasons, we embedded questions on education in the latter part of the survey (alongside questions about governance, health, and water service provision) rather than place them front and center. Enumerators were also trained to administer the questionnaire in a completely neutral manner and to avoid leading questions, expressing their own opinions, or elaborating in any way on the hypotheses that motivated our research. Moreover, since there is no reason to expect response effects due to memory errors or desirability bias to differ between control and treatment groups, we do not expect these issues to affect the conclusions we draw from our research.

We studied a total of eleven outcomes, which can be classified into four categories: interventions at home to help one's own children; general involvement in efforts to improve one's children's learning; interventions at school; and civic participation and citizen action more broadly.

Interventions at home to help one's own children

The provision of information to parents about the level of literacy and numeracy attained by their children is at the core of the Uwezo initiative. One logical way that parents may have responded to this information is by increasing their commitment to helping their children at home with their schoolwork. We therefore asked parents whether they helped their children with their reading, writing, and math. The distribution of responses for all respondents is provided in Table 3. Roughly equal portions of parents report often, sometimes, and never helping their children with these subjects. In the analyses we report below, we combine the three questions into a single measure by averaging the answers, although we generate similar findings if we analyze the questions separately.
Table 3: Home Interventions to Help One's Own Children

                                            Often (%)  Sometimes (%)  Never (%)
Do you help your child with reading?        34.9       38.1           27.0
Do you help your child with writing?        33.2       35.0           31.7
Do you help your child with mathematics?    32.1       38.5           29.4

General involvement in efforts to improve one's children's learning

We also asked parents for a subjective assessment of their level of involvement in their children's education. We first asked parents: "Overall, how involved would you say that you are in trying to improve the quality of your child's education?" We then asked: "Has this level of involvement changed during the past three months?" [i.e.,
since the Uwezo assessment was performed]. This measure thus allows us to ascertain whether receiving the Uwezo assessment made parents more likely to say that their level of involvement in improving the quality of their child's education had increased during the period that followed. Responses to these questions are provided in Table 4.

Table 4: Parental self-assessment of involvement in quality education

How involved would you say that you are in trying to improve the quality of [your child's] education?
  Very (%)   Somewhat (%)   Not very (%)   Not at all (%)
  52.3       33.2           9.4            5.1

Has this level of involvement changed during the past three months (since the start of the rainy season)?
  I've become more involved (%)   I've become less involved (%)   No change (%)
  41.5                            14.0                            44.3

Interventions at school

In addition to taking action at home or becoming involved in their own children's education, parents may also undertake more public activities in support of improved learning at their children's school. Such activities are particularly desirable, as they will likely have positive externalities that benefit other children in addition to the parents' own. To measure activities of this type, we asked parents whether, in the three months between the Uwezo assessment and the administration of our survey, they had discussed their child's performance with their teacher, attended a parent-teacher meeting, organized school activities for children, assisted teaching at school, provided extra lessons outside school, provided teaching materials to the school, helped with school maintenance, provided food or water to the school, or discussed learning quality with their child's teacher. Responses to these questions are summarized in Table 5. On most measures, levels of reported parental involvement are quite high, in some cases strikingly so.
In the analyses described below, we combine the answers to these questions and use the number of actions respondents report taking as a measure of their level of action or participation at their children's school.
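The report aggregates outcomes in two ways: averaging ordinal items (the home-help measure) and counting reported actions (the school-participation measure). A minimal sketch of both, where the numeric codings (often = 2, sometimes = 1, never = 0) and the yes/no values are our own illustrative assumptions; the names `helpkids` and `actschool` echo the outcome labels used in the report's figures.

```python
# Hypothetical responses for one respondent. Ordinal help items are scored
# often=2, sometimes=1, never=0 (an assumed coding); school actions are yes/no.
help_items = {"reading": 2, "writing": 1, "math": 1}
school_actions = {"discussed_performance": True, "attended_ptm": True,
                  "organized_activities": False, "assisted_teaching": False,
                  "extra_lessons": False, "provided_materials": False,
                  "helped_maintenance": True, "provided_food_water": False,
                  "discussed_quality": True}

# "helpkids"-style index: mean of the three ordinal items.
helpkids = sum(help_items.values()) / len(help_items)

# "actschool"-style index: count of actions the respondent reports taking
# (True counts as 1 when summed).
actschool = sum(school_actions.values())

print(helpkids, actschool)  # helpkids ~ 1.33, actschool = 4
```

A mean keeps the index on the original item scale, while a count gives an easily interpreted "number of actions" measure; both collapse several noisy items into one steadier outcome.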
Table 5: Parental interventions at school

In the past 3 months, have you done any of the following things?
                                            No (%)   Yes (%)
Discuss child's performance with teacher    28.1     71.9
Attend parent-teacher meeting               20.4     79.6
Organize school activities for children     78.3     21.7
Assist teaching at school                   95.0     5.0
Provide extra lessons outside school        80.5     19.5
Provide teaching materials to school        88.6     11.4
Help with school maintenance                74.5     25.5
Provide food/water to school                87.3     12.7
Discuss learning quality with teacher       60.5     39.5

Civic participation and citizen action

We also asked respondents a series of questions about their participation in education-related groups and associations. We measured such participation in three different ways. The first indicator was a dichotomous measure recording whether or not the individual had participated in any groups or associations dealing with education issues in the last three months. The second was a slightly more fine-grained measure based on the number of meetings the individual had attended on the subject during this period, if he or she had in fact participated in such a group. The third recorded whether or not the individual had contributed any money to the group in the last three months. The results of these questions are summarized in Table 6. They suggest real diversity across our respondents in their degree of engagement in associations of this type. Whereas 59 percent of parents indicated that they do not participate in education-related groups, of those that did, 44 percent reported
attending meetings at least once a month, and 55 percent reported having contributed money to the group.

Table 6: Civic Participation

In the past 3 months, did you participate in any groups or associations dealing with education issues?
  No (%)   Yes (%)
  58.7     41.3

If yes, how often have you attended/participated in such meetings in the past 3 months?
  Once a week (%)   Once a month (%)   Several times a year (%)   Once a year or less (%)
  12.8              31.4               45.1                       10.6

In the past 3 months, have you contributed any money to this group?
  Yes, regularly (%)   Yes, occasionally (%)   No (%)
  21.0                 34.3                    44.6

In order to get a sense of whether the Uwezo assessment may have triggered not just individual behavioral change but also broader collective action at the village level, we asked respondents how often in the last three months members of the village had jointly approached village officials or political leaders, such as MPs and councilors, to improve their schools. Their answers are summarized in Table 7.
Table 7: Villagers approaching officials to improve schools

In the past three months, how often have members of this village jointly approached village officials or political leaders (councilor, MP) to improve our schools?
  Never (%)   Once (%)   Several times (%)   Frequently (%)
  57.4        29.5       11.1                2.0

Because we were interested in the possible spillover of the impact of the Uwezo intervention beyond the educational realm, we also broadened our frame to look at citizen activism on behalf of improvements in the delivery of public services more generally. We therefore asked individuals whether or not they had taken any of the ten actions listed in Table 8, which involve behaviors to improve the delivery of education, health, or water services. We find that participation varies significantly across the actions we inquired about, from 47 percent of respondents indicating that they had attended a health, education, or water committee meeting to just 12 percent who said they had sent an SMS about these issues. In the analyses reported below, we aggregate these ten responses by adding the total number of actions each individual reported taking. There appears to be a significant amount of overlap between the citizens who attend meetings and the citizens who speak to health, education, or water committee members about services outside meetings: the two behaviors are highly correlated. Only three percent of respondents in the survey spoke to committee members outside of a meeting but did not themselves attend meetings. There was even more overlap between citizens who raise issues about clinic, school, or water services in community meetings and citizens who raise issues about health, water, or education with local officials outside community meetings. Again, only three percent of respondents raised issues with officials outside the community meeting but did not raise issues about services in the community meeting.
The vast majority of respondents who raised issues with officials outside the community meeting also raised issues during the meeting (see Appendix IV for relevant tables).
Table 8: Civic participation

Have you done the following at all over the past three months, with or without a request from someone else?
                                                                   No (%)   Yes (%)
Attend health, education, or water committee meetings?             52.9     47.1
Speak to a health, education, or water committee member
  about health services outside of a meeting?                      74.3     25.7
Raise issues about clinic, school, or water services
  in a community meeting?                                          79.0     21.0
Raise issues about health, water, or education with
  local officials outside a community meeting?                     81.9     18.1
Speak to a health worker, teacher, or water company
  employee about an issue?                                         78.0     22.0
Discuss health, education, or water problems in a meeting
  of any group/organization you belong to?                         71.9     28.1
Monitor health issues (like drug stock-outs), education
  issues (like teacher attendance), or water issues
  (like water point functionality)?                                83.2     16.8
Send SMS about health, education, or water issues?                 88.3     11.7
Call radio program to talk about health, education,
  or water problems?                                               87.0     13.0
Take any other action to improve health/education/water
  services?                                                        70.2     29.8

In Table 9, we return to the question of village-level collective action dealt with in Table 7, but look at efforts to affect aspects of village government or public service delivery that go beyond education.
Table 9: Villagers approach officials about service delivery

In the past three months, how often have members of this village jointly approached village officials or political leaders (councilor, MP) to improve some aspect of village government or public service delivery?
  Never (%)   Once (%)   Several times (%)   Frequently (%)
  62.1        31.1       6.0                 0.7

Finally, we looked at people's general level of political participation. It is possible that, by increasing parents' knowledge and awareness of problems with their children's learning, the intervention activated them to become more proactive in politics more generally. We thus looked at three forms of political participation: participation in a protest or demonstration, attendance at a political rally or meeting, and contacting the court or police. As Table 10 makes clear, levels of political participation among our respondents are fairly low.

Table 10: Political Activism

Have you done the following at all over the past three months, with or without a request from someone else?
                                               No (%)   Yes (%)
Participate in a protest or demonstration?     90.4     9.6
Attend a political rally or meeting?           76.7     23.3
Notify the court or police about a problem?    82.4     17.6

Results

The response patterns presented in the previous section pool the answers of all 550 of our respondents, irrespective of their treatment status. Taken together, they present a picture of a study population that is relatively engaged in its community and in its children's education. But the key question is whether we see different levels of involvement and citizen activism among parents whose children received the Uwezo assessment and accompanying instruction materials. That is, is there a
systematic difference across these outcome measures between treated and untreated households?8 The unambiguous answer is no: no matter how we analyze the data, we find no evidence of a treatment effect on any of the eleven outcomes discussed above.

8 Note that in the results presented in the previous section, we pool respondents from all three treatment groups (control households; households that received the treatment; and households that did not receive the treatment but are located in villages where the treatment was administered) in order to provide a broad characterization of our study population. In this section, where we focus on describing the impact of the Uwezo intervention, we limit our comparisons to the first and second of these cohorts (except where specified).

Figure 1 summarizes these results. The dots report the average treatment effect, which can straightforwardly be interpreted as the difference between average responses to each question among members of treated and untreated households. The horizontal lines are the 95 percent confidence intervals, which provide a sense of how confident we are about our estimates. The fact that these confidence intervals cross the dashed vertical line (indicating an estimated effect of zero) in every case means that in no instance can we reject the null hypothesis that the treatment had no impact on citizen activism. Although Figure 1 only compares average responses across treated and untreated units, our (non-)findings are robust to an alternative regression specification in which we control for a host of covariates that might have differed slightly across these two populations (results not shown).
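The estimator behind figures of this kind, a difference in means with a normal-approximation confidence interval, can be sketched briefly. The data below are fabricated for illustration only; they are not the study's responses, and the figures themselves may have been produced with different software.

```python
from statistics import mean, stdev

def ate_with_ci(treated, control, z=1.96):
    """Difference in means between treated and control responses, with an
    approximate 95% confidence interval (normal approximation, unpooled
    standard errors)."""
    diff = mean(treated) - mean(control)
    se = (stdev(treated) ** 2 / len(treated)
          + stdev(control) ** 2 / len(control)) ** 0.5
    return diff, (diff - z * se, diff + z * se)

# Illustrative data only: counts of school actions in two hypothetical groups.
treated = [2, 3, 4, 1, 3, 2, 4, 3]
control = [3, 2, 4, 2, 3, 1, 4, 3]
diff, (lo, hi) = ate_with_ci(treated, control)
print(lo <= 0 <= hi)  # True: the interval spans zero, i.e. no detectable effect
```

An interval that straddles zero, as in every row of Figures 1 and 2, means the data cannot distinguish the estimated effect from no effect at the 95 percent level.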
Figure 1: Average Treatment Effects, Treated vs. Untreated Households

Figure 1 focuses on average differences between the 146 households that received the Uwezo assessment/instruction materials and the 200 control households that did not. However, our research design also makes it possible to investigate the possible spillover of the treatment effect from treated households to untreated households located in the same village.9 This involves comparing the 200 control households with the 204 households that were located in treatment villages but did not receive the assessment or instruction materials. If there is spillover, we should see a difference between these two categories of households (although given the lack of a treatment effect among households that actually received the treatment, a finding of a treatment effect here would be quite surprising).

9 Because control villages were selected so that they did not border villages in the Uwezo assessment, and since our household survey work was conducted prior to the launch of Uwezo's dissemination campaign, spillover to households in the control villages should be minimal.

We present these
results in Figure 2. As in Figure 1, there is no evidence of a treatment effect, and thus no evidence of spillover.

Figure 2: Average Treatment Effects, Untreated Households in Treated Villages vs. Control Households 10

10 The outcomes shown in the figure, from top to bottom, are:
  q439c: Notified the court or police about a problem
  q439b: Attended a political rally or meeting
  q439a: Participated in a protest or demonstration
  q418a: Reported that villagers collectively approached officials to improve public service delivery
  actcit3: Number of actions taken to improve education, health, or water
  q418c: Reported that villagers collectively approached officials to improve schools
  q403c: Contributed money to groups dealing with education
  q402c: Number of meetings attended involving education
  q401c: Participated in groups dealing with education
  actschool: Number of actions taken to improve school
  q645: Increased involvement in improving their children's education
  helpkids: Helped children with reading, writing, and math
Although we find no evidence that receipt of the Uwezo assessment and instruction materials had any impact on citizen activism, either on behalf of improved children's learning or more generally, we do find some factors that are associated with greater levels of activism. For example, we find that greater literacy, having more reading materials in the house, greater exposure to radio, TV, or newspapers, and membership in active social networks are all associated with higher rates of citizen activism. However, all of these factors were, by design, balanced across our treatment and control households, so these findings have no bearing on our conclusions regarding the impact of the Uwezo interventions that we study.

Qualitative results, including in assessment-only villages

As noted above, we supplemented our quantitative analyses, built around the household survey, with a series of more qualitative data collection activities. Our field coordinators visited the villages on repeated occasions between June and August 2011, during which time they held focus group meetings with village elites, interviewed school leaders, and walked around the villages (see the descriptions of the village background survey, school survey, and village observation survey, above). The main goals of these additional data collection exercises were to provide further measures of our key variables and to provide information about the ways in which people discussed and acted upon the Uwezo-provided information, in order to establish the mechanisms linking information to accountability (assuming that we detected a treatment effect). More broadly, our objective was to put ourselves in a position to confirm, in a more qualitative way, the results of our quantitative analyses. Our qualitative interviews and meetings revealed no significant differences between villages that received the Uwezo assessment/instruction materials and those that did not (see Appendix II for quotes from the qualitative interviews).
Treated and untreated villages held meetings and participated in joint activities to provide public goods with roughly the same frequency, and reported similar levels of citizen outspokenness and activism. This finding carried over to the assessment-only villages, which received the assessment but not the assortment of additional Uwezo-provided instructional materials.

Why no treatment effect?

As discussed earlier, the theory of change underlying the Uwezo initiative posits that increases in citizen activism will be a product of both information provision of the sort we study here and the longer-term creation of an ecosystem in which discussions of civic engagement and examples of successful action become commonplace. Hence, the finding that providing citizens with information about the quality of their local schools had no impact on their engagement or activism need not be taken as a repudiation of the broader Uwezo initiative. Nonetheless, the expectation that information provision will generate citizen action is both a key part of Uwezo's causal logic and based on strong theoretical
foundations.11 So why do we find such unambiguous evidence that the intervention had no impact? Before we go on to discuss the possible reasons, we can eliminate basic implementation failure as a significant explanation. Based on our research, we found that the Uwezo assessment was implemented relatively well and that the instructional materials were successfully distributed to the households. Indeed, one or more of Uwezo's instructional materials were visibly displayed three months after the assessment in 84 percent of the treated households in our study. Given that Uwezo successfully delivered the assessment and instructional materials to the treated households in our study, we can think of four reasons why we found no treatment effect, which we discuss below.

First, it is possible that insufficient time had elapsed between the time of the assessment, in March, and the time of our data collection efforts, in June and July. Three months may simply have been too short a period for the information that the Uwezo assessment provided to have led to behavioral change. Real behavioral change may require reflection, discussions with other community members, and a rearrangement of commitments and prior obligations to make room for new activities and behaviors, all of which may take more than three months to accomplish. If we had delayed our survey work for another several months, it is possible that a measurable difference might have emerged in the levels of citizen activism between the treated and untreated households. For reasons discussed below, we do not believe this explanation is very likely, but we cannot rule it out entirely. Given this possibility, it is useful to revisit the question of why we decided to implement our evaluation work so shortly after the assessments.
The reason, arrived at after extensive consultations with our Uwezo sponsors, was to make sure that we had collected our data before the launch of Uwezo's dissemination campaign, during which the aggregated results of the assessments were publicized through a series of media campaigns, public meetings, and other channels. Our rationale for scheduling our data collection before the dissemination launch was twofold: first, to make sure that we could disentangle the impact of the assessment/informational campaign from the impact of the dissemination campaign, and, second, to make certain that our control villages had not received any treatment from Uwezo, something that would be extremely difficult to ensure

[Footnote 11: Furthermore, the design of the Uwezo intervention was exactly in keeping with these theoretical models. It provided parents with specific, personalized information about the performance of their own children, along with carefully designed informational materials that offered recommendations for how to effect change. From a theory-testing perspective, the Uwezo intervention offered a very strong test.]
once the broad-based media campaigns had begun (their whole point being to reach as broad an audience as possible).12 While the scheduling of our evaluation was thus sensible from a research design perspective, it had the drawback of putting us in a position of assessing the impact of the Uwezo program when it was only half complete, that is, after the assessments and the distribution of the instructional materials had occurred but before the process of publicizing their findings, and triggering the national discussion about children's learning that is central to the Uwezo theory of change (Uwezo Kenya 2011), had taken place. Thus, a second possible explanation for our inability to find a treatment effect is that we should not have expected to find one until after the dissemination campaign.13 Again, while there is merit in this explanation, it is still informative that the assessment (which provided parents with critical and personalized information about their children's learning) and the instruction materials that were distributed at the time of the assessment (which provided suggestions about concrete strategies that parents might pursue to improve their children's performance) seem to have had no effect on parents' behavior. Although it is reasonable to suggest that the dissemination campaign should reinforce the impact of the assessment/instruction materials among assessed parents, it is also reasonable to have expected that the assessment/instruction materials would have had a measurable impact on their own.

Yet a third explanation for our non-findings is that real behavioral change requires not just individual action but collective action, which in turn requires that a critical mass of citizens is mobilized to act. It is possible that the number of treated households in each village (an average of just twelve) was too few to generate this critical mass.
To the extent that this factor was responsible for the weak impact of

[Footnote 12: Quite apart from the difficulty of disentangling the impact of the assessment from the impact of the dissemination campaign, another problem with trying to evaluate the impact of the latter stems from the non-random absorption of the campaign by media consumers. Two households that, in principle, received the broadcast of a radio show describing the assessment results might be quite different in their degrees of attentiveness to this treatment, and these differences might plausibly be correlated with the likelihood that receiving the broadcast would have an impact on their behavior.]

[Footnote 13: This said, the impact of the dissemination campaign depends critically on the ability of the publicity channels Uwezo employs, centrally, radio and SMS, to reach citizens. Our supplementary analyses of the impact of Uwezo's radio programming and SMS distribution strategy suggest that these mechanisms are, in fact, quite weak (see Appendix I). So we are not at all sure that waiting until after the dissemination campaign had taken place would have altered our findings about the impact of the Uwezo intervention.]
the intervention, the implication for Uwezo would seem to be that it should increase the number of treated households in each treated village.

A fourth and, we believe, most important explanation for the lack of a treatment effect is the absence in our study setting of a set of key conditions that must be present for the provision of information to be reasonably expected to generate citizen activism. While the theoretical model linking information and citizen activism is quite simple, the connection between the two turns out to depend on a series of unspoken, and frequently unacknowledged, assumptions. In the next section, we lay out these assumptions, showing why each is necessary for the provision of information to lead to behavioral change. Although our purpose in outlining these necessary conditions is to help explain the (non-)findings in our evaluation of the Uwezo program, we lay them out in general terms in order to underscore their relevance to any effort to generate citizen activism by providing people with information about local service delivery. In the section that follows, we then draw on our extensive survey data to demonstrate that many of these assumptions do not, in fact, hold in the setting we study.

Assumptions in the information-accountability causal chain

In Figure 3, we lay out a series of conditions that we believe must hold for an informational treatment to have a discernible impact on active citizenship.
Figure 3: The Information-Accountability Causal Chain

Do I understand the information? (no: no impact)
Is it new information? (no: no impact)
Does it suggest that the situation is worse than I had expected? (no: no impact)
Do I care? (no: no impact)
Do I think that it is my responsibility to do something about it? (no: no impact)
Do I have the skills to make a difference? (no: no impact)
Do I have the sense of efficacy to think that my efforts will have an impact? (no: no impact)
Are the kinds of actions I am inspired to take different from what I am already doing? (no: no impact)
Do I believe my own individual action will have an impact? (yes: impact)
Do I expect fellow community members to join me in taking action to effect change? (yes: impact; no: no impact)

First, we make the potentially uncontroversial claim that the people at whom the information is directed must understand the content. This may seem like a trivial consideration, but given that many informational treatments are directed at poor people with little education, the possibility that people simply do not grasp the import of the information they are receiving cannot be ruled out. Second, even if the messages that are provided are understood by their recipients, the information that they convey must be new: if the recipients already know what they are being told, then hearing the information again is unlikely to lead to a change in behavior. Of course, it is possible that repetition may increase the impact of the information (Allport and Lepkin 1945, Schwarz et al. 2007), or that getting the
information from a particularly trustworthy source (such as a foreign researcher or an NGO worker) may add to its weight. But if, for example, citizens are being informed about the poor quality of service delivery in their neighborhood, and they already know that local service delivery is poor, then the impact of receiving this information is likely to be small. Only if the information they receive causes them to learn something new is it likely to have an impact on their behavior.

But even then, it would be wrong to assume that receiving this new information will necessarily lead to action. The next question to ask is whether the information that citizens receive suggests that the situation in question is worse than they had previously thought. If the information conveys the message that local service delivery is actually better than they had previously assumed, then this new information is unlikely to generate activism for improvements. Even if the information is new, not in the sense that it tells people something they did not already know but in the sense that it increases their certainty about what they already suspected, it will only plausibly have an impact if it suggests that things are worse than people had previously thought.

Yet, even if the information is new and suggests that things are worse than people had previously thought, action to effect change is only likely if the recipients care about the issue in question. For example, citizens may have real evidence that local service delivery in a particular sector is worse than they had thought, but if they do not attach value to the quality of service delivery in that sector, or do not attach as much importance to service delivery in that sector as they do to other deprivations that they suffer, then this new information will not be likely to inspire action.
Of course, other types of information might affect service delivery preferences, but, for example, a citizen who never travels abroad is unlikely to care about the quality of the passport administration.

If the information is understood, is new, suggests that things are worse than previously thought, and speaks to an issue that citizens care about, the next question is whether they think it is their responsibility to do something about the problem. In many cases, citizens in poor countries believe that service delivery is the government's job, or that monitoring the government and applying pressure on it to improve the quality of the services it provides is the responsibility of local leaders, NGOs, professional inspectors, journalists, or other individuals, but not themselves. To the extent that this is the case, the recognition that something needs to be done may not generate behavior by citizens to do something about it. Again, this perception may also be affected by other types of information campaigns, but on its own, information about service delivery failures is not likely to motivate citizens to act unless they perceive responsibility for holding government (or the service delivery unit in question: the school, the clinic, the bank, etc.) to account.

Even if all these criteria are satisfied, that is, even if citizens are inspired to act and believe it is their role to do so, the would-be actors must possess the skills to make a difference. They need to know who to contact, what to say, and, more generally, how the system works and where they can most effectively apply pressure for improvements. If they do not have these skills and knowledge, then they may take
actions that have no impact or, anticipating their inability to effect meaningful change, they may not take any actions at all.

A closely related though analytically distinct consideration is whether citizens have the sense of efficacy to think that their efforts will have an impact. Even if they know who to contact, when and where to hold the meeting, and which buttons to push, they may still believe that their appeals will fall on deaf ears, that their pressure will generate no change, or that their efforts will, in the end, do nothing to change the status quo. If this is the case, then skills and knowledge alone will not be enough to lead to citizen action.

Yet another factor to consider is whether citizens may already be engaged in the behaviors that the information they have received is designed to inspire. If citizens are already holding meetings, applying pressure, and taking other actions to improve service delivery, then giving them information about the quality of the services they are receiving may not lead to behavioral change even if all of the other conditions discussed above are met.

Finally, for citizens to act on behalf of change, they must believe either that their own individual actions can make a difference or, if they think that generating real change will require collective action, that others in the community will act with them. This second consideration will depend on the characteristics of the community that individuals inhabit: its norms of reciprocity, the facility with which social sanctioning can occur, or, more broadly, its reservoir of social capital. If the particular actions that people believe can effect change can be accomplished by individual effort, then citizens may act.
But if those actions require the simultaneous participation of everyone in the community (or, at any rate, a large share of it), then even satisfying all of the criteria listed above may not be enough if people do not trust their fellow community members to join their efforts.

In most accounts of the link between information provision and behavioral change, these logical steps are assumed away. However, as the foregoing discussion suggests, the complexity and contingency of this causal chain may hold the key to understanding why informational interventions so frequently fail to generate the behaviors they are hypothesized to produce. In what follows, we draw upon data from our surveys to weigh the evidence for each of these considerations in our study population.

Do these conditions hold in our study population?

In this section we turn to a consideration of the characteristics of our study population to identify the extent to which they were truly susceptible to the influence of the Uwezo informational treatment, that is, the extent to which they seem to have understood the information they received; were ignorant about their children's school performance or thought their children were performing better than the assessment revealed them to be; cared about educational outcomes; and so on. In fact, we find that these conditions did not generally hold. In all cases, we find that only a minority, sometimes a very small minority, of the subject population was reasonably likely to advance down any single step of the causal pathway
towards increased citizen activism. We conclude that this is the most plausible explanation for why the receipt of the Uwezo treatment generated no discernible behavioral change. While it is possible that our subject population was atypical in this regard, we think this is unlikely, especially since it was drawn from two quite different districts. We interpret our findings, instead, as simply suggesting that the conditions under which an informational treatment is likely to have an impact on accountability-enhancing behaviors are more restrictive than many researchers (and project designers) acknowledge.

Do I understand the information?

We lack direct measures of understanding, but in the survey we measured recall of the assessment results and of the Uwezo informational bundle, which provided ideas for action. Many parents could not remember the results of their child's assessment. When we asked parents of assessed children whether their child's assessment scores were higher, lower, or about what they had thought they would be, half did not remember the assessment or could not remember what they thought of their child's assessment score.

Parents were more likely to retain and remember information from the supplementary instruction materials distributed after their children were assessed, however. These materials included a calendar, a story in English, a story in Kiswahili, a poster listing actions parents could take to improve their children's learning, a flyer listing actions that everyone could take to improve education in general, and a form to sign up to receive further information through a Friends of Education initiative. One way in which we evaluated absorption of the instruction materials was to look around the respondent's home to see if any of the materials were visible, for example, posted on the wall or lying on top of a desk.
Households where the materials continued to be visible and easily accessible should find it easier to retain and access the information on a daily basis. As shown in Figure 4, the vast majority of households (84 percent) had at least one of the instruction materials visible in their home. About half of the households had three or more (out of six) instruction materials visible.
Figure 4: Visible Uwezo materials (treated households only)

We also tested parents on whether they could remember the content of the information provided by the parent's poster and citizen's flyer. Both the poster and flyer suggested specific actions that parents and citizens could take to try to improve children's learning and the provision of education. For individuals who wanted to take action but weren't sure what to do, these materials provided ideas and information about concrete actions. For individuals who already had a sense of the possible actions, these materials provided encouragement to translate motivation into action.

The parent's poster contained a checklist of six strategies parents might adopt to improve their children's learning. These were in the form of questions: Do you teach your children new words? Do you narrate stories to your child? Do you make sure your child sees you reading? Do you ask your child to read to you? Do you talk to your child about the writing on different products? Do you insist that your child practice writing?

The citizen's flyer listed nine recommendations about how to get involved in local and national efforts to improve educational outcomes: (1) asking your child to do the suggested exercises; (2) giving ten children these exercises to try; (3) helping ten children to read and write; (4) talking to others about how to improve education; (5) sharing your thoughts in the church, mosque, on the radio, and in other networks; (6) joining Uwezo's Friends of Education group or sending an SMS to Uwezo; (7) making sure the school and government take up their responsibility; (8) closely following school activities and attending school meetings; (9) closely
following to know that your child is benefiting from the Free Primary Education policy.

When surveyed three months after receiving the materials, most parents could remember at least one of the actions suggested by the handouts. Two-thirds of parents were able to recall at least one action suggested by the parent's poster. The most commonly recalled action from the parent's poster was teaching children words, followed by narrating stories and asking the child to read. Parents were less likely to remember the information provided by the citizen's flyer; slightly over half of the parents were able to recall at least one action from the flyer. The mean number of actions that parents could recall from the parent's poster was 2.07 (out of six suggested actions), and the mean number of actions that parents could recall from the citizen's flyer was 1.86 (out of nine suggested actions). (See Appendix V for relevant figures.)

People in households where the poster was hanging on the wall were more likely to remember more of the actions, either because it was easier to access the information when it was visible or because people who hung the poster were also the type of people more likely to remember the information anyway. At the same time, however, having the poster hanging on the wall was no guarantee that people absorbed the information on it. Seventy-eight percent of households where the poster was visible were able to remember at least one action, while only 44 percent of households where the poster was not visible could remember at least one action.

Certain kinds of active parents were more likely to remember the actions on the parent's poster. Parents who reported that they help to organize school activities for children, provide extra lessons outside school, and contribute teaching materials to the school were more likely to remember the actions suggested by the parent's poster.
However, parents who reported that they help their children with their school work, ask children about teachers' presence at school, discuss performance and learning with teachers at school, and help with school maintenance were not significantly more likely to remember the actions suggested by the parent's poster (see table in Appendix VI).

The same was true for the citizen's flyer. Seventy-six percent of households where the flyer was visible were able to remember at least one action, while only 47 percent of households where the flyer was not visible could remember at least one action. People, however, were far less likely to post the citizen's flyer on the wall than they were to post the parent's poster. Only 25 households (out of 118) posted the citizen's flyer, whereas 63 households posted the parent's poster.

Is it new information?

There is evidence that the treatment did not provide many parents with new information. Many parents already knew how their children were performing. We
asked parents of assessed children whether their child's assessment scores were higher, lower, or about what they had thought they would be. As documented in Figure 5, more than 30 percent of parents reported that their child's scores were about the same as they had expected them to be.

Figure 5: Assessment results relative to expectations

Parents seemed generally well informed about the performance of their children's schools. All children completing the eight years of primary school take the Kenya Certificate of Primary Education (KCPE) examination, and all schools are ranked by their students' performance on the examination. More than 80 percent of parents reported that they had heard of the KCPE, and more than 70 percent of parents reported that they knew the KCPE rank quartile of their child's school (however, we did not test the accuracy of their knowledge of the rankings). There was little difference in reported parent knowledge about the KCPE between control and treatment groups. As shown in Figures 6 and 7, overall reported levels of awareness were substantially higher in Kirinyaga than in Rongo.
Figure 6: Awareness of KCPE exam

Figure 7: Awareness of school rank on KCPE exam
Does it suggest that the situation is worse than I expected?

In many cases, the treatment provided parents with the information that their children were already performing adequately. The assessment tests literacy and numeracy for all children 6-16 years old, but only at a Class 2 level, a class typically attended by children 6-7 years old. Not surprisingly, the majority of the assessed children received a passing score on the assessment. Many parents therefore received what sounded like positive information about their child's performance, even in cases where the informational content was quite negligible. That is, while it would have been striking for a 16-year-old to fail the Class 2-level literacy test, a passing grade for that child would be substantively trivial. Nonetheless, by communicating to parents that their child received the highest possible passing grade on a test administered by an objective outsider, the Uwezo treatment may have inadvertently signaled to parents that the schools are doing a good job and that their children are learning well.

In our study villages, 61 percent of children passed the English assessment, 62 percent the Kiswahili assessment, and 62 percent the numeracy assessment. More than half of the children (54 percent) received passing scores on all three tests. As shown in Figures 8 and 9, for tests in English literacy, Kiswahili literacy, and numeracy, more than 80 percent of assessed children above the age of 12 passed the exam with a top score, and more than 90 percent of children in that age group received either the top score or one point below.
Figure 8: Percent of children receiving passing score on Uwezo assessment, by age

Figure 9: Percent of children receiving passing or next-to-passing score on Uwezo assessment, by age
About twenty percent of parents in the survey said that the treatment did provide them with unexpected information about their child's learning, but very few of these parents (only five, in fact) said that the treatment provided them with bad news, that is, that their child's score was lower than they had expected. The remaining 27 parents who received new information through the assessment reported that their child's score was actually higher than they had expected.

Given that many parents already knew how their children were performing and that many children passed the assessment, it is also not surprising that many parents were already quite satisfied with the quality of schools and children's learning. As shown in Figure 10, more than half reported being very satisfied with the quality of English teaching at their child's school, and more than 80 percent reported being very satisfied or somewhat satisfied. Levels of satisfaction with the teaching of Kiswahili and math were similarly high. In all, fewer than ten percent of all respondents said they were somewhat or very dissatisfied with the quality of teaching, and there were no differences across control and treatment groups.

Figure 10: Satisfaction with quality of English teaching

There may be some evidence, however, that parents whose children failed the assessment were more likely to be dissatisfied with the quality of teaching in their child's school. Even if parents find out and absorb new information about deficiencies in education, in order for this information to motivate them to take action to remedy these problems, they still have to care about this information. They have to value education and feel they are responsible for their children's learning.
Do I care?

Many parents do in fact prioritize education, even above other public goods such as health care and drinking water infrastructure. When we asked 261 parents (in a supplementary survey conducted only in Kirinyaga) what they would do if they were given 1,000 shillings to spend on improving the local health clinic, school, or village well, respondents allocated an average of 380 shillings for education, 343 for health, and 272 for water improvements. Most respondents supported all sectors: they tended to split the contributions relatively evenly, with preferences toward education and health. Forty-three percent of respondents allocated the most money to education (see tables in Appendix VII).

Do I think it is my responsibility to do something about it?

In the smaller survey of 261 respondents in Kirinyaga, only 16 percent thought that parents working on their own should take responsibility for improving the schools. In contrast, 49 percent thought that teachers or principals should take responsibility. Similarly, in the larger household survey, only six percent thought that parents were responsible for making sure that teachers come to school and teach the children, while 83 percent thought the school or headmaster was responsible for doing so. These findings were consistently echoed in our more open-ended qualitative research within the villages.

Not surprisingly, the small group of parents who do think they have primary responsibility for monitoring teachers are more likely to take action to improve school quality and children's learning. They are more likely to participate in community groups and associations dealing with education issues, to contribute time and resources to activities that support the school, such as helping to organize school activities and providing materials such as pencils and chalk, and to report that their involvement in their child's education has increased even further over the last three months.
But even when parents do value education and take responsibility for their children's learning, the treatment may still fail to make a difference when parents think they lack the skills or capabilities to take effective action.

Do I have the skills to make a difference? Another part of the problem may be that many people lack the civic skills necessary for organizing activities and taking actions to improve education and pressure the government for better education provision. Most people have never engaged in actions that allow them to develop skills like talking to government officials, writing petitions, or organizing groups. Only fifteen percent, for example, have had experience contacting an official, and only twenty percent have written a letter as part of a community group. About a quarter say they have planned a meeting before, although a larger number (42 percent) say they have given a speech in a community group before. Yet experience engaging in these kinds of skill-building activities is significantly correlated with taking action to improve children's learning.
Do I have the sense of efficacy to think that my efforts will make an impact? The answer to this question is not entirely clear. Many parents think they have some ability to make a difference, but many also say they would not know what specific action to take. Overall, many people thought they have some influence and efficacy to make a difference in their communities. Thirty-three percent thought that people have some influence in making their village a better place, and another 25 percent thought that people have a lot of influence. More than two-thirds said that if people found local officials embezzling government money meant for schools, people would be upset and take action. But when it came to addressing problems with their child's school specifically, many people were not sure what action to take, even though they said they would be willing to take some action. A majority (58 percent) indicated that they themselves would consider taking action to address problems with their child's school. At the same time, most respondents (78 percent) said they wouldn't know what specific actions to take, or wouldn't know how to figure out what actions to take. Those who reported a stronger sense of efficacy and said they would know how to figure out what concrete action to take were, however, significantly more likely to engage in action. (People seemed to have similar problems with taking concrete actions to improve health care and water quality. For example, most people (88 percent) thought that citizens have a role in ensuring that health care improves. Yet when it came to specific actions for monitoring health care quality, virtually no one thought that citizens had primary responsibility for ensuring that drugs are available at the health clinic or making sure that health care workers go to work. See relevant tables in Appendix VIII.)
Another reason parents may decide not to act is that they believe any actions they take are unlikely to have a discernible effect. In addition to internal feelings that they don't have the skills or efficacy to take action, people may also see external obstacles to action presented by the government or by fellow citizens, and make a rational decision not to act. In fact, government trustworthiness and state capacity seem to be a major concern among respondents. Many respondents, for example, feared that complaining about corruption at the village school or clinic would result in punishment or retribution. Forty-two percent thought punishment was very likely and another 24 percent thought punishment was somewhat likely. Expectation of punishment, not surprisingly, was correlated with lack of action. People may also think that investing effort into pressuring the government to improve education is pointless because the government is too corrupt or ineffective to implement reforms that would improve outcomes. Many respondents in the survey estimated the level of corruption in government to be very high. When asked how much money people would actually receive if the government gave out 10,000 shillings in relief payments, 84 percent thought that people would receive half or
less than half of the payment. On average, people thought that only 2,678 out of 10,000 shillings allocated by the government would make it into their hands. In other words, they anticipated that almost three-quarters of the allocation would be siphoned off by corrupt officials.

Do I expect fellow community members to join me in taking action to effect change? Or do I see significant obstacles to collective action? One hypothesis is that parents fail to take action because they recognize that effecting significant change will require others to contribute their time and energy as well. If they don't think others will contribute, then they may decide not to contribute either. There is, however, little evidence that people fail to take action because they are particularly concerned about collective action problems. If people were making these kinds of calculations, we might see stronger treatment effects on private actions to improve learning, like reading to one's own children, and weaker effects on participation in group activities, like going to a meeting. We find only very weak evidence for this, and the findings are not robust. If people were worried about collective action problems, we should also find stronger treatment effects in places where respondents say they live in communities where people contribute to common ends. We do not, however, find stronger treatment effects in places where respondents report higher social capital, saying that their neighbors and friends help them and the village when in need.

Are the kinds of actions I am inspired to take different from what I am already doing? Finally, the treatment may fail to have an effect because so many parents are already doing what the treatment is designed to get them to do. In fact, a large proportion of parents are already active at home, in schools, and in their communities.
Many of the treated do not need the treatment. Seventy-four percent of parents report that they either always or sometimes help their children with their schoolwork; thirty-six percent report that they always do so. Eighty-eight percent of parents report that they engage in one or more of the actions in the battery of questions about school-related activities. On average, people report that they engage in almost three out of nine listed actions. Forty-one percent of parents report that they are members of community groups or associations that deal with education issues. About one-third of parents had contributed money to such a group. On average, people had attended one meeting over the last three months. Forty-three percent report that in the past three months, villagers have approached village officials or other political leaders such as members of parliament to ask
about improving their schools. Thirty-eight percent report that in the past three months, villagers have approached officials to ask about improving village government or public service delivery. In all of these cases, we have no evidence concerning the quality of their engagement, its intensity, or its efficacy. Notwithstanding, given that a large share of the population is being treated for a condition they don't have (i.e., lack of active citizenship), we should not be surprised that there is no difference between treated and control groups.

Rongo and Kirinyaga within the Kenyan Context

Given limited resources for our phase I study, we essentially had to make a careful bet that our study districts were not exceptional in ways that would affect responses to the Uwezo intervention. One way of checking this assumption is to consider some of the national results from the 2008 Afrobarometer survey. While this is not the place for a full analysis of that survey, several findings are strikingly similar to what we found in our study districts, which suggests that the lessons learned here may well apply to the larger Kenyan context. Perhaps most important, Kenyans report being very satisfied with government education. To recall, we found that just over 80 percent of respondents in our sample reported being very or somewhat satisfied with the quality of English teaching. According to the Afrobarometer, 75 percent of Kenyan adults responded that the government was doing fairly well or very well in meeting educational needs, rating government performance extremely high compared to virtually every other problem and sector queried. For instance, only 5% of respondents affirmed government performance in "narrowing gaps between rich and poor"; 24% were positive about how the government was "managing the economy"; 38% were positive about government provision of water and sanitation; and 66% were positive about the improvement of basic health services.
Only in terms of combating HIV/AIDS was there slightly higher approval (76% said government was doing well or very well). In responses to an open-ended question about priorities for government, only 2% mentioned education first, which rendered it the ninth most identified top priority among citizens. Citizens identified education more frequently as a second or third priority, but in total only 17% mentioned education as any one of their three top priorities, while 40% mentioned management of the economy and 33% mentioned unemployment. Our study asked if the respondent had ever attended a village meeting, and 78% responded affirmatively; similarly, 70% of Kenyans told the Afrobarometer they had attended a community meeting. In terms of efficacy, we also obtained almost identical results to the Afrobarometer survey. We asked, "If you complained to a local official, would he pay attention?" and 25.4% of our respondents said "a lot," while 37.2% said "some." The Afrobarometer
asked, "How likely is it that you could get together with others and make the local councilor listen to your concerns about matters of importance?" and 22% of respondents said "very likely" while 37% said "somewhat likely." The main point is that, both in terms of attitudes that might have affected responses to the Uwezo information and in terms of baseline citizenship attitudes and behaviors, our study sites appear to be quite typical of national averages.

Additional findings from studies of information dissemination

In conjunction with the research described above, we conducted studies of the impact of information dissemination campaigns carried out by Uwezo during July and August 2011. These campaigns targeted citizens with SMS messages to mobile phones (phone numbers had been gathered during the assessment) and with radio shows, both highlighting the results from the assessment campaign and making the point that educational attainment was unsatisfactory. We summarize our procedures and findings in Appendix I. Most importantly, we found extremely weak penetration of the information campaigns (virtually none in the case of the radio campaign) and no detectable impact in terms of attitudes or behavior.

Conclusions and Implications

In this report, we have detailed the findings from our preliminary investigation of the first part of the Uwezo intervention in Kenya. The main conclusion we draw from our analysis of the data is that the various informational treatments implemented by Uwezo have had no substantive effect on the target population of Kenyan adults. While there is always the chance that our results suffer from type II errors (i.e., false negatives), given the triangulation of our research, we are skeptical that this is the problem.
We have suggested several alternative explanations for this null result, including an explicit detailing of conditions necessary for an informational treatment to be efficacious, which, it turns out, do not hold for much of the targeted population. Of course, it is possible that other planned aspects of the Uwezo intervention, including mass information dissemination campaigns, will be more effective in promoting active citizenship. Indeed, the broader Uwezo theory of change, which posits that real change will only occur after the creation of a new ecosystem of discussion, debate, and action surrounding children's learning, might lead us to expect to find no impact at this rather early stage in the unfolding of the program. Our research nonetheless has several implications for the design of future aspects of the Uwezo initiative. First, it may not make sense to provide the same, very simple assessment to older children. Because pass rates among these children are extremely high, the
information that the assessments provide to parents may be exactly the opposite of what Uwezo intends: rather than suggesting the need for mobilization for change, it suggests (perhaps erroneously) that the children, and by extension the educational system that is training them, are doing fine. Uwezo should consider adjusting the target assessment population, or adjusting the nature of the exercise for older children. Indeed, the Uwezo initiative may be having an impact in neighboring Tanzania, a country with much lower levels of literacy. Second, Uwezo ought to consider careful targeting of its interventions. While it is not likely that Uwezo will want to deviate from its plan of nationally representative random sampling of households for its assessments, the organization ought to consider who is most likely to be affected by proposed information dissemination campaigns, and what information is likely to facilitate the development of more active citizens. Using the framework we have provided in Figure 3, and as discussed above, Uwezo might target those areas where there is prior information that schools have not shared exam results and/or where educational attainment has been lowest, which would at least provide the conditions under which citizens are most likely to find the information new and distressing. It may be difficult to identify ex ante the presence or absence of the conditions we suggest are necessary for success of the treatment, but this ought to be considered and discussed further. Third, Uwezo ought to more carefully implement and monitor its information dissemination campaigns. In many cases, we found that the message simply did not reach the intended subjects, a problem that must be fixed in future work. Fourth, Uwezo ought to consider particular informational strategies that might elicit stronger responses from citizens.
Creative analysis of existing data could itself constitute new information: this might include the clear presentation of comparative data across geographic regions, and/or presentation of information concerning the relationship between levels of active citizenship and the quality of educational attainment. In other words, Uwezo should consider the social, psychological, and strategic incentives for citizens to act and, in turn, tailor its informational campaigns in ways that play to such incentives. Finally, we urge Uwezo's program managers to look closely at the information-accountability causal chain figure provided above, to reflect on which of the bottlenecks they believe may be most important in limiting the impact of their programming, and then to focus future programming efforts toward reducing those obstacles to citizen activism. Along these lines, we look forward to working with Uwezo and Twaweza on the next stage of our research. We hope that these organizations can share their local knowledge of how their work has been received in villages around Kenya and help us to identify examples of successful change in favor of more active citizenship. Such examples may help us to identify possible conditioning factors that make success more likely on a systematic basis. It is our hope that Uwezo will continue to work closely with our research team to share knowledge of planned interventions such
that we can deploy the proper resources to assess the impact of such actions on Kenyan citizens.
Appendix I: Findings from SMS and Radio Dissemination Study

1) Radio show after the national launch (Rongo only)

We sought to study the effects of the radio show broadcast by Lake Victoria FM in Rongo through surveys and focus groups in the days immediately following the initial broadcasts. We found, however, significant failure in the delivery of information through this radio show. We conducted a short survey in our 12 villages in Rongo following the second radio broadcast on Radio Lake Victoria. We sampled 162 households (an average of 13.5 households per village), including both treated and untreated households. Of that sample, only one person mentioned listening to the Uwezo broadcast at all. This person did say that he spoke to someone outside of his family about the broadcast. With virtually no penetration of the radio program to the grassroots level, we were unable to evaluate diffusion of the information provided by the radio program from listeners to non-listeners. Many people do not listen to Radio Lake Victoria; many others said either that they do not own radios or that their radios were broken or out of batteries. The LPT team recommends that in future information dissemination campaigns more attention be paid to the quality and appropriateness of the media outlet. Information dissemination campaigns may also want to consider maximizing listenership within targeted geographic regions, rather than spreading resources nationwide.

2) SMS messages sent before the national launch (Kirinyaga and Rongo)

In both Kirinyaga and Rongo, we administered a short survey immediately after two SMS messages were sent out: one inviting assessed households to regional launches and one with information about the national-level results of the assessment. We arranged with Uwezo to send SMSs to our treated villages in Kirinyaga and Rongo on specific days in the week leading up to the launch, so we could visit those villages on the following days and observe any responses.
We found significant failure in the actual delivery of the information through SMS. There was also very little diffusion of the information from SMS recipients to non-recipients via SMS forwarding or face-to-face discussion. In the weeks following the dissemination, we found no substantial new citizen action taking place within the study villages. Although everyone in our survey should have received the SMS messages, only about one-fifth of the assessed households in Kirinyaga and about one-third of the assessed households in Rongo reported receiving the messages. We surveyed 127 citizens in Kirinyaga, all of whom were sent SMSs by Uwezo. We conducted our survey approximately one day following the broadcast of those messages. Of these, 26 recalled the launch invitation SMS; 23 recalled the findings
SMS; 25 had the launch SMS and 27 had the findings SMS still on their phone; 11 made the connection between the SMS and the previous Uwezo assessment; 4 said they either forwarded or replied to the SMS; and 19 said the SMS was of interest to them. We surveyed 130 citizens in Rongo, all of whom were sent SMSs by Uwezo. We conducted our survey approximately one day following the broadcast of those messages. Of these, 39 recalled the launch invitation SMS; 38 recalled the findings SMS; 36 had the launch SMS and 35 had the findings SMS still on their phone; 12 made the connection between the SMS and the previous Uwezo assessment; 5 said they either forwarded or replied to the SMS; and 29 said the SMS was of interest to them. Compared to the radio program, the SMS campaign was much more effective in actually reaching citizens, although, even so, its reach was much lower than initially hoped. Moreover, it did not appear that the content of the SMS messages had any substantial impact on thinking or discussion within the villages, and diffusion of the SMS messages from recipients to non-recipients was also very low. Problems we identified with the use of SMS:

- people cannot read
- people do not know how to use SMS functions on their phones and so have difficulty accessing and/or forwarding messages
- people do not read SMS messages from unknown senders, or assumed they were spam and/or a scam
- people confuse Uwezo with the Safaricom Uwezo tariff
Appendix II: Quotes from Qualitative Research

Question to Village Elder: More generally, if someone in this community has a problem, especially with a service that is typically provided by government, is there anyone they can go to to try to fix it? Are most people successful?

Kirinyaga, Control Village: "Yes, you can go to the administration, policymakers, elders. It is not always successful, and sometimes the government takes too long."

Kirinyaga, Control Village: "They take their issues to the MP's office and sometimes they are helped."

Kirinyaga, Treatment Village: "Yes there is, and they usually go to government offices but they never get any help."

Rongo, Control Village: "Issues with services provided by the government are reported at the District Officer and District Commissioner who deals with the issues and provides solution. The DC and DO fix these problems instantly by contacting the government officer in charge or during mobile offices by these officials and government open days."

Rongo, Control Village: "They would complain to the village elder, chief, DO and DC. Most of the issues are always resolved or forwarded to a higher office for resolution."

Rongo, Treatment Village: "They go to the village elder, assistant chief, chief and the relevant GOK offices. Most issues are never resolved because of corruption in GOK offices."

Rongo, Treatment Village: "The community complains mostly to the provincial administration to get community issues aired and solved. Most of the people are not successful because the village elder is most of the time ignored by the office of the chief and the community thinks it is because she is female."

Question to Head Teacher: Is there anything about your school that I should know that we have not discussed? For example, is there anything that makes this school particularly unique compared with other schools in this area?
Kirinyaga, Treatment Village: "The deputy head teacher said that if it was up to the parents, that only 50% of the kids would show up to school. They try to tell the parents about the value of education, but many are disillusioned by the lack of opportunities for those who have been educated and don't see its value. Some need income and send kids to child labor in nearby coffee and tea farms."

Kirinyaga, Control Village: "School performance has been going up because people on school management committee are focused on education; they chart and analyze the results regularly to see where they can improve. They also give rewards to the students to incentivize good performance, with 6000 Ksh dedicated specifically to this currently."
Rongo, Control Village: "Parents don't take action because the school's rural, interior setting means people are not informed."

Rongo, Control Village: "The relationship between the schoolteachers and the committee has not been smooth. The community should raise issues about teachers and complain about their conduct and performance without embarrassing the teachers concerned. The lack of community support for the teachers causes low morale among the teachers, hence the low performance of the school despite the well developed physical structures in the school."

Rongo, Treatment Village: "The school management committee deliberates and agrees to do certain things but they don't pay up, so they have not made any positive change to the school apart from helping in the administration of the school."
Appendix III: Covariate Tables and Figures

Treatment = 0 vs treatment = 2
Appendix IV: Correlations of Actions in Past Three Months

Have you done the following at all over the past three months, with or without a request from someone else?

Correlation between the two items (* p<0.05, ** p<0.01, *** p<0.001):

                                                        Attended mtg.   Spoke outside mtg.
Attended health, education, or water
committee meetings (4.36d)                              1
Spoke to a health, education, or water committee
member about health services outside of a
meeting (4.36e)                                         0.478***        1

Cross-tabulation of the two items:

                                                                        Obs.   % of total
Attended committee mtg.; did not speak to member
outside mtg. (yes to 4.36d; no to 4.36e)                                134       25
Did not attend committee mtg.; spoke to member
outside mtg. (no to 4.36d; yes to 4.36e)                                 17        3
Attended committee mtg. and spoke to member
outside mtg. (yes to 4.36d; yes to 4.36e)                               122       22
Did not attend committee mtg. or speak to member
outside mtg. (no to 4.36d; no to 4.36e)                                 271       50
Have you done the following at all over the past three months, with or without a request from someone else?

Correlation between the two items (* p<0.05, ** p<0.01, *** p<0.001):

                                                        In mtg.   With officials
Raised issues about clinic, school, or water
services in a community meeting (4.39f)                 1
Raised issues about health, water, or education
with local officials outside a community
meeting (4.39g)                                         0.729***  1

Cross-tabulation of the two items:

                                                                        Obs.   % of total
Raised issues in community meeting; did not raise
issue with officials outside mtg. (yes to 4.39f; no to 4.39g)            31        6
Did not raise issues in community meeting; raised
issue with officials outside mtg. (no to 4.39f; yes to 4.39g)            16        3
Raised issues in community meeting and raised issue
with officials outside mtg. (yes to 4.39f; yes to 4.39g)                 83       15
Did not raise issues in community meeting or with
officials outside mtg. (no to 4.39f; no to 4.39g)                       416       76
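Because both items in each of these cross-tabs are binary (yes/no), the reported correlations can be checked directly against the cell counts: for two binary variables, the Pearson correlation equals the phi coefficient of the 2x2 table. The sketch below is a simple consistency check written for this report (the function name and argument order are our own); plugging in the counts reported above reproduces both reported correlations.

```python
import math

def phi_coefficient(yes_yes, yes_no, no_yes, no_no):
    """Phi coefficient of a 2x2 table of binary responses.

    For two binary variables, phi equals the Pearson correlation,
    so it can be recomputed from the four cross-tab cell counts alone.
    """
    a, b, c, d = yes_yes, yes_no, no_yes, no_no
    numerator = a * d - b * c
    denominator = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return numerator / denominator

# First cross-tab: attended committee meeting x spoke to member outside meeting.
# both = 122, attended only = 134, spoke only = 17, neither = 271
r1 = phi_coefficient(122, 134, 17, 271)

# Second cross-tab: raised issues in community meeting x raised with officials.
# both = 83, in meeting only = 31, with officials only = 16, neither = 416
r2 = phi_coefficient(83, 31, 16, 416)

print(round(r1, 3))  # 0.478, matching the reported correlation
print(round(r2, 3))  # 0.729, matching the reported correlation
```

Recovering both coefficients from the counts confirms that the correlations and the cross-tabs describe the same underlying responses.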
Appendix V: Action Recall
Appendix VI: Parental Involvement in Education

Correlations: Parental involvement in education

Recalled Uwezo suggestions:
(1) Teach your child new words and how to pronounce them
(2) Narrate stories to your child and then ask questions after
(3) Make sure your child sees you reading books or newspapers
(4) Ask your child to read to you when she or he is at home
(5) Talk to your child about the writings on different products
(6) Insist that your child practice writing

Correlations among recalled suggestions:

       (1)        (2)        (3)        (4)        (5)        (6)
(1)    1
(2)    0.684***   1
(3)    0.630***   0.658***   1
(4)    0.575***   0.554***   0.658***   1
(5)    0.557***   0.660***   0.715***   0.657***   1
(6)    0.630***   0.607***   0.630***   0.631***   0.684***   1

Correlations between recalled suggestions and parental actions:

                                             (1)        (2)        (3)        (4)        (5)        (6)
Helps kids with math, reading, etc.          0.00342    0.0347     0.00533    0.0486     0.0244     0.0730
Asks child about teacher's presence          0.00767   -0.0322    -0.0128     0.0169     0.0378     0.0890*
Participates on school committee            -0.0906*   -0.0222    -0.0212    -0.0273    -0.0187    -0.0212
Discusses child's performance with teacher   0.0511    -0.00578    0.0659     0.0576     0.0625     0.0659
Attends parent-teacher meeting               0.0205    -0.00264    0.0359     0.0379     0.0178     0.0542
Organizes school activities for children     0.0873*    0.0862     0.0957*    0.133**    0.143**    0.149***
Assists teaching at school                   0.0652     0.0804     0.0362     0.0559     0.0116     0.0701
Provides extra lessons outside school        0.183***   0.150***   0.217***   0.148***   0.212***   0.199***
Provides teaching materials to school        0.195***   0.139**    0.178***   0.173***   0.229***   0.178***
Helps with school maintenance                0.0169     0.0303     0.0386     0.0435     0.0463     0.0386
Provides food/water to school                0.0551     0.0168     0.0446     0.0461     0.0866     0.111*
Discusses learning quality with teacher      0.0504     0.0500     0.0936*    0.0969*    0.0829     0.124**

* p<0.05, ** p<0.01, *** p<0.001
Correlations: parental involvement in education

Actions:
(1) Helps kids with math, reading, etc.
(2) Asks child about teacher's presence
(3) Participates on school committee
(4) Discusses child's performance with teacher
(5) Attends parent-teacher meeting
(6) Organizes school activities for children
(7) Assists teaching at school
(8) Provides extra lessons outside school
(9) Provides teaching materials to school
(10) Helps with school maintenance
(11) Provides food/water to school
(12) Discusses learning quality with teacher

        (1)       (2)       (3)       (4)       (5)       (6)       (7)       (8)       (9)       (10)      (11)      (12)
(1)     1
(2)     0.400***  1
(3)     0.176***  0.176***  1
(4)     0.185***  0.250***  0.0998*   1
(5)     0.163***  0.124**   0.112*    0.467***  1
(6)     0.115**   0.234***  0.129**   0.140**   0.107*    1
(7)     0.0605    0.0498    0.0143    0.109*    0.0754    0.257***  1
(8)     0.156***  0.137**  -0.0491    0.130**   0.0720    0.345***  0.245***  1
(9)     0.0260    0.123**   0.0383    0.123**   0.0655    0.301***  0.0835    0.393***  1
(10)    0.0513    0.182***  0.0440    0.167***  0.159***  0.463***  0.211***  0.354***  0.349***  1
(11)   -0.0223    0.207***  0.0236    0.171***  0.100*    0.246***  0.174***  0.232***  0.404***  0.449***  1
(12)    0.0735    0.148***  0.0979*   0.424***  0.275***  0.159***  0.161***  0.261***  0.250***  0.255***  0.271***  1

* p<0.05, ** p<0.01, *** p<0.001
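The coefficients in these tables are pairwise Pearson correlations, with stars marking two-sided significance levels. As a minimal sketch of the statistic itself (pure Python; the variable names are illustrative, not taken from the study's data files):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient, the statistic reported in the
    appendix tables (stars mark two-sided p-values: * p<0.05,
    ** p<0.01, *** p<0.001)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Centered cross-product and sums of squares
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    syy = sum((b - mean_y) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```

For samples of the size used here (roughly 400-550 respondents), the p<0.05 threshold corresponds to |r| of about 2/sqrt(n), i.e. around 0.09-0.10, which is consistent with where the single stars first appear in the tables.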
Appendix VII: Spending Priorities

How much are you willing to spend on...                                               mean      s.d.      count
Out of 1000 shillings, how much would you spend on improving local health clinics?    343.5423  99.80349  260
Out of 1000 shillings, how much would you spend on improving schools?                 380.2731  103.3212  260
Out of 1000 shillings, how much would you spend on improving wells?                   272.7731  93.20616  260

What percentage of respondents give the most to education?        frequency  percent
Respondents who gave the most to another sector                   150        57.47
Respondents who gave the most to education                        111        42.53
Total                                                             261        100
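The "gave the most to education" classification in the second panel can be computed directly from the three allocation questions. A sketch of one plausible coding (the tie-handling rule is an assumption; the report does not state one):

```python
def gave_most_to_education(health, schools, wells):
    """True if the respondent allocated strictly more of the hypothetical
    1000 shillings to schools than to either other sector.
    Treating ties as "not most to education" is an assumption."""
    return schools > health and schools > wells
```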
Appendix VIII: Health and Water

Descriptive Statistics: Health

question (response coding)                                                              mean    s.d.    count
Biggest problem affecting health care in this village, most common response:
  no medicines (share giving this response; categorical)                                0.3004  0.4588  546
  Second most common response: clinic too far away                                      0.3388  0.4737  546
Would consider taking action about health care problems (0 = no; 1 = yes)               0.5613  0.4968  424
Would know how to go about taking the necessary action to address health care
  problems (0 = no; 1 = yes)                                                            0.1722  0.378   424
Hypothetical: if some people in this village discovered that health workers were
  stealing medicine from the government health facility, which sounds more likely?
  (1 = people would be upset, but would feel they are not powerful enough to fix
  the problem; 2 = people would be upset and would take action to fix it)               1.5664  0.496   542
Most responsible for making sure health workers come to work, most common
  response: the clinic (share giving this response; categorical)                        0.598   0.4908  510
  Second most common response: District Municipal Council                               0.1725  0.3782  510
Most responsible for making sure drugs are available at the clinic, most common
  response: the clinic (share giving this response; categorical)                        0.5273  0.4997  512
  Second most common response: national government                                      0.2148  0.4111  512
Think citizens have a role in ensuring health care improves (0 = no; 1 = yes)           0.8796  0.3258  548
Descriptive Statistics: Water

question (response coding)                                                              mean    s.d.    count
Main problem affecting the water supply, most common response: distance from
  the water source (share giving this response; categorical)                            0.3187  0.4664  546
  Second most common response: cleanliness of the water                                 0.2399  0.4274  546
Have taken any action to address water supply problems (0 = no; 1 = yes)                0.3418  0.4748  474
Would consider taking action about water supply problems (0 = no; 1 = yes)              0.6299  0.4836  335
Would know how to take the necessary action to address water supply
  problems (0 = no; 1 = yes)                                                            0.1691  0.3754  337
Most responsible for making sure the water point functions, most common
  response: the water committee (share giving this response; categorical)               0.4004  0.4904  522
  Second most common response: village government                                       0.1705  0.3764  522
Most responsible for making sure the water point reduces prices if it is
  overcharging, most common response: the individual/company (categorical)              0.3878  0.4877  490
  Second most common response: District Municipal Council                               0.1327  0.3395  490
Think citizens have a role in repairing the water point if it breaks
  down (0 = no; 1 = yes)                                                                0.936   0.245   547
Hypothetical: if people in this area found out that members of the water committee
  were stealing money, which sounds more likely? (1 = people would be upset, but
  would feel they are not powerful enough to fix the problem; 2 = people would be
  upset and would take action to fix it; 3 = people would be upset and would
  approach others and ask them to take action on their behalf)                          1.9506  0.668   547
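For the yes/no items in these tables, the standard deviation column follows mechanically from the mean: a 0/1 variable with sample mean p and n observations has Bessel-corrected sample s.d. sqrt(p(1-p)·n/(n-1)). A quick consistency check against the reported figures:

```python
import math

def binary_sd(p, n):
    """Bessel-corrected sample standard deviation of a 0/1 variable
    with sample mean p and n observations."""
    return math.sqrt(p * (1 - p) * n / (n - 1))

# Example: "would consider taking action about health care problems"
# has mean 0.5613 with n = 424, which implies an s.d. of about 0.4968,
# matching the table.
```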
Appendix IX: Uwezo Assessment Overview

In December 2010 and January 2011, Uwezo recruited district coordinators in each of the 124 districts selected for the 2011 assessment. The district coordinators are generally individuals who work with community-based organizations within their districts; Uwezo hires them on two-month contracts. In January 2011, the district coordinators attended training with Uwezo staff in sessions organized by province. District coordinator training lasts four days (we need to confirm this with Uwezo).

In January and February, the district coordinators recruited 60 volunteers in each district; each of the 30 villages randomly selected per district was visited by a pair of volunteers. Uwezo volunteers:

- come from the villages they survey; because they know the community, it is easier for them to gain acceptance and get people to talk to them (in a few instances a volunteer came from another village, for example when someone cancelled at the last minute and was replaced by an alternate from elsewhere in the district);
- work in teams of one female and one male whenever possible;
- have finished secondary school with a C average or better;
- are 18-25 years old whenever possible, though Uwezo sometimes takes older volunteers so as not to compromise on educational standards.

Recruitment practices varied by coordinator. In Kirinyaga, Daniel spoke with the village elders first, many of whom took application materials and recommended volunteers to him. In Rongo, George hung two posters in each village, generally near the church or school. The district coordinators then interviewed the applicants to pick the final volunteers.

At this time, the district coordinators also made household lists for each village and drew village maps, generally calling on the village elder or another resident for help. From each list, the district coordinator selected 20 households to be assessed (out of N households, every N/20th household was selected).
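The every-N/20th rule is a standard systematic sample from the ordered household list. A sketch of one way to implement it (the starting point and the rounding rule are assumptions; the report specifies neither):

```python
def systematic_sample(households, k=20):
    """Select k households at evenly spaced positions in an ordered
    village list of N households (every N/k-th household).
    Starting from the first household and rounding positions down
    are assumptions, not details given in the report."""
    n = len(households)
    step = n / k  # spacing between selected households
    return [households[int(i * step)] for i in range(k)]
```

In a village of 100 households, for example, this selects the households at positions 1, 6, 11, ... on the list.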
The district coordinator also identified five replacement households as alternates. In February and March, the volunteers first attended a two-day training run by the district coordinators and then conducted the assessment. The assessments were staggered across districts over a period of about eight weeks, but in every district training started on a Tuesday, volunteers departed for the villages on Thursday, and the assessments began on Thursday night or Friday. On the first day of training, volunteers worked in hotels or conference centers, generally in the district capitals. On Wednesday, volunteers practiced conducting the household survey in practice villages in groups of five to eight; in the afternoon they reconvened to debrief and to discuss strategies for overcoming challenges. On Thursday morning, volunteers received all the assessment materials and departed in pairs for the villages. Depending on travel time, some volunteers chose to begin the household assessments on Thursday.

Volunteers were given the household list and village map created by the district coordinator. If a household refused to participate or could not be located during any of the days of the assessment, the volunteers needed permission from the district coordinator before using a replacement household. Volunteers were instructed to visit households only when children were likely to be at home: in the evenings and on Saturday. Volunteers visited the chief or village elder to fill out the village information sheet, and some relied on the village elder to locate the households on their list. Although the village elder was supposed to be the volunteers' first stop, many pairs conducted other parts of the survey before visiting him. Where the village elder was unavailable, volunteers filled out the village information sheet themselves to the best of their abilities.

On Friday morning, volunteers conducted the school survey at the public primary school selected by the district coordinators, who had asked the village elder and other members of the community to identify the public primary school that most children in the village attended. At the school, volunteers asked to speak to the head teacher (or, if the head teacher was unavailable, the deputy head teacher or a senior teacher), explained the Uwezo assessment, and presented letters of authorization from the district commissioner and the district education officer. The volunteers then filled out the school data sheet: the first portion was completed with the head teacher, after which the volunteers toured the school grounds to record data on student attendance and school facilities.
Finally, the volunteers asked the head teacher the questions on the Head Teacher Ask poster and then left the poster with the head teacher. In schools where the head teacher was unavailable, the volunteers left the Head Teacher Ask with whomever they had spoken to, but did not fill out the questions on the poster.

After school hours, the volunteers visited the 20 households in the village. Households were assessed regardless of whether they had children. On arriving at a household on the list, volunteers introduced themselves, explained Uwezo's mission, and described the assessment procedure; they could begin recording information only after receiving the respondent's permission. Volunteers were also instructed on how to conduct themselves: not to enter the house uninvited (conducting the survey outside if necessary); not to sit on a chair if everyone else sat on the ground; to be sensitive with older children who may not think of themselves as "children" and with children who may be shy because they cannot read; and to let the female volunteer assess girls who seemed particularly hesitant. Volunteers generally split the work: one recorded the household data while the other tested the children. Volunteers were instructed to try to engage other household members if they appeared to be interfering with the testing in any
way. All children ages 6-16 were supposed to be assessed. If a child was not at home at the time, volunteers were instructed to return later or to ask family members to call the child and have him or her return home. During the assessment, volunteers were instructed to be patient and encouraging, to help the children relax. English reading skills were assessed first, followed by Kiswahili reading skills; the volunteer started each child with the paragraph reading. [The Uwezo guideline for classifying a child's reading level, reproduced from the Uwezo Kenya 2011 Volunteers Manual, appeared here.] After the two literacy tests, the volunteers assessed numeracy. [The Uwezo guideline for classifying numeracy level, reproduced from the Uwezo Kenya 2011 Volunteers Manual, appeared here.]
Following the numeracy test, volunteers asked the two bonus questions (about identifying body parts) and performed the visual acuity test on the child's right eye, then the left. After the assessment, volunteers provided instant feedback to the parents or other adults present. [The guidelines on how to describe the scores, reproduced from the Uwezo Kenya 2011 Volunteers Manual, appeared here.]
After all children were tested, the volunteers and the respondent completed the Parent Ask poster. The volunteers left the poster, a calendar, the English story, the Kiswahili story, and a Rafiki wa Elimu application form with each household. (The volunteer manual does not provide instructions for explaining the Rafiki wa Elimu form.) Nearly all volunteers were able to finish visiting their 20 households by the end of the day on Saturday. Early the next week, the volunteers returned to the training site to hand in their materials to the district coordinator, who checked every booklet to make sure all answers had been filled in before the volunteers left for home. The district coordinator ensured that all booklets were transported to Nairobi (we need clarification on how this happened). A data entry team entered the survey booklets into the data entry system at the Uwezo office in Nairobi. In the weeks after the assessment, Uwezo staff conducted random spot checks to confirm the quality of the data collection (we need to confirm how many villages and districts they visited). The results of the assessment were first released to the public in July 2011, with coordinated launch events in Nairobi and each of Kenya's provinces.