TOWARD GOOD PRACTICE IN PUBLIC ENGAGEMENT
A PARTICIPATORY EVALUATION GUIDE FOR CSOS
Prepared by: Michael Stephens, Program Officer, Public Engagement and Capacity Building, CCIC

The Canadian Council for International Co-operation (CCIC) is a coalition of Canadian voluntary sector organizations working globally to achieve sustainable human development. CCIC seeks to end global poverty and to promote social justice and human dignity for all. CCIC gratefully acknowledges the financial and in-kind support provided by collaborating institutions for this publication, as well as ongoing financial support from the Canadian International Development Agency (CIDA) and the International Development Research Centre (IDRC). The views expressed in this document are those of the authors and do not necessarily reflect the views of funders.

CCIC would like to thank the following member organizations, their Evaluation Committees and the primary contacts for each organization. This journey would not have been possible without them. Atlantic Council for International Cooperation (ACIC): Eliza Knockwood, Jennifer Sloot, Jessica Dubelaar, and Louise Hanavan; Canadian Nurses Association (CNA): Colleen MacDonald and Tanya Salewski; Jamaican Self-Help (JSH): Marisa Kaczmarczyk and Julia Anderson; Manitoba Council for International Cooperation (MCIC): Zack Gross and Mike Tutthill. We would also like to thank the Advisors to the PEP pilot initiative who served as navigators to the project: Janice Hamilton (MCIC), Richard Touchette (Oxfam-Québec/Club 2/3), Graham Pike (University of P.E.I.), Cass Elliot (4Unity Productions), and a very special thank you to Lynette Shultz (University of Alberta), who was there from the very inception of this initiative. Appreciation also goes to Ann Simpson, Anne Buchanan, Jacques Bélanger and the CSOs and advisors mentioned above for their careful eye in revising this document. Thanks also go to Jean Loubert for the design and layout of the document.

All or part of this document may be reproduced and used for nonprofit purposes, provided that CCIC is credited as the source. 
Use of any part of this document for commercial purposes is forbidden without prior written permission from CCIC. For additional information, contact: Canadian Council for International Co-operation (CCIC), 1 Nicholas Street, Suite 300, Ottawa, Ontario K1N 7B7. Tel.: Fax: Web Site: ISBN: Toward Good Practice in Public Engagement: A participatory evaluation guide for CSOs is also available in French as Vers de bonnes pratiques en matière d'engagement du public : guide d'évaluation participative pour les OSC. All rights reserved. Canadian Council for International Co-operation 2009

TABLE OF CONTENTS

History of this Initiative
Section I - Overview of the Initiative and Defining Our Terms
Why Evaluation?
Make the Road by Walking: An Overview of Our Partnership Process
Good Practice in PE
Participatory Evaluation and Participatory Action-Research (PAR)
Section II - Conducting Your Evaluation
Testing for Organizational Readiness
Creating Enabling Conditions
Defining your Evaluation Philosophy
Drafting the Purpose of the Evaluation
Drafting your Theory of Change
Designing your Evaluation Framework and Methods
Gathering, Compiling and Analyzing your Results
Double-loop Learning: Changing your Practice and/or Theory of Change
Documenting and Disseminating your Results and Lessons Learned
Section III - Proving and Improving your PE
The Results of the Pilots' Evaluations
Identifying Good Practices in PE
Conclusions: From Theory to Action
Appendices
References

HISTORY OF THIS INITIATIVE

This publication is the culmination of the Public Engagement Practice (PEP) Project, which the Canadian Council for International Co-operation (CCIC) initiated in September. The PEP project was developed in response to the need expressed by CCIC members and funders to: a) help define good practice in public engagement for global citizenship, and b) identify evaluation methodologies that can help civil society organizations (CSOs) document their effectiveness and tell their stories in a meaningful way. The PEP project was implemented in three phases. In Phase I (November 2006), CCIC hosted two-day intensive, participatory evaluation workshops, PEP Talks, in Winnipeg and Ottawa. In Phase II (February 2007 to February 2008), CCIC launched an action-research accompaniment project open to all organizations that had attended the workshops in Phase I. Four member organizations, the PEP Pilots, were selected to work in partnership with CCIC to evaluate the success of their various public engagement (PE) initiatives. These four CSOs are introduced throughout the section Overview of the Initiative and Defining our Terms. It is hoped that this guidebook will: help to demystify evaluation and provide tools to build capacity in participatory evaluation and action-research, strengthen organizational understanding about good practice in public engagement from a CSO perspective, and stimulate or reinforce a culture of reflection in CSOs regarding their social change philosophies and models for change. The current publication attempts to profile the process, tools, and learnings from the PEP Project. Building on this initiative, from September 2008 to March 2009, CCIC conducted an Evaluation Learning Circle involving 12 public engagement (PE) practitioners from its membership. Learnings from the Evaluation Learning Circle reinforce the lessons learned in the PEP Project. 
One of CCIC's primary objectives is To enhance the capacities of CCIC and its members in their work to engage Canadians as global citizens. This guidebook describes one approach to building organizational capacity in action-research and participatory evaluation and grounds it in real-life examples.

SECTION I - OVERVIEW OF THE INITIATIVE AND DEFINING OUR TERMS

Why Evaluation?

Engaging the Canadian public as global citizens is a cornerstone of the programming of many Canadian civil society organizations (CSOs), as they seek to change the conditions that perpetuate global poverty, injustice and environmental destruction. Public Engagement (PE) practitioners, volunteers and staff alike, coordinate a range of interesting and creative platforms and activities to help Canadians become and stay involved in global issues and in shaping the world around them. CSOs provide citizens with opportunities to engage with global issues through volunteering, participating in overseas placements, raising awareness in schools, engaging in policy dialogue, promoting ethical consumption, donating, and many other ways. Recognizing the critical role they play, agencies like the Canadian International Development Agency (CIDA) and certain private foundations provide financial support to CSOs for a range of public engagement programming. Consultation with PE practitioners in the CCIC membership has confirmed that CSO effectiveness and evaluation are issues of growing interest to the sector. It is clear, however, that PE practitioners and the sector as a whole are disenchanted with Results-Based Management (RBM) as an approach to reporting and evaluation. Many feel that RBM skews accountability toward quantitative outcomes and away from less tangible, qualitative ones (which they often see as most important), and toward donors rather than beneficiaries. 1 Many CSOs are looking for alternative evaluation methodologies that include a focus on reflection and learning and at the same time help CSOs appropriate the evaluation process. Participatory evaluation using an action-research approach is profiled in this guidebook as one such approach in the sector's quest toward good practice in public engagement. 
Make the Road by Walking: An Overview of Our Partnership Process

CCIC believes that a strong international civil society sector in Canada is critical to promoting peace, defending human rights and ending global poverty. To be strong, CSOs must act with integrity and continuously improve their development practices, organizational practices, and vitality. But what does it mean for PE practitioners to act with integrity? Learning how to do the mechanics of public engagement is certainly part of it. But encouraging integrity and effectiveness also involves going beyond the skills and the numbers to look at questions of good practice in public engagement, questions like: What is the role that public engagement plays in helping to achieve your organization's international cooperation goals?, and How do you know your public engagement practices are making a difference?

This guidebook tells a story of a capacity-building partnership and a journey into uncharted territory that required a leap of faith for all partners. The PEP project was a first in many ways for CCIC and for the groups that participated. Rarely has CCIC coordinated accompaniment action-research initiatives of this nature. Four CSOs spanning three time zones, five advisors and one coordinator, with no set curriculum or road map, came together over the course of one year through telephone calls, e-mails, in-person meetings and live Web conferencing, definitely a first for all! Embarking on this initiative required that we make the road by walking, to borrow a phrase from Myles Horton and Paulo Freire. 2 This is in keeping with the CCIC capacity-building principle of learning by doing.

Letters of Intent were used to select the four pilot organizations. Although the process of accompaniment was slightly different for each pilot, there were certain commonalities in the process to keep the cohort moving along in tandem. An initial telephone conversation took place to gain clarity on the needs of each initiative and establish expectations for the partnership. Each pilot then began forming a multi-stakeholder Evaluation Reference Group or Committee, and the four pilots exchanged their Letters of Intent by e-mail as a way of introducing themselves. A Terms of Reference document and a partnership work plan were drafted and sent to the pilots to modify before a Letter of Agreement was signed by all partners. Each pilot was appointed an advisor from the PEP Project Advisory Committee that helped to guide the initiative. An initial in-person, Web conference, or conference call workshop with each pilot's Evaluation Committee and advisor took place to introduce action-research as an evaluation methodology and to help each group define its evaluation philosophy and the purpose of its evaluation. Following this initial working session, each pilot began working to define its Theory of Change and develop a work plan for its evaluation. Although this overview gives the impression of a linear path, the process for each pilot and for the partnership as a whole was similar to any uncharted journey, complete with hills, valleys, wrong turns and discoveries! It was hoped that participatory evaluation would serve as a mechanism to help the PE practitioners involved in this study, and in the CCIC membership and beyond, gain a better understanding of what constitutes good practice in PE for their organizations. Often, evaluation is heavily focused on accountability, or proving that it works, and does not give enough attention to learning, or working to improve. The focus of this initiative was to provide a space for curiosity, reflection, and learning rather than efficiency. 
All four pilots were brought together by Web conference in June and September 2007, and in person in October 2007, to share their experiences and documents and ask questions of the advisors. Throughout this time, the work of each committee continued and was supported by regular communication and coaching from the CCIC coordinator, including in-person site visits when possible. After each pilot had conducted its evaluation, the data were compiled and analyzed at a final analysis session in January. All pilots then set out to complete a draft report to profile their processes, learnings and results. These were sent to all pilots and advisors prior to a final live Web-conference session in February. At this session, the pilots and advisors came together to share knowledge, contrast learnings and evaluate the partnership. Exit interviews were conducted in March 2008, and by June 2008 all final reports had been received.

The Four Pilots and their Public Engagement (PE) Activities

The four CSOs in text boxes throughout this section are members of CCIC and were partners in the PEP Project action-research participatory evaluation initiative. The organizations and the public engagement initiatives they evaluated are highlighted here in brief. Attempts will be made to profile them in greater detail by integrating their experiences into the sections that follow.

Good Practice in PE

Our concern [is that] we will do a lot but possibly not accomplish a great deal that is lasting. (Letter of Intent)

In the current context, there is a strong drive toward action, results and efficiency. Although most organizational players will agree that it is important to reflect on practice, there is a tendency in organizations to do a lot but reflect little. Most organizations struggle against a dominant mindset that does not see reflection as being essential to the competence, sustainability and transformational nature of organizations. Taking time to reflect on our practice or carry out an internal evaluation is a bold act of leadership. It is also important if we wish to act with integrity and be effective and responsive in reaching our social change goals. When we reflect on our practice, we attempt to make explicit the implicit and tacit assumptions that drive our beliefs and actions: What are the root causes of injustice? How do we believe social change happens? What are the best PE strategies to encourage change? Are we being successful and is it making any difference? What is it about our practice that leads to this success or lack of success? How can our practices be more effective or have more integrity? Attempting to answer questions such as these moves us toward good practice in public engagement. In this way, good practice can be identified from what we know through experience, examination, research, and judgement to be most beneficial, ethical, or successful in engaging Canadians as global citizens. In Section III, we attempt to describe in concrete terms the elements that make up good practice according to the findings of the pilots. Public engagement practices can be said to have three dimensions (width, depth and height) and can therefore be broken down into three basic types: 1) those designed to reach a wide number of people, often over a short duration, e.g. a panel discussion at a university, a T.V. 
documentary, a rally, an on-line petition, etc.; 2) those designed to engage a smaller number of participants more deeply, often over a longer period of time, e.g. an extended placement overseas, thematic committee work, a professional secondment, a teacher training program, etc.; and 3) initiatives designed to do your work better, have greater impact, reach new audiences, or add more meaning to your efforts, e.g. building a coalition, targeting ethnically diverse audiences, working with the media, conducting an evaluation, etc. Good practice in one of these dimensions is not necessarily good practice in another, since the desired outcomes are divergent. Although it is tempting to want to reach large numbers of citizens, the choice to focus on one of these dimensions at any given time should be linked to your theory of change (see Section II) and your desired outcomes. A working definition of public engagement and global citizenship can be found in Appendix 1. We have opted to use the term good practice in place of best practice because there is no one set of practices that leads to excellence or success in all circumstances. Good public engagement activities are context-specific, and often cannot be reproduced with similar results in different geographic and socio-political contexts. We do, however, attempt in Section III to identify those practices and principles that appear to be common to the successful public engagement activities involved in this study. For the purposes of this study, success is defined by the practitioners themselves and is measured against the 

ATLANTIC COUNCIL FOR INTERNATIONAL COOPERATION (ACIC) AND THE FIRST VOICES PROJECT

ACIC is a coalition of individuals, organizations, and institutions working in the Atlantic region, which are committed to achieving global sustainability in a peaceful and healthy environment, with social justice, human dignity, and participation for all. ACIC supports its members in international cooperation and education through collective leadership, networking, information, training and coordination, and represents their interests when dealing with government and others. ACIC also takes a leadership role in engaging Atlantic Canadians around issues relating to international development, global sustainability, and social justice. Through its public engagement work, ACIC strives to give Atlantic Canadians the knowledge, skills, and tools necessary to become active global citizens.

The First Voices Project (Phase I) brought together First Nation youth from Atlantic Canada and indigenous youth from Guatemala and Chile to make a collaborative video documentary based on stories of hope from their communities. (To see the project description and watch the video, go to the ACIC website.) The Canadian youth congregated in Halifax for a workshop in video production. Similar workshops were held in Guatemala and Chile. Once equipped with digital video cameras, technical skills and creative vision, the youth travelled back to their respective communities to capture, on video, the stories, voices and images that were important to them. In January 2007, the six chosen youth met in Halifax to view the footage each had gathered, and worked collaboratively to weave together a documentary. The participants in the program later travelled to Guatemala to screen the documentary and visit various communities throughout the country. The aim of the project was to unite indigenous youth from the North and South so that they could share with each other their stories and experiences. 
The aim was also to create an opportunity for the participants to learn about video production, increase pride in their communities, and empower and inspire participants to take action. It was hoped that the project would help to create and strengthen awareness of issues faced by aboriginal youth and increase critical thinking regarding stereotypes in the media. ACIC wished to conduct a participatory evaluation to: i) develop its knowledge of evaluation methods and tools; ii) document the effects (direct and indirect) of the project on stakeholders; iii) demonstrate the qualitative impact of its work in order to better tell its story; and iv) improve its practice.

indicators or standards of success that they have set for their own initiatives. Finally, we have chosen to use the phrase Toward good practice in the title of this guidebook because good practice is an iterative process and not a final end state. Practice needs to be responsive to a constantly changing context that includes changing generational norms, technologies, etc. Evaluation through action-research can be used as a mechanism to help move us toward good practice.

Participatory Evaluation and Participatory Action-Research (PAR)

[The funder] does not encourage participatory evaluation in its RBM framework, thus when time and resources become an issue, participatory evaluation usually becomes a secondary priority. (Letter of Intent)

There is a growing need expressed by CSOs to evaluate the success of their programming, and a growing interest in alternative evaluation methods to do this. 3 There is also considerable interest from governments and multilateral bodies in issues of aid and development effectiveness and in CSO accountability for results. In particular, CSOs and funders struggle to find effective frameworks for evaluating efforts to engage citizens on global issues. Alternative evaluation methodologies may serve to help bridge these interests and strengthen CSOs in the process.

Traditional evaluations linked to a Results-Based Management (RBM) framework tend to concentrate on measuring tangible results. They are often positivist in their leaning, focusing on fixed, controlled designs, quantitative data, objective analysis, and linear attribution (cause and effect) between activities and desired outcomes. The desired attitudinal or behavioural shifts inherent in public engagement work are extremely difficult to capture using these types of methodologies. CSOs readily accept the need for accountability for the funds they receive and their results. The type of results, however, is at issue. 4 Although many attest to RBM's value in the planning process, they point out that its focus on inputs leading linearly to results does not permit the richness of an organization's story to be told, and leaves little room for organizations to report unintended results, which are often those of greatest impact. 5 Some CSOs argue that the power dynamics of traditional evaluation and reporting foster a culture of fear that can encourage distortions of information and exaggerated claims. 6 Often in this funder-driven model, the funder, rather than the organization, is the user of the evaluation results. Although evaluator recommendations may be incorporated into an organization's programming, they may not necessarily complement the organization's theory of change or strategies.

It can be argued that there will always be a role for third-party evaluations. It appears, however, that traditional evaluations as they are usually conceived will not encourage organizations to appropriate their own learning or heighten their understanding about good practices. Evaluation needs to be a mechanism to prove or confirm to ourselves, our stakeholders and our funders that we are on the right track and realizing our mandate, and if we are not, a chance to learn and improve our practice. Participatory evaluation shows some promise in this regard. Contrary to traditional third-party evaluations, participatory evaluations tend to focus on emergent designs, qualitative data, analysis by internal stakeholders, and the contribution of activities to outcomes rather than attribution. The primary users of a participatory evaluation are the organization and its participating stakeholders. Done well, participatory evaluations can enhance the knowledge base and the technical, analytical and adaptive capacities of organizations to engage Canadians as global citizens. 
Participatory Action-Research (PAR)

Participatory evaluation is a form of participatory action-research. PAR is a learning approach aimed at improving practice through an iterative process of experience, reflection, learning and action planning. Practitioners are the central researchers who learn by doing and work in collaboration to: identify a problem or question (in our case, are we being successful?), develop theories of how change happens and strategies for alternative courses of action, plan and implement change experiments to resolve the issue 

CANADIAN NURSES ASSOCIATION (CNA): GLOBALIZATION AND ITS IMPACT ON NURSES AND HEALTH SYSTEMS

CNA is the national professional voice of registered nurses, supporting them in their practice and advocating for healthy public policy and a quality, publicly funded, not-for-profit health system. The Canadian Nurses Association is a federation of 11 provincial and territorial nursing associations and colleges representing more than 136,200 registered nurses and nurse practitioners. CNA is committed to advancing international health policy and policy development in Canada and abroad to support global health and equity.

Public engagement forms an important part of CNA's Strengthening Nurses, Nursing Networks and Associations program (SNNNAP). One of the goals of this program is to increase awareness of global health issues and support active global citizenship amongst Canadian nurses. CNA believes that nurses have the right and the responsibility to be active global citizens, to raise awareness of the root causes of inequity in global health and to participate in finding solutions. The public engagement activity that CNA chose to evaluate was the workshop entitled Globalization and its impact on nurses and health systems, which took place in three provinces: Newfoundland and Labrador, British Columbia and Saskatchewan. The workshop examined how trends in the national and global economy change health care in Canada and abroad, challenge the nursing profession, affect nurses at work, and have an impact on nurses in their everyday lives. In this workshop, the linkages between globalization, poverty, health, and health human resource issues were explored. A description of the workshop can be found on the CNA website. The workshop methodology was an adaptation of the popular education workshop entitled The Wall Workshop. 
The Wall Workshop is a tool to carry out a gender analysis of the global economy, starting with women's experiences. It was hoped that a participatory evaluation would help determine if the CNA workshops were an appropriate mechanism to educate and engage nurses in Canada and contribute to the ever-growing dialogue on these issues.

(in our case, gather data to measure success), monitor and evaluate the success of these change efforts, and identify key learnings. If not satisfied with the results, the collaborating practitioners use their learnings to re-think their strategies and try again, repeating the cycle of learning and experimentation. The process provides a safe and supportive environment for examining how organizations currently do their work and for experimenting with innovative practice (see Appendix 2).

Participatory Evaluation is a generalized term to describe an alternative evaluation methodology that includes the principles and methods of action-research. The word alternative refers more to the fact that such evaluations are participatory in nature; many of the data-gathering techniques used, as in some of the case studies described here, include conventional methods such as surveys and interviews. Sometimes, however, participatory evaluations include slightly unconventional methods such as the video interviews used by ACIC for The First Voices Project. Specific types of participatory evaluation methodologies that are gaining currency in the development community include Outcome Mapping, SAS2 and the Most Significant Change technique, among others 7 ; it is outside the scope of this document to describe these. Regardless of which alternative methodology is used, no two evaluations are alike, since participatory evaluation is both an art and a science.

Benefits of PAR and Participatory Evaluation

So why bother conducting an internal evaluation using a PAR methodology? There are many advantages to using a PAR approach to evaluation, including: increases utility of what is studied; helps to make explicit your assumptions about change; encourages whole-brain thinking; provides a space for individuals to negotiate different interpretations; encourages a space in which to speak honestly about weaknesses; encourages learning and knowledge acquisition; builds analytical and adaptive capacity; builds capacity to evaluate; helps move you toward good practice; helps provide internal accountability to an organization's members; encourages group cohesion; can be used for formative or summative evaluations; permits changes to programming mid-stream; and encourages ownership and integration of results or recommendations.

PAR highlights the perspectives of those stakeholders who are managing, implementing or involved directly in the program. Ideally, participatory evaluation helps organizations answer their own questions, helps them test the merits of their efforts, and helps to test their assumptions about how best to effect change. If CSOs advocate for participatory approaches in our development work with Southern partners, why are we not applying them in our own organizations with respect to evaluation? Participatory evaluations are not without their critics, however. The most common critique levelled at participatory processes is that they lack objectivity. The assumption that traditional evaluations are entirely objective is somewhat misguided, given that every evaluator brings his or her own biases to the evaluation process. There is also a chance that evaluators who do not know the intricacies of an organization may unintentionally misinterpret findings. 

JAMAICAN SELF-HELP: YOUTH ENGAGEMENT STRATEGY

Established in 1980, Jamaican Self-Help (JSH) is a Peterborough-based registered charitable organization that supports education and community development projects in Jamaica as well as global education activities in the Peterborough region, Ontario. Public engagement has been an integral part of the JSH mandate since its inception, and active volunteer participation is an important component. Volunteers are involved in all parts of the organization. In terms of public education, JSH organizes workshops, speakers, presentations or other educational events, often in co-sponsorship with other community organizations.

For the CCIC pilot project, JSH focused on the key activities of its youth engagement work, which are: annual two-week youth awareness trips to Jamaica, and the Make Poverty History (MPH) Youth Action Committee. The two activities are explicitly intended to engage youth in the Peterborough region. The MPH Committee was first established in 2005 as a means to offer high school youth returning from Jamaica a place to continue their engagement in global issues. The MPH Committee quickly expanded to include high school youth who may not have direct overseas experience but are nonetheless interested in being actively involved in global social justice. The MPH Committee is innovative as an approach to public engagement, as it offers youth a forum that is youth-driven and community-based (rather than school-based) and supports youth to become directly involved and take leadership in both policy and practice in international cooperation. 
JSH was interested in participating in the CCIC initiative for the following reasons: to improve its public engagement work with youth, to build the capacity of JSH staff and volunteers to engage in systematic participatory evaluation approaches, to offer CCIC the experience and perspectives of a small community-based non-governmental organization (NGO), and to offer the NGO community a critical look at a model of youth engagement.

Participatory evaluations, on the other hand, have the benefit of those who understand the organization offering a collective interpretation and analysis of findings. Making false claims about their own organization's success would be unproductive, given that the evaluation is being conducted for their own learning purposes.

Barriers to evaluation in general

So what prevents CSOs from doing their own evaluations? PE practitioners have identified five challenges to implementing internal evaluations 8, including: 1. time: excessive workloads and the amount of time required to conduct an evaluation; 2. attitude: lack of interest, evaluation not valued or seen as a luxury, and internal resistance to examining practice; 3. lack of skills/tools: evaluation can be intimidating, lack of appropriate methodologies, etc.; 4. cost; and 5. focus and expectations of funders for quantitative, objective, third-party evaluations.

This initiative attempted to deal squarely with some of these roadblocks by structuring time for the process, creating a supportive committee for the champions, providing basic tools and supporting skill development, and not requiring a financial investment other than human resources. The following sections describe the process and tools that participants in the CCIC initiative used to build their participatory evaluations. It is hoped that these can serve as templates for other CSOs to further develop.

In this document, we describe a process of participatory evaluation that would often span several months to a year and is somewhat extensive. The time span and the level of participation can, however, be modified to suit an organization's reality and the nature of what is being evaluated. In evaluation, bigger is not necessarily better. In fact, a quicker and lighter participatory process may be exactly what the organization needs. It might also be what the organization is able to conduct at a given point in time. Expectations of the benefits to be obtained may need to be adjusted accordingly, however.

MANITOBA COUNCIL FOR INTERNATIONAL COOPERATION (MCIC): FAIR TRADE MANITOBA'S ONE-MONTH CHALLENGE

MCIC is a coalition of organizations involved in international development, which are committed to: respect, empowerment and self-determination for all peoples; development that protects the world's environment; and global understanding, cooperation and social justice. MCIC's mission as a coordinating structure is to promote public awareness of international issues, to foster member interaction, and to administer funds for international development.

In 2006, MCIC became involved in the efforts of individuals and groups around the province to heighten Manitobans' awareness of fair trade through Fair Trade Manitoba (FTM), now a program of MCIC. Fair Trade Manitoba's mission is to promote knowledge of, and support for, fair trade issues and products throughout the province. For Manitobans, purchasing fair trade products provides an opportunity to be active global citizens, whereby actions in Manitoba can impact positively on the lives of farmers and artisans in the global South.

For the action-research initiative, MCIC chose to focus on FTM's annual campaign, the One-Month Challenge (OMC). The OMC provides citizens with the opportunity to show support for producers in the developing world by taking a pledge to go fair trade for 30 days, choosing fair trade coffee, tea and chocolate instead of conventional brands. It also provides a chance to learn about and discuss fair trade issues. From February 17 to March 19, 2007, approximately 300 Manitobans participated, as individuals or families. (Note: As of 2009, this number had grown to over 5,000.) Many church groups, schools, offices and unions acted as partners to generate interest in this campaign in their own organizations.
Participants were asked to complete six short on-line surveys as part of their participation: a welcome survey at the time of signing up; four weekly surveys (one each week of the campaign); and a closing survey to comment on their overall experience. In its action-research project, MCIC wished to learn to better document the development of this project, to better understand various strategies, and to better judge whether it was moving toward its goals. MCIC wanted to learn to monitor and evaluate not just individual meetings, events or programs undertaken by MCIC/FTM, but also the direction of this new movement or campaign in Winnipeg and, to some extent, in the rest of Manitoba.

SECTION II - CONDUCTING YOUR EVALUATION

The participatory evaluation process that the four CSOs conducted is summarized in a one-page tool in Appendix 3. Again, it is important to emphasize that participatory evaluations are rarely as linear as these steps appear to be, and each one is unique. Below, we elaborate on various elements of the process.

Testing for Organizational Readiness

Timing
When a car is on fire, it is not a good time to check under the hood. A similar principle applies for evaluation. If an organization is going through a rough period, a conflict, or constant staff turn-over, it is likely not the time to undertake a participatory evaluation. The busyness of an organization, however, should not deter us from such a process. True, creating the space and resources for a participatory evaluation will be challenging and may require you to do less in a given programming cycle, but think of it as stepping back in order to leap further. Our resources are too limited for us not to take the time to think about how we can do our work better.

Time investment
There is a recognized challenge of fitting space for reflection into the hectic pace of CSOs. A PAR initiative represents a significant commitment. Participatory evaluations require a significant investment in time and energy up front in planning the process, discussing philosophy, getting volunteers on board, etc. The payoffs come later in the process. Most pilots went through the experience of squeezing the evaluation into jam-packed program agendas because it had not been included as part of the program. This is obviously not an enabling environment for thoughtful reflection. The lesson here is that evaluation will always be viewed as an add-on or a luxury unless resources and time are devoted to it in the plans of the organization and staff. One of the pilots mitigated this time crunch by hiring someone familiar to the organization on contract to oversee part of the process.

Program or project lifecycle
The groups involved in this project were constantly concerned that they were beginning their evaluation planning at the end of the programming or project cycle and not at the beginning. Let's face facts: everyone knows it is best to plan evaluations at the beginning of the cycle, but this is difficult to do. Often, programming in one fiscal year is still ongoing when it is time to begin planning for the next fiscal year. So, when do you fit in a longer-term evaluation process? The best time to start an evaluation is now, wherever now is in the planning cycle, provided other conditions in this section are satisfied. Evaluating last year's programming now will equip you with a set of indicators and tools, mechanisms and experience to help effectively evaluate next year's activities. The only possible downside is that you will not necessarily have benchmark data collected upfront for comparison purposes, but often this is not even necessary.

Need
Participatory evaluations are conducted when CSOs feel the need to ground their work in thoughtful collective reflection. Participatory processes are not a panacea, however, and may not be appropriate if your organization requires timely answers to complex questions or if you lack the human resources to carry out an effective participatory evaluation. If a quick Survey Monkey questionnaire designed and sent out by one staff member gives you the depth of understanding you need to modify your PE activities, then there is no need to build a participatory process.

Organizational culture
It is important to point out that participatory processes do not come naturally to most organizations, in that participation is often not part of the culture of CSOs. This is in part due to the "us/them" paradigm: we design and provide programming for "them." Participatory evaluation challenges this paradigm by bringing board members, volunteers and program participants into the design phase of an organizational process.

Participatory evaluation requires a group of people who are open to exploration and who possess a certain tolerance for ambiguity and risk. Do not expect to have all the answers of how to do PAR and participatory evaluation at the beginning of your process. Recognizing that participatory evaluation processes are usually muddy at the beginning is half the battle. Unlike most third-party evaluations, evaluations using PAR are non-linear processes in which the process and the evaluation framework are not necessarily fully mapped out from the outset. The process is necessarily iterative, and is shaped by dialogue and what you learn as you go, or as a result of changing needs or context. Learning by doing underlies this form of evaluation, in which understanding emerges as you walk together through the process. The beauty of this form of evaluation is that you do not have to have absolute clarity or a high level of expertise before starting.

A PAR approach to evaluation is also equally concerned with both process and product. Given that ownership of the process and results is a goal of participatory evaluation, organizations that are primarily results-focused will be required to adjust their approach and expectations, and demonstrate patience as they nurture the project through the inevitable time crunches taking place elsewhere in the organization.
Organizational buy-in and ownership
It is important that the leadership of the organization and the vast majority of the users of the evaluation be aware of and supportive of an evaluation process. Otherwise, you run the risk of the results and recommendations being ignored or your process being stymied. A comment from one of the pilots demonstrates the importance of organizational buy-in as an enabling condition for a successful evaluation: "There was a challenge to our including the evaluation process as a regular part of our activities, some members having the perception that the impetus for the process came from afar (CCIC), others finding the evaluation not relevant, concrete and action-oriented enough in their own minds. Of course, some people have too much invested...[to want others reflecting on their work]!" This may be particularly true in larger organizations and organizations with a strongly hierarchical culture, or in initiatives involving coalitions. Innovation, of course, always has its critics. And since participatory action-research challenges the way things are traditionally done, involving potential naysayers in your process might be a good proactive strategy.

Creating Enabling Conditions

Coordination and facilitation
The practitioners' experience highlights the need for one person from the organization to be given the mandate and the time to coordinate the initiative. This person could be a paid staff person or a key volunteer. Based on the above experiences of the CSOs and their comments about the valuable role that CCIC provided, we are led to believe that an outside or internal facilitator who has experience with these processes is key to a successful participatory evaluation process. It is not a contradiction to hire an outside evaluator or facilitator to help provide guidance and keep you on track without actually doing the evaluation for you, but this person should not be responsible for driving the process.

Participation
Will you be able to generate enough interest in the process, preferably from different types of stakeholders, to make the process truly participatory? One strategy is to form an evaluation reference group or committee and decide how you will work together. Given that the word evaluation can be intimidating, you might wish to find an alternative name for the group, such as "good practice" or "action-research" working group. It is critical to have more than one person helping to drive the process. This helps to lend credibility and importance to the work and ensure rigour and ownership for the process and the results. It also ensures the sustainability of the evaluation over time because it increases accountability. Roles, frequency of meetings, who makes decisions, and time commitments can also be discussed. The more that decisions are made collectively, the greater the chance of participation throughout the initiative. Each pilot in the action-research study formed an Evaluation Reference Group that had between four and seven members, including program participants, board, staff, academic partners and volunteers. It is important to keep the group small enough that scheduling meetings does not become a nightmare. Keep in mind that once you begin, it is never too late to bring new recruits into the process; in fact, it is strongly encouraged! In each of the four cases, new blood was recruited into the pilot's reference group, providing an injection of energy each time. Participation proved difficult at times for the pilots due to time constraints, limited availability of committee members, or scheduling (the CNA had committee members in four different time zones!). Keep in mind that not everything needs to be done with everyone around the table. Often it is more effective if one or two people go off and create a first draft for the others to critique. For JSH, documents were sometimes drafted by staff (e.g.
Theory of Change) and sent for reflection to the Committee, while other documents were drafted by participants or leaders and sent to the staff members for reflection (e.g. Awareness Trip surveys). Given that volunteers and program participants are often important stakeholders, and at times are the object of your evaluation, it is important to encourage their involvement in the process. Having said this, finding meeting times to fit their schedules and ensuring that volunteers have the knowledge to participate in a meaningful way can be a challenge.

Rigour and momentum
Just because participatory evaluation is iterative does not mean it is not rigorous. In fact, because it is not a prescriptive process, thought needs to be given to all stages to ensure that there is continuity from one meeting to the next, that decisions are respected, that documents created (e.g. your evaluation framework) are respected and revisited regularly, that the process remains participatory, and that real learning is taking place. Due to the ebb and flow of organizational life, rigour and being systematic can be a challenge. When you are learning by doing, hiccups are a natural part of the process. To mitigate this, it proved helpful for the pilots to draft agendas and take notes at each meeting. Learnings can be captured in a systematized fashion by placing an item at the end of the agenda for each committee meeting that asks the question, "What did we learn about our process, PAR, evaluation, or our public engagement practice today or since the last meeting?" These learnings can be compiled at various times or at the end of your process to share so that they become organizational learnings. There is also a natural tension between being patient with the process and moving things along so as not to lose momentum.
PAR is a collective process that requires time to involve people in meaningful ways, and the summer months and key people going overseas (not to mention a couple of maternity leaves!) were part of the pilots' realities. Our experience shows us that there is an advantage in having a second back-up champion within the organization, including senior management.

(Note: This may be why all four pilots completed the partnership.) When there are two champions of the process, they can provide support to each other during messy phases, pick up slack, provide continuity when one is absent, provide critical mass if participation wanes, and help to ensure organizational learning and take-up of recommendations. Besides sharing the coordination between committee members, the pilots attempted to maintain momentum by setting self-imposed deadlines. Helping the pilots maintain rigour and momentum was also one of the roles played by CCIC staff.

Learning about PAR and different alternative evaluation methodologies
At least one of the key contacts for each of the pilots had been involved in a two-day workshop three months prior to beginning their own process. Most of the pilots, however, felt that they would have benefited from an additional two-day workshop designed with their entire evaluation committee. Such a workshop may not always be possible, due to the logistics involved in working with volunteers who have jobs elsewhere or committee members living in different regions or provinces. What this does illustrate, however, is that there is a need for the immediate team of people conducting a participatory evaluation to learn about PAR and different alternative evaluation methodologies and tools as part of the evaluation process.

A Working Definition of public engagement and global citizenship
The object of measurement for many PE evaluations is the individual global citizen. It seems only fitting, therefore, that practitioners need to define what a global citizen is. It is also helpful to have a working definition of the basic elements or principles of a public engagement activity. These definitions are very likely to be embellished by reflecting on them throughout the evaluation process.
For this initiative, organizations were provided with a working definition of public engagement and global citizenship (see Appendix 1) but were also encouraged to define the terms for themselves.

A Draft Terms of Reference (ToR) document
A ToR is a fancy name for a document that helps to define the scope and parameters of your evaluation. It speaks about the history or rationale leading up to the evaluation, and it describes the methodology and what you hope to accomplish, the roles of the different people involved, the commitment required, the timeline, the uses of the information, how it will be disseminated, and confidentiality issues. This is a particularly important document if you are commissioning someone to help facilitate your process.

Defining your Evaluation Philosophy
Each organization that participated in the initiative was encouraged to draft its own evaluation philosophy. An evaluation philosophy or theory makes explicit the values and principles that underlie the foundation of your evaluation process. Below is a draft of an evaluation philosophy produced by Jamaican Self-Help (JSH).

JSH THEORY OF EVALUATION
- responsiveness and flexibility are important
- provide opportunities to reflect and evaluate in an ongoing way
- be self-critical on a continual basis
- participant focused
- about learning more than accountability
- gets people excited; allows people to dream
- reminds us of why we do this work; keeps actions grounded
- should look at failure; there is room for failure
- need to be proactive in presenting our evaluation approach

An interesting technique was developed by ACIC to elaborate the values behind its evaluation process. The Medicine Wheel was used as a template to build principles for the evaluation. Like the Medicine Wheel, evaluation: 1) can be a tool for healing and inner understanding; 2) will give us a clearer understanding of where our strengths and weaknesses lie; 3) must be inclusive and involve different stakeholders, similar to the four colours of the Wheel representing the races of the world; and 4) must be holistic, assessing mental, physical, emotional and spiritual effects.

Drafting the Purpose of your Evaluation
When your organization drafts its purpose statement, it provides an opportunity for the Committee to delineate its mandate and hone in on exactly what is essential to evaluate. This can lead to some very animated discussions of what is important to know for the different members of the organization. Sharing your purpose document with various stakeholders outside the Committee is an excellent way to generate excitement and credibility for the group's work and can help you find out what others in the organization believe is important to evaluate. Appendix 4 shows the iterative evolution of CNA's purpose document from its first draft to its fourth. Of particular note is the way the outcome objectives became more specific and the process objectives more numerous with time. It is usually a common experience at this point for the group to want to measure too many things. (Note: There is a big difference between what would be nice to know and what would be useful to know!) As the evaluation progressed, most pilots realized that they needed to be more realistic about what could actually be accomplished. In the case of ACIC, the initial desire to assess the impact of work on the media needed to be dropped in order to focus properly on the impact of the program on the First Nations youth participants themselves.

Drafting your Theory of Change
A Theory of Change was the cornerstone document drafted by each pilot in this initiative. For many, a theory of change is a road map that shows the linkages between an organization's goals and its strategies, activities and outcomes. This is often called a logic framework. We, however, went beyond the inputs, outputs and outcomes of the typical logic framework and asked the pilots to answer more profound questions: What are the root causes of the injustices you are targeting, what is your long-term vision, and why are you using PE to achieve that goal? Often, public engagement practitioners within CSOs have not had the opportunity to reflect on these questions in any concerted fashion with their colleagues. Mapping out your theory of change can spark some very interesting conversations about how people in your organization believe change actually happens. For MCIC, this process raised questions of whether public engagement or government and institutional procurement policies would be a better way of encouraging fair trade in Manitoba. MCIC's draft theory of change is illustrated in Appendix 5. Theories of change are malleable and should be influenced in an ongoing fashion by reflections on practice and results over time.

Designing your Evaluation Framework and Methods
As part of the collective process, all pilots were asked to complete an evaluation framework. This is a document that illustrates the outcomes to be measured, the evaluation questions that best target these outcomes, indicators or progress markers, the evaluation technique or method that will be used to collect qualitative or quantitative data, when data collection will take place, and how different stakeholders will be involved. At times, the outcomes for the evaluation (as described in the Purpose document) differ from the outcomes for the PE activity (as outlined in the Theory of Change document). For example, although PE outcomes might focus on fair trade consumers, one of the evaluation outcomes might involve assessing the effect on merchants or the coverage in the media.

Be sure to consider both types of outcomes in the evaluation framework. For a very easy-to-use template of an evaluation framework, see Appendix 6.

Outcomes and indicators
For CNA, short- and medium-term outcomes for the workshop were based on the association's RBM logic framework, as per its original plan to funders. Its extensive evaluation framework, equipped with indicators, is illustrated in Appendix 7. MCIC, on the other hand, elected to benefit from the collective reflection of its committee to revise the intended outcomes of the One-Month Challenge. For those outcomes that could not be measured directly, MCIC chose to use proxy indicators, such as media hits and speaker requests, to measure public awareness levels. For the First Voices Project, ACIC used the Medicine Wheel to develop the outcomes and indicators, ensuring at least one outcome for each of the four human traits: physical, mental, emotional and spiritual (see Appendix 8). Jamaican Self-Help spent a significant amount of time creating and testing a tool, inspired by Outcome Mapping methodologies, which involved the use of progress markers. Progress markers can be seen as an alternative form of indicator or sign of change. They differ from the short- and medium-term outcomes in logic frameworks in that they are not time-bound and are exclusively linked to behaviour change rather than changes in awareness or attitudes. Progress markers can be divided into three categories: i) what we expect to see, ii) what we would like to see, and iii) what we would love to see (see Figure 1).
FIGURE 1: A SAMPLE OF JSH PROGRESS MARKERS FOR ITS YOUTH AWARENESS TRIP TO JAMAICA
Behaviour change (post-trip), graduated from what JSH would expect to see, through what it would like to see, to what it would love to see:
- Writing and speaking about the trip
- Change in language used to describe poverty issues
- Continued relationship with JSH or another NGO
- Seeking other overseas opportunities
- Continued interest in future trips as an alumnus
- Students pursuing an interest that was developed on the trip
- Interpreting Canadian governmental policy
- Continued relationship with one of the organizations in Jamaica
