How Charities Learn from Evaluating Themselves, and Tips for Measuring Results




The Chronicle of Philanthropy, 6/12/2003
http://philanthropy.com/jobs/2003/06/26/20030626-405842.htm

IN THE TRENCHES

How Charities Learn from Evaluating Themselves, and Tips for Measuring Results

By Kimberlee Roth

When Lisa Brakebill became program director of the Greater Maryland Chapter of the Alzheimer's Association, in Timonium, eight years ago, she says, program evaluation meant figuring out whether her organization's number of clients was increasing and whether they were satisfied with the programs she led.

Then, in the late 1990s, the chapter began applying for the Maryland Association of Nonprofit Organizations' Standards for Excellence certification, a voluntary program that shows charities ways to measure program effectiveness. When she saw the requirements for program evaluation, she says, "I balked to no end. I thought, 'This will take so much work. People love us. They say how good we are all the time. Why do we have to do this?' I had no clue whatsoever."

After spending about a year fine-tuning her evaluation criteria, Ms. Brakebill was able to glean more information from clients -- their feelings about the program as well as the impact the program had on their lives. Her efforts paid off: The group has made changes in the way it operates that help it serve clients more thoroughly and efficiently, she says. In addition, the organization received the Maryland association's Seal of Excellence in 1999 (and won recertification last year). Ms. Brakebill now serves on a committee to help the national Alzheimer's Association develop evaluation tools for all its chapters, a plan that it will begin this summer.

As Ms. Brakebill once did, many nonprofit executives confuse monitoring "numbers" and client satisfaction with evaluation, says Allison Fine, executive director of Innovation Network, a national nonprofit group in Washington that helps charities conduct evaluations. "Almost all of the organizations I've seen are doing some kind of regular monitoring," she says, "but that's different than evaluation, a system of ongoing learning."

Since the late 1980s, evaluation has grown in importance to grant makers. Given the increased competition for funds, donors want to see the results of their gifts in quantifiable rather than anecdotal ways. A 1998 study by Innovation Network, sponsored by the Aspen Institute and the Robert Wood Johnson Foundation, showed that these assessments are usually done at the urging of grant makers or trustees -- 69 and 56 percent, respectively. But increasingly, charities are changing their views about evaluation from merely a requirement set forth by grant makers to an opportunity to learn and make changes in their programs that enhance their missions.

The best evaluations come from charities that are thoroughly committed to the process for its own sake, says Bill Niederloh, chief executive officer of the National Results Council, in St. Paul, which develops evaluation criteria for employment, training, and other human-service programs. "Where we've seen program evaluation be successful, it's part of the organizational culture and it has high visibility" within the group, he says, "versus simply being seen as a reporting mechanism to satisfy whoever's offering funding."

Seeking Guidance

Although grant makers shouldn't provide the only impetus for evaluating programs, they can offer guidance, charity leaders say.

"We try to dispel some of the anxiety around evaluation," says Carol Goss, vice president for programs for the Skillman Foundation, in Detroit, which supports efforts to improve the lives of local children. "It's not some way we're going to judge 'good' or 'bad.' We see evaluation as actual learning, for the organization and for us. That may mean learning some hard lessons, too, but that's OK."

Ms. Goss says her foundation requires evaluation as part of all its grants. "It doesn't have to be anything very complicated," she says, "but grantees do have to have something in place that will provide them with information to help them improve and will provide us with information on what's happening." To that end, Skillman program officers work closely with charities' project directors during the proposal stage to make sure grant recipients have a plan for measuring results.

A few years ago, the foundation also developed a basic guide to evaluation for recipients and prospective recipients. For example, the guide prompts charities to first think about what questions they want an evaluation to answer, and then shape their evaluation method accordingly. As a result of offering the guide and of the assistance that program officers give grantees, she says, "organizations know we value evaluation, and they talk about it right from the beginning. They want to engage; they want to know what our expectations are. They ask good questions, and that causes some really good conversations about goals, results, and where they want to end up. It really shows growth."

Other entities that support charities, such as local United Ways and state and regional associations of nonprofit organizations, can provide guidance regarding program evaluation. (For more information, see "A List of Resources to Help Nonprofit Groups Assess Their Programs.")

Choosing Methods

Evaluations can be conducted in many ways, says Rini Banerjee, program director at the New York Women's Foundation, in New York City, which supports programs to help low-income women and girls achieve economic self-reliance. Her foundation's grantees tend to be small organizations, some with only one staff member. "When we say 'evaluation,' we don't want them to think of it as a burden, but something of a learning process," she says. "How an organization does it can range from informal to formal, depending on size, capacity, type, and we tailor our ideas about what's appropriate -- an all-volunteer organization probably isn't thinking about evaluation, but an advanced organization should have systems in place."

Although evaluations can help a charity fine-tune particular programs, not every program warrants a rigorous assessment, says Ms. Fine. Choosing which programs to evaluate, she says, depends on the goals and priorities stressed in an organization's strategic plan. "Maybe your after-school math and science program is the focus this year because it's so important to your plan, and so it's less important to look at your rec program," she says. "You can look at that a year from now. It can be overwhelming -- it's important to start with bite-size pieces people are going to be successful with."

The methods an organization uses to evaluate its programs depend upon the reasons it is undertaking the evaluation, says Ms. Fine. "From the 'why' comes the 'how,'" she says. For instance, if a food bank wants to know how many clients it is serving, it is merely a matter of monitoring and counting. If, however, it wants to know whether its services are helping its clients avoid hunger, counting clients won't provide the information it seeks.

True evaluation, Ms. Fine says, begins with monitoring but includes other components, such as evaluating how a program is carried out. Organizations that are evaluating their programs on this level might look at how tasks are coordinated, and whether the program is appropriately staffed.

Results, Ms. Fine says, make up the "what" portion of any evaluation: "What difference has all of this activity made? Are we making lives better?" Determining results is the most difficult portion of any thorough evaluation, she says, and typically requires both quantitative and qualitative measurement.

Existing client-feedback mechanisms, such as surveys, might be tweaked to help measure program impact. When Ms. Brakebill began revamping the way in which she evaluated programs at the Alzheimer's Association, she changed the questions on a survey routinely given to support-group participants to yield more specific information -- asking clients whether they received emotional support and felt better equipped to cope with the demands of caring for Alzheimer's patients, and deleting vaguer queries such as, "Do you like the support-group facilitator?"

Of course, measuring program impact becomes difficult when a charity tries to track intangible results, such as what impact a telephone help line has on callers. When Ms. Brakebill and her colleagues were facing that question, they brainstormed a list of indicators, such as whether callers reported a lower stress level after calling the help line, and whether they used the information the organization sent them. From there, the group devised a survey that was included in the information packet mailed to callers. However, the charity soon found that only the most impressed help-line clients were bothering to respond to the survey. Now, the organization randomly selects a percentage of callers and contacts them by phone to ask their opinion of the help-line service.

Making Changes

Evaluations provide insight not only into programs and services, but also into unmet needs. "If you ask for input, be ready for the good, the bad, the ugly, and sometimes, the unexpected responses," says Mary Wambach, executive director of the Access Center of San Diego, an independent-living center that advocates for people with disabilities. For instance, in a 2000 evaluation survey, Access Center clients identified recreation as a valuable but missing service. So the organization obtained funds from a local resource to provide it at one of its two satellite locations. (A subsequent evaluation of the recreation program indicated that those who took advantage of the new recreation services experienced weight loss and quantifiably lower body fat, Ms. Wambach adds.)

Ms. Brakebill says she has also put her group's evaluation findings to work. Based on an assessment of a program that provides respite for people who take care of Alzheimer's patients, she learned that clients felt the charity's staff members weren't very accessible. So the organization began requiring its workers

to respond to client calls within 24 hours.

Sometimes evaluation results can motivate grant makers to loosen their purse strings. In the mid-1990s, the Boys & Girls Clubs of America, in Atlanta, measured the success of an educational program called Project Learn, says Karen MacDonald, director of education programs for the national organization. Data showed that children who attended the program improved their school attendance and grades compared with those who didn't attend. The results led to more than $5-million in new grants from several foundations for construction and renovation of learning centers at dozens of clubs around the country.

While grant makers are interested in the Boys & Girls Clubs' assessment of its learning centers -- and foundations that support the centers are regularly sent reports from each one -- the information gathered is primarily used to provide training to other clubs about what practices work best, says Ms. MacDonald. "There's a lot of sharing that goes on," she says.

The national organization emphasizes the importance of local affiliates evaluating themselves for their own benefit. It has given affiliate clubs a variety of evaluation tools for all of its programs, including a Web-based survey designed to be taken by club members, which about one-fourth of clubs use regularly, she says. The survey asks young club members numerous questions that assess their self-esteem, health behaviors (such as whether they smoke), and other youth-development indicators.

Grant makers can sometimes help facilitate the sharing of evaluation data among charities. For example, says Ms. Banerjee, the New York Women's Foundation holds annual meetings that bring together its grantees that work on similar issues -- so they can learn from each other, she says, and so the grant maker can learn from its grantees' evaluations. As a result of such feedback, she says, it has simplified its grant-application process and also launched a Management and Leadership Institute for grantees to further explore topics of interest, including program evaluation.

A charity's early steps toward evaluation can be small ones, but they should be part of a long-term plan for improvement, say charity leaders and grant makers. "After one, two, three years, what measurable difference may have been made?" asks Ms. Fine, who adds that nonprofit groups don't often hold these discussions about their goals. Such talks, she says, should be the building blocks for a solid plan that includes what type of data should be collected and how the organization will use it to modify and improve over time. "It's an evolving process," she says, "and one that changes the perception many charities still hold of program evaluation, from an add-on done only when a funder pushes to the natural and expected way nonprofit organizations work."

Copyright 2003 The Chronicle of Philanthropy
