M&E in Action: Lessons Learned When a Program Doesn’t Go According to Plan

Margaret Gibbon  |  March 27, 2018

This post is part of a series spotlighting M&E practices and learning among refugee service providers in the U.S., beginning with graduates of META’s FY17 certificate course. Today, META hears from Amer Al Fayadh, Language Services Program Coordinator at Church World Service. If you have M&E practices to share, we’d love to hear from you—email META@Rescue.org!

 

Tell us about the program you observed:

When it comes to monitoring and evaluation (M&E), we often hear about best practices and ideal scenarios. But I think it’s important to share cases where the ideal process wasn’t followed, because these can be great opportunities to learn. One example is an English as a Second Language (ESL) program in my city. As an outside observer (neither a beneficiary nor a provider), I noticed a few missing monitoring and evaluation strategies that, if present, could have improved the quality of the service, saved time in the planning stage, and paved the way for better delivery.

 

What key M&E lessons did you learn by observing this program?

1. To better understand a problem, step back and look at the big picture.

Through reflective discussions, it became clear that our city lacked effective, well-organized, and timely ESL classes. Individuals with limited English often waited many months to be enrolled in an ESL class, if one became available at all. A coalition of non-profits began looking for a solution, and not long after, a grant opportunity became available. The group set a program goal: to increase clients’ employability by improving their English level. But the crunch to meet the grant application deadline led to rushed steps during the design phase.

While the program proved successful in increasing both the number of students enrolled in ESL and the availability of classes, employability didn’t seem to be increasing. Many refugee clients were finding jobs within a week or two of starting ESL, which limited their ability to reap the benefits of the class. These clients were securing the same entry-level jobs that had been available to them before entering ESL, not higher-paying jobs that would ensure consistent financial stability. And although classes were offered in the mornings and evenings, many clients did not attend because of class timing, lack of childcare, and transportation conflicts.

Devoting more time to the planning phase would have allowed those involved to step back, look at the bigger picture, and better understand the problem they were trying to solve. The group should have asked themselves: Is the only problem a lack of ESL offerings in our city? If these offerings were out there, would clients necessarily attend? What obstacles might keep beneficiaries from returning to class after getting a job (e.g., class timing, not seeing the benefit of the class, childcare, the commute)?

2. Get buy-in from all parties in the organization on what success looks like and how we will get there.

After we clearly define a problem, we still need to clearly define success. We’re often challenged by a lack of shared understanding of a program’s objective, and we tend to focus on individual interpretations of what makes a program successful rather than building a collective understanding at the organizational level. Tackling this challenge during the planning phase helps ensure that the implementation, delivery, and evaluation phases will be more effective and productive.

In the case of this ESL program, was success simply increasing clients’ access to ESL, or was it improving their employment outcomes? One didn’t necessarily equal the other! Here, program planners didn’t make time to establish a clear definition of the challenges, the proposed solutions, and a roadmap that would pave the way to accomplishing the intended goals. Establishing this at the beginning of the planning phase would have saved time down the road, because members would already have had a framework for making adjustments.

3. Develop indicators and data collection tools sooner rather than later.

In the planning phase, it’s important to develop the tools that will serve as monitoring and evaluation resources once implementation begins. Establishing these tools before the ESL classes started could have provided a reference for measuring the effectiveness of the classes and for finding ways to continually evolve and improve the service.

Considering the following questions could have laid the foundation for stronger implementation and quite possibly saved time on troubleshooting challenges that arose during service delivery: Do we need a before-and-after assessment of English proficiency? How will we measure whether English proficiency is tied to employment outcomes? How can we evaluate the class to determine whether attendees gained additional skills that made them more self-sufficient, or better prepared for the workforce than counterparts who did not attend ESL class?
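To make those measurement questions a bit more concrete, here is a minimal, purely hypothetical sketch of how a before-and-after proficiency tally might be set alongside employment outcomes once the data exist. The field names (pre_level, post_level, attended_esl, employed_90_days) and the sample records are placeholders for illustration only, not details of the program described in this post.

```python
# Illustrative only: compare average proficiency gain and employment rate
# for ESL attendees vs. non-attendees, using hypothetical field names.
from statistics import mean

participants = [
    {"name": "A", "attended_esl": True,  "pre_level": 2, "post_level": 4, "employed_90_days": True},
    {"name": "B", "attended_esl": True,  "pre_level": 1, "post_level": 2, "employed_90_days": True},
    {"name": "C", "attended_esl": False, "pre_level": 2, "post_level": 2, "employed_90_days": True},
    {"name": "D", "attended_esl": False, "pre_level": 3, "post_level": 3, "employed_90_days": False},
]

def summarize(group):
    """Average proficiency-level gain and 90-day employment rate for one group."""
    gains = [p["post_level"] - p["pre_level"] for p in group]
    employed = [p["employed_90_days"] for p in group]
    return mean(gains), sum(employed) / len(employed)

attendees = [p for p in participants if p["attended_esl"]]
non_attendees = [p for p in participants if not p["attended_esl"]]

for label, group in [("Attended ESL", attendees), ("Did not attend", non_attendees)]:
    gain, emp_rate = summarize(group)
    print(f"{label}: avg. proficiency gain = {gain:.1f}, employed at 90 days = {emp_rate:.0%}")
```

Even a simple tally like this only works if the indicators and collection tools are defined before classes begin, which is exactly the point of planning them early.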

Of course, this would have involved a greater investment of time and resources in planning, in providing adequate training at the different levels within the organization, and in ongoing monitoring to understand program progress and make adjustments when necessary. But these steps would likely have been a time-saver in the long run, and may have led to a stronger program overall.

 

Are there any resources you can recommend?

The best practice is to keep things simple and clear when planning a new program, so that you don’t unintentionally deviate from your goals. Here are some resources that can help you understand and avoid common M&E pitfalls.

 
