The Roadmap to High-Quality Monitoring and Evaluation at MCC

November 24, 2023

By Cindy Sobieski, Rebecca Goldsmith, and Emily Mondestin

The four steps to achieving high-quality monitoring and evaluation.

MCC is excited to announce major changes to the MCC Monitoring and Evaluation (M&E) Policy. Since the policy’s last update six years ago, we’ve completed 85 evaluations covering 25 countries and drafted or revised 20 M&E Plans. This revision builds on the lessons learned from that experience. The revised policy paints a clear picture of what high-quality M&E looks like at MCC and addresses challenges to achieving it. Our goal is to show the impact of MCC investments while supporting evidence-based decision making in current and future MCC programs.

A central theme of the policy revision is the need for greater collaboration between project designers and M&E. The policy now transparently lays out what is needed to produce high-quality M&E, giving project teams a clear goal to work toward.

A comprehensive framework for results centers around a clear and consistent definition of success — the project objective.

1. START WITH A COMPREHENSIVE FRAMEWORK

Before embarking on this policy revision, we commonly heard teams express confusion about how all the components of the MCC results framework fit together, including the project objective, project logic, indicators, cost-benefit analysis, and independent evaluation. So, this policy revision starts out by doing just that — laying out a comprehensive framework for results.

The project objective, or the definition of success of a project, is the focal point of the results framework. The framework tells a logical story about how project funds will be used to achieve intermediary results that ultimately lead to that vision of success. Each result is connected to an indicator in the M&E Plan, so we have a clear understanding and agreement around how to measure success and the path along the way. The evaluation analyzes these results to determine whether the project objective was achieved and why or why not.

The results framework from Step 1 - filled in with sample details of a hypothetical project.

2. DEFINE PROJECT RESULTS

You can think of the above framework as an empty template that is filled in as a project is developed. Project designers document their plans and vision for the project in the framework. The extent to which the framework can be filled in differs for each project. In the revised policy, MCC establishes the “Results Definition Standard” — the minimum degree to which a framework should be filled in to meet MCC’s M&E Policy criteria at the time of the investment decision. Meeting this standard is necessary for high-quality M&E. This standard also gives MCC senior leadership a clear picture of the results they can expect if project funding is approved.

The results framework from Step 2 - filled in with the “actuals” measured.

3. TIME FOR MONITORING AND EVALUATION

The heart of M&E’s work is to implement each country’s M&E Plan and to measure and report on the results defined in the prior step. The policy lays out what the M&E Plan should contain and how it can be revised. One key change is that the revised policy organizes evaluations around each project — rather than around multiple components of a project, as in the previous policy — which will make it possible for MCC to tell a much more holistic results story for each project than the prior piecemeal approach allowed.

MCC strives to apply lessons learned to future program design and implementation.

4. WE’RE GETTING BETTER ALL THE TIME

The goal of high-quality M&E is to facilitate real learning. With this policy revision, we hope to learn from each evaluation and carry those lessons forward. For example, as a result of our new focus on clearly defined project objectives, we will be able to produce clearer and shorter evaluation reports that are easy for project teams to digest and translate into practical lessons.

At MCC, we are committed to accountability, learning, and transparency. This means truly reflecting on whether we achieved our project objective, and on where our project logic held up and where it fell apart. We will use this knowledge to inform future MCC programming and share it with the public. By always seeking more accurate and cohesive approaches to understanding what our projects achieve, we can continue to reinforce a virtuous circle of planning, implementation, measurement, and learning at MCC.