A Commitment to Data-Driven Investments

May 4, 2016

The Millennium Challenge Corporation was founded on the principle that data and hard evidence should drive its investments. A data-driven approach can reduce waste and expand MCC’s ability to help lift people out of poverty.

But to harness data, MCC must gather the right data and use the best tools to analyze it. To delve deeper, I produced a report on one of the tools at the core of MCC's data-driven principle: the cost-benefit analysis.

The cost-benefit analysis tool summarizes a project's projected costs and benefits over time and compares their discounted present values; the discount rate at which the two are equal is known as the economic rate of return (ERR). The larger the ERR, the larger the benefits relative to the costs.
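To make the mechanics concrete, here is a minimal sketch of how a rate of return of this kind can be computed: find the discount rate at which the net present value of a project's yearly cash flows crosses zero. The cash flows, function names, and the bisection search are all illustrative assumptions, not MCC's actual methodology or data.

```python
# Illustrative sketch: an ERR-style rate is the discount rate at which
# the present value of benefits equals the present value of costs,
# i.e. where net present value (NPV) is zero. Not MCC's actual model.

def npv(rate, cash_flows):
    """Net present value of yearly net cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def err(cash_flows, lo=0.0, hi=1.0, tol=1e-6):
    """Bisection search for the rate in [lo, hi] where NPV crosses zero."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid  # benefits still outweigh costs; try a higher rate
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Hypothetical project: $100M cost up front, $18M/year benefit for 20 years.
flows = [-100] + [18] * 20
rate = err(flows)
print(f"ERR: {rate:.1%}")
```

With these made-up figures the computed rate lands well above a 10 percent threshold, which is the kind of comparison the text describes.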

MCC staff conduct a cost-benefit analysis for nearly every project, and the resulting ERR helps determine whether MCC funds the project. In general, MCC looks for projects to meet at least a 10 percent threshold to be funded. We also consider risk and distributional and social impacts when deciding which projects to fund.

The ERRs for selected projects – known as "original ERRs" – are projections based on data available at the beginning of MCC's five-year compacts. When compacts end, MCC estimates what we call "closeout ERRs." While the costs of a project at closeout are clear, the benefits—which extend over 20 years after the completion of a project—are still largely projections. Analyzing the differences between original and closeout ERRs helps us assess the usefulness of MCC's ERRs as an investment tool. And that's exactly what MCC's first systematic report on closeout ERRs does.[[Ospina S., A. Mitchell and A. Szott, "2014 Report on Closeout ERRs", MCC Internal Report, 2015]] (Editor's note: a second report on closeout ERRs was published in June 2016.[[Ospina S. and M. Block, "2015 Report on Closeout ERRs", MCC Internal Report, 2016]])

So, are there clear patterns in the data? Does the data provide key takeaways to guide MCC policy?

The report reviewed all projects with a closeout ERR as of the end of 2014. The analysis shows that the weighted average closeout ERR was 16 percent – above the 10 percent threshold but 4 percentage points below the average original ERR. With some important exceptions that I discuss in the report, the majority of closeout ERRs surpassed MCC's guiding threshold and came close to their original ERRs.
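A weighted average across projects can be computed as follows. This sketch assumes cost weighting for illustration; the report's actual weighting scheme and all figures below are hypothetical, not the report's data.

```python
# Illustrative sketch of a cost-weighted average ERR across projects.
# Project names, costs, and ERRs are made up for demonstration.
projects = [
    {"name": "Road A", "cost": 120.0, "closeout_err": 0.18},
    {"name": "Irrigation B", "cost": 40.0, "closeout_err": 0.07},
    {"name": "Power C", "cost": 200.0, "closeout_err": 0.17},
]

total_cost = sum(p["cost"] for p in projects)
weighted_err = sum(p["cost"] * p["closeout_err"] for p in projects) / total_cost
print(f"Weighted average closeout ERR: {weighted_err:.1%}")
```

Note that under cost weighting, large projects dominate the average, so a portfolio-level figure like 16 percent can coexist with smaller projects falling below the threshold, as the chart discussion below observes.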

The interactive chart below shows Original ERRs and Closeout ERRs, with a black line indicating the 10 percent threshold.

Some additional patterns can be seen in this chart. For instance, while the majority of projects exceeded the 10 percent threshold, some projects – primarily smaller projects – did fall below.

So what accounts for the difference between original ERRs and closeout ERRs for these projects?

Our research identified a variety of drivers that reduced the ERR below the threshold at closeout, including insufficient data at the outset of the project. But two major reasons appeared across multiple projects. In some cases, costs were simply higher than expected. In others, project modifications led to reductions in scope and scale that caused benefits to decrease more than costs.

Of a total of 94 completed projects, MCC conducted closeout cost-benefit analyses for 57 projects. MCC is still conducting closeout cost-benefit analyses for some of the remaining projects, while others—such as projects that were canceled—are not expected to be analyzed.

Examining the relationship between original ERRs and closeout ERRs is one example of MCC’s commitment to evidence-based decision-making. The findings of this report are encouraging, but they also provide a roadmap for the future. The report and the questions it raises make an important contribution to our efforts to continue learning and improving within the MCC model and within the development community.

Moving forward, the challenge is to incorporate these lessons to help MCC better estimate costs and benefits of potential projects as we develop new compacts. Particularly as we gather more data on closeout ERRs and completed projects, MCC will use this information to improve program design and more effectively fight poverty.