What Contributed to Successful Implementation?
The evidence workshop took place within two important contexts. First, MCC committed to implementing evidence workshops in partner countries with compact or threshold programs, and the El Salvador workshop acted as a pilot for this effort. Second, a 2015 reform of the executive branch in El Salvador strengthened the role of the Presidency’s Technical and Planning Secretariat, SETEPLAN, in directing the Five-Year Development Plan, organizing the National Planning System, and operating the Monitoring and Evaluation (M&E) Subsystem. To fulfill the latter mandate, SETEPLAN identified a need to strengthen the M&E skills of key personnel in government, civil society institutions, and academia. The goals of MCC and SETEPLAN thus aligned well, and their strong pre-existing relationship facilitated the planning of the 2016 evidence workshop and 2017 M&E trainings.
Workshop organizers at the MCC El Salvador Country Team, FOMILENIO II, and SETEPLAN further stressed that implementation was facilitated by “MCC’s commitment to promote rigorous evaluations and put lessons learned into practice” as well as the fact that high-level authorities from both MCC and the Government of El Salvador supported the workshop early on and attended in person.
How Did Participants Apply Learnings to Their Work?
One government stakeholder described how the Government of El Salvador identified and established M&E indicators for national planning instruments, such as the Five-Year Development Plan, as a result of the capacity building and knowledge acquired. The government also intends to use these lessons learned to achieve the Sustainable Development Goals and address multidimensional poverty. Another key government stakeholder stated, "I believe that there have been important advances in the outcome indicators" used to measure government work.
Several of the participants who attended the 2016 evidence workshop came from academia and highlighted in interviews how they have used the impact evaluations presented at the workshop in the university courses they teach, enriching their classes "with concrete examples." Another participant from academia mentioned drawing on learning from the 2017 M&E training when conducting a project evaluation for a nongovernmental entity, while a third wrote that unemployment had kept her from applying what she learned at the events.
Evaluations of the 2017 M&E trainings found that participants especially appreciated modules on the Theory of Change for policies and programs, on when to use different types of evaluations (needs assessments, impact evaluations, process evaluations, and cost-effectiveness analysis), and on what questions each can help answer.
When asked what helped or hindered them from applying the learning to their work, most participants with experience in government described obstacles stemming from the culture and leadership of their institutions. One wrote, "It has been necessary to develop a culture around the evidence. The resistance in this sense is quite strong. The systems for capturing information have their limitations, which are very difficult to overcome in times when resources are scarce. In this process it has been important to count on the support of the authorities." Another put it this way: "Considering that I have been at the front of M&E units, I can tell you that knowledge is easier to apply from an institution that gives importance to evaluation. Many times the obstacles come from the bosses, the superiors who are not interested in the issues, much less the real results and in part this is because they do not know about the subject." One respondent noted that while the strong technical team within SETEPLAN "is receptive to MCC's findings, the current political and fiscal climate has hindered full adoption and implementation of these lessons learned."
What Are the Main Challenges to Using Evidence for Policymaking in El Salvador?
Regarding challenges to evidence use in general, participants spoke overwhelmingly about the need to create an evaluation and learning culture within their institutions. "It is necessary to create the culture of M&E in our professionals […] It is necessary to stop seeing evaluation as a way to measure and punish the employee," wrote one participant. Similarly, another interviewee explained how "evaluation can be seen as an audit and not as a learning and improvement process." Another stated that "The information systems that are available do not respond to current needs and the evolution of the programs. Nor is there a culture of evaluation and many of the processes are susceptible to human errors due to lack of systematization." The frequent turnover of governments and personnel undermines efforts to create an evidence culture and necessitates "continuously sensitizing decision-makers and managers of public policies, programs and projects on the importance of the generation and use of evidence," according to another stakeholder from government.
In addition to the lack of a culture of using evidence and evaluations, limited access to relevant evaluations or up-to-date statistics was a commonly cited challenge, especially for participants from outside government. Other challenges cited included:
- “lack of knowledge about how the results of an evaluation can be concretized into actions for continuous improvement of the work of government”
- “availability of quality and timely information in the short term for public decision-making”
- “to consolidate an effective M&E System that transcends periods of government” and
- how to ensure that “the information generated by the M&E System is actually used for decision making.”
What is the Attitude Toward Evidence Use in El Salvador, and What Has Changed in the Last Two Years?
On the whole, workshop organizers and participants stressed that the mood has shifted in El Salvador, and that while room for improvement remains, significant advances have been made in the last two years to institutionalize evidence use and to view evidence and evaluations as positive and essential tools for government. One participant described a shift in how impact is measured, from purely qualitative assessments to quantitative evaluations that use a counterfactual. Another wrote that "there are mechanisms already institutionalized to publicize the results that are available. A lot of attention and resources have also been given to evaluation to be able to continue improving interventions." Several others highlighted greater interest in or commitment by public institutions to measure results and open themselves to citizen participation. One stakeholder declared that the participation of everyone from top-level ministerial staff down to program staff in the M&E trainings is a clear sign of the government's commitment to using evidence more systematically.
Several participants spoke of remaining challenges, pointing mainly to limited resources for conducting evaluations and to the political environment. As one stakeholder wrote, "The technical offices within the government are receptive toward the use of evidence-based decision making, but the electoral environment and political leadership of both sides of the aisle favors decision making using a different calculus." Another put it more frankly: "the bosses do not like to know that things go wrong and that they should be adjusted." However, as a different interviewee observed, the fear of admitting that things have gone wrong is not unique to El Salvador.
Onwards and Upwards: Catalyzing Continued Culture Change for Evidence Use
Overall, the organizers and participants of the 2016 evidence workshop and 2017 M&E trainings spoke positively of the events and of the relationship between MCC and the Government of El Salvador that made them possible. Most participants noted that they were able to apply their learning in some capacity in their work, though how they did so differed widely, from better identifying indicators for national planning to using concrete examples of evaluations in university teaching. Organizers and participants had similar responses when asked about challenges to evidence-informed policymaking in El Salvador: nearly every interviewee emphasized, at some point, the importance of developing institutional cultures of evidence use and learning. That includes sensitizing government authorities, who, participants explained, can present obstacles when they lack familiarity with evidence methodologies or topical issues, when they see evaluation primarily as an audit rather than a learning tool, when they are reluctant to see negative results, or when they devalue evidence relative to political or electoral priorities.
At Results for All, we have studied many other training programs and initiatives that aim to build policymaker knowledge, skill, and motivation to use evidence in government. The challenges cited by stakeholders interviewed for this blog resonate with our research – namely that institutional cultures, political leadership, and staff turnover can impede the effectiveness of isolated training programs or events that focus on individual government personnel. Instead, we find that creating cultures of evidence use, and institutionalizing evidence-informed policymaking in government, requires the right incentives.
In the United States, our colleagues at Results for America help incentivize political leaders to become evidence champions by highlighting their work in the media (policymakers love press) and recruiting them to join peer learning and advocacy networks. At Results for All, we recently hosted a peer learning workshop for teams of government policymakers from nine countries, focused on using evidence to improve policy implementation. We saw how the opportunity to share and learn from their peers motivated teams of government policymakers from around the world to apply for the workshop. We found that policymakers really want to learn about and create guidelines, policies, and frameworks that can incentivize, systematize, and govern evidence use in their institutions. Lastly, we witnessed strong demand for a sustained network of evidence champions, one that could provide a powerful global platform to advocate for and incentivize cultures of evidence-informed policymaking and learning in government.
Over the next few months, we will publish a series of briefs and case studies on mechanisms to incentivize evidence use in government. We are also shaping a strategy for a global peer learning network on evidence-informed policymaking. The network has the potential to unite leaders from governments like El Salvador's to highlight, disseminate, and deepen existing evidence practices, pair governments grappling with similar challenges, and jointly develop guidelines and tools to govern evidence use and incentivize continued progress.
This blog was originally published on the Results for All blog on August 27, 2018.