An introduction to experimental and non-experimental methods for assessing policy impact. ETF Knowledge Sharing Event, 9-10 September 2025
"In an era of fiscal constraints and rapidly evolving policy landscapes, evidence-based evaluation is more imperative than ever."
These opening words of Cristina Mereuta, Head of the ETF Knowledge Hub, appropriately set the scene for an intense and challenging knowledge sharing event, which brought together experts and practitioners from the ETF and from organisations in Egypt, Sweden and Italy specialising in employment and skills development to explore the fundamentals of impact evaluation in public policy.
The ETF's Piotr Stronkowski, Human Capital Development Expert, who chaired the event, set out its aim: to learn about rigorous methods for measuring the impact of public policies on human capital development, with a focus on practical application for policy advisors and implementers.
Day One
The first session of the event was led by Dr Reham Rizk, Nayera Adly Husseiny, Arwa Adel and Zeina Osama from Egypt Impact Lab, a joint initiative of J-PAL MENA and the Egyptian Ministry of Planning, Economic Development, and International Cooperation. It focused specifically on randomised controlled trials (RCTs), which are widely regarded as the gold standard in impact evaluation, although attention was also given to alternative methodologies suited to diverse contexts.
“RCTs are very costly—but implementing a programme without testing it first can be even more costly,” said Dr Rizk.
The workshop covered definitions and rationale for impact evaluation, conditions for randomisation, sampling strategies, and J-PAL case studies to demystify technical terms and illustrate real-world applications.
“A good impact evaluation needs a good programme design and implementation built on a Theory of Change,” explained Dr Rizk.
The theory maps each step of the intervention, from inputs through outputs to outcomes. Two types of evaluation were distinguished: impact evaluation, which measures the change attributable to the intervention, and process evaluation, which assesses whether the programme was implemented as intended.
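To make the distinction concrete, the sketch below (a purely illustrative Python snippet, not part of the workshop materials) shows the core logic behind an RCT impact estimate: units are randomly assigned to treatment or control, and the impact is estimated as the difference in mean outcomes between the two groups. The programme, sample size and effect size are invented.

```python
import random
import statistics

random.seed(42)

# Illustrative only: a hypothetical training programme evaluated with an RCT.
# Outcomes are simulated; the "true" effect is set to +0.10.
n = 2000
participants = list(range(n))
random.shuffle(participants)
treatment = set(participants[: n // 2])  # random assignment to treatment

def outcome(i):
    base = random.gauss(0.40, 0.15)  # counterfactual employment outcome
    return base + (0.10 if i in treatment else 0.0)

y = {i: outcome(i) for i in range(n)}
treated = [y[i] for i in range(n) if i in treatment]
control = [y[i] for i in range(n) if i not in treatment]

# Because assignment is random, the difference in means is an unbiased
# estimate of the average impact attributable to the programme.
impact = statistics.mean(treated) - statistics.mean(control)
se = (statistics.variance(treated) / len(treated)
      + statistics.variance(control) / len(control)) ** 0.5

print(f"Estimated impact: {impact:.3f} (SE {se:.3f})")
```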
Participants learned how to choose suitable evaluation methods, design and manage evaluations, interpret results for policy, and support evaluation capacity in partner countries.
Participants engaged in a lively discussion on the complexity of interventions, emphasising the importance of clearly articulated assumptions. While recognising the power of RCTs, the group agreed that they are not universally applicable: ethical and contextual considerations must always guide the choice of methodology.
The afternoon session provided participants with the opportunity to explore the practicalities of planning and setting up RCTs in greater depth.
Day Two
The second day of the event began with an exploration of public employment services in Sweden, where Dr Gisela Waisman shared real-life examples of RCTs, with particular emphasis on the use of administrative data.
“We do large scale randomised controlled trials if possible and otherwise we go for quasi-experimental methods, leveraging high quality data collected since 1992,” said Dr Waisman, adding:
"RCTs are ideal for causal inference although ethical and contextual factors are paramount, and sometimes it’s just not even possible to randomise.”
The presentation was based on a major evaluation of private job search assistance in Sweden which was found to be up to 70% more expensive than public provision without delivering better employment outcomes, prompting changes in compensation schemes and increased scrutiny of underperforming private providers.
The session also addressed the challenge of generalising findings. J-PAL MENA’s Nayera Adly Husseiny presented on how to turn evaluation findings into useful policy advice and on the extent to which they can apply across countries and contexts.
She outlined a framework built on four components:
- Programme Theory: Understanding the theory of change and assumptions.
- Local Conditions: Comparing the context of the original intervention with the new context.
- Mechanisms of Change: Identifying what made the programme work.
- Implementation Capacity: Assessing whether the intervention can be faithfully replicated.
A case study of the “Graduation Approach” (poverty alleviation) from Bangladesh, adapted and tested in Egypt, was used to illustrate how, even with strong evidence from other countries, local adaptation and new RCTs are often necessary due to contextual differences and implementation challenges.
The last example of the event came from Dr Joanna Hofman, who described the roles, functions, and models of What Works Centres in the UK in translating evidence into policy, stressing that “evidence must be accessible and actionable,” and that “building a culture of evaluation is a gradual process, requiring capacity building and stakeholder engagement.” Her examples from education and policing showed how evidence-based toolkits and guidelines can influence practice and policy at scale.
The discussion concluded with reflections on the importance of timing, communication of both positive and negative results, building an evaluation culture, and the growing role of AI in evidence synthesis. Notably, Hofman observed that “cost-effectiveness is often as important as effectiveness.”
Conclusion
The event underscored the critical importance of rigorous, context-sensitive evaluation in shaping effective public policy. Through in-depth discussions and practical examples from both European and MENA contexts, participants reinforced their understanding of how robust evidence, grounded in sound methodology and ethical practice, is indispensable for informing policy decisions. Although the consensus was that no single evaluation method is universally applicable, participants learnt about the value of RCTs in particular and about the importance of transparency, learning, and continuous improvement of the evidence base for advancing policy effectiveness and societal impact.
This knowledge sharing initiative reflects the ETF’s commitment to enhancing its advisory role by generating and interpreting robust evidence, thereby supporting more effective and accountable public policy in the field of employment and skills in partner countries in the EU's neighbouring regions.