Education and training are under immense pressure to adapt to rapidly changing global challenges, such as climate change, migration, inequity and inequality, the pandemic and digitalisation. 

But how do we decide how to respond to these challenges? And how do we find out afterwards whether our response was right?

The evidence needed to make such policy decisions is gathered through monitoring, and monitoring and adapting was the topic of the second session on day 2 of the joint ETF-UNESCO conference Building Lifelong Learning Systems. 

The session was split into two parts, with 11 experts discussing what ETF moderator Hugues Moussy called the upstream part of monitoring (what information is needed and how do we gather it) and the downstream part of monitoring (the use of monitoring results in change processes). 

Hiromichi Katayama, Senior Programme Specialist at UNESCO, explained how UNESCO's conceptual work addresses one of the big hurdles in international monitoring: the incompatibility of data. With the help of an example from Palestine, he also explained how artificial intelligence can be employed to overcome the challenge of processing the vast amounts of data that are accessible today.

He warned, though, against relying exclusively on automated processes: an integrated approach is needed. 

Mihaylo Milovanovitch, Senior Human Capital Development Expert at the ETF, opened his overview of lessons learned by the ETF, primarily through the Torino Process, with a reminder that the success of data collection and analysis depends on the people who work with the data.

The Torino Process has both content and data components, but the ETF does not just crunch numbers.

“We look at whether people actually understand the data they collect,” said Mr Milovanovitch.

“We want to build consensus. We want to have state of the art data but not without partners understanding it and agreeing on it.”

He introduced four principles of the Torino Process that were echoed throughout the discussions in the session. They were ownership, broad participation, and an evidence-based and holistic approach. 

“After all these years, we can see that following these principles is not so simple. Education and training stakeholders are extremely diverse and they all come with their own baggage: from students and parents, who deal with daily problems at the grassroots level, all the way to international organisations, which work in a top-down manner with vision and strategic development. They all share the Torino Process room and we are not always using the same language.”

So there are trade-offs to be made, said Milovanovitch, between consensus and inclusion, between speed and accuracy, between scope and relevance, to name but a few.

“How successful we are depends on people and therefore on our ability to involve people.”

Senior Analyst Tracey Burns of the OECD reminded the audience that as early as 2013, the World Economic Forum listed misinformation among its top 10 global issues. 

“We are witnessing an explosion of available evidence, but no indication of its quality because the classic gatekeepers of research are not always being used any longer,” she said.

“The data is super useful, but the volume makes it hard to use.”

Ms Burns highlighted three challenges for monitoring in OECD countries. One is inherent in the democratic policy cycle, where monitoring and evaluation typically come at the end, and there is not always the patience to wait for their results before policy changes are made.

Another is the question of whose voice counts, which Mihaylo Milovanovitch had already addressed when he spoke of the need to find consensus in the Torino Process. With all stakeholders involved in data provision, collection and analysis, different conclusions are bound to be drawn even from the same material. 

Finally, Ms Burns highlighted the move from a push culture (dumping out all the data that is available and hoping something good will come of it) to a pull culture, where the focus is on the users of data. Do all users have access? Do they have the skills needed to process the data? Is their interaction and collaboration with other users facilitated?

The subsequent discussion showed that there is a great deal of agreement on why we need to monitor and how the results should be used, even if different people phrase similar issues in quite different ways.

There was broad agreement that everyone who is involved in lifelong learning systems must be involved in monitoring the effectiveness of learning and in using the results for adaptation. And cooperation does not only involve the stakeholders within a country: international cooperation is imperative too.

It is very important not to focus only on traditional education, but also to find ways to include non-formal learning in monitoring exercises, however difficult this may be. School grades are easy to compare. That does not mean that comparing them is a useful exercise when we want to move towards learner-centred lifelong learning systems.

Most of all: we must nurture a 360° monitoring culture. Whether or not to monitor should no longer be the question. Our focus must be on what to monitor, how to monitor it, and how to use the results. 

Key take-aways:

  • Evidence is critical in policy development. Whether or not to monitor should no longer be a question. 
  • Lifelong learning systems involve everyone; therefore everyone must be involved in monitoring – both in collecting data and in using the results. 
  • However, involving more sources will yield more data. There is already too much information, and the traditional gatekeepers of information have lost authority. We must tread carefully in balancing quantity with quality.
  • The balancing exercise does not stop there. We must also balance the desire for consensus with the need for inclusion, and the dramatically increased need for speed with the desire for accuracy.

To see the full event: