RE: From Hindsight to Foresight: How Evaluation Can Become Future-Informed

Posted on 01/04/2026

I really appreciate the insights shared, especially the focus on adaptive learning and forward-looking evaluation. I would also argue that one critical limitation lies in how evaluation knowledge is produced and communicated. This time, my comment is less about methodology and more a response to your question, Steven, about what needs to change for evaluation to truly contribute to transformation.

Evaluation reports are typically written by evaluators, in technical and methodological language that speaks mainly to other evaluators or technically trained audiences. As a result, these reports tend to be most appealing and useful to those who already understand that language.

What I find interesting is that most projects have communication plans that clearly identify who needs what information, how it should be shared, and why. However, this logic is rarely applied when writing evaluation reports. Different stakeholders have very different needs depending on how they interact with the project, and a single report cannot effectively serve all of them.

To me, evaluation reports should be seen as a starting point, not the final product. The data and findings should serve as inputs into multiple targeted knowledge products, developed by communication experts or sector specialists (health, education, economic development, etc.), that speak directly to specific audiences. These tailored outputs would translate evaluation insights into formats and messages that are relevant and actionable, supporting implementation, positioning, and donor engagement.

In addition, while I understand the rationale behind lean data approaches, they can sometimes be too restrictive. Focusing only on what is needed for indicators or donor requirements may close off opportunities to explore emerging issues or strategic areas in greater depth. Sector specialists could play a role here by collecting additional data for learning, positioning, thought leadership, or future programming, as long as there is clear accountability for why that data is being collected and how it will be used. Every question has a cost, but it can also bring real value if it is asked intentionally.

Overall, if evaluation is meant to support transformation, it is not just about improving methods or tools. It is also about making sure that the knowledge we produce is usable, relevant, and accessible to the people who need it. We need to better understand our audiences and what they need.