
RE: Beyond the final report: What does it take to communicate evaluation well?

Posted on 07/11/2025
  • Start communication at the preparatory stage
    • Hold an early alignment discussion with the evaluand and commissioning organization to identify learning priorities, sensitivities, and areas where clarification may be needed.
    • Ask project teams to share what they hope to learn from the evaluation. This respects their experience and allows the evaluator to understand what findings could be most useful to them.
    • Example: Circulate a simple pre-inception survey asking, “What are three lessons you believe would be most valuable from this evaluation?” and use these inputs to refine evaluation questions.
  • Clearly define roles and expectations of all stakeholders
    • Even though the evaluation remains independent, all parties must understand their roles. The evaluand’s responsibility may include coordinating access to project data, arranging interviews, and supporting field visit logistics.
    • Example (evaluand): The Ministry focal person supports scheduling interviews with district officials and ensures necessary approval letters are issued.
    • Example (other actors): Local partner NGOs prepare community entry protocols, informing chiefs or community leaders so that visits are welcomed rather than abrupt.
    • Beneficiaries should also be informed in clear, non-technical language about the evaluation’s purpose and how they will participate.
    • Example (beneficiaries): Before interviews, data collectors read a short script explaining confidentiality and that their responses will not affect their eligibility for future support.
  • Ensure communication is ongoing and two-way
    • Communication should not be limited to inception and final dissemination. Instead, set up regular structured touchpoints to review progress, clarify emerging issues, and resolve challenges.
    • Example: Agree to a 30-minute weekly check-in call with the evaluand to confirm upcoming interviews, check document access status, and surface any concerns early.
  • Strengthen internal communication within the evaluation team
    • The commissioning organization should clearly assign roles such as team leader, qualitative lead, quantitative analyst, report writer, field coordinator, and liaison officer.
    • Do this explicitly in both contracts and kickoff meetings, not informally.
    • Example: To avoid duplication, designate one person to contact implementing partners. If two team members contact the same partner separately, it can appear disorganized and cause confusion.
  • Provide training and familiarization with communication platforms before the evaluation begins
    • Do not assume all team members are familiar with the chosen platforms (e.g., Slack, MS Teams, Dropbox, Trello).
    • Allocate a dedicated learning period before the inception phase to ensure everyone understands how to use the tools effectively.
    • Example: If Teams will be used, conduct a live session demonstrating how to:
      • Upload and version-control documents
      • Share screens during interviews
      • Use channels to separate field logistics from analysis discussions
  • Designate one central point of contact throughout the evaluation
    • To avoid mixed messages, identify a single individual responsible for all communication to and from the evaluation team.
    • Example: The Evaluation Team Leader serves as the only authorized point of communication with the donor. Other team members route queries internally first to avoid inconsistent messaging.
  • Conduct a preliminary findings sharing session immediately after data collection
    • Present emerging themes, not final conclusions. This invites clarification and ensures interpretations reflect on-the-ground realities.
    • Example: If data suggests a drop in project attendance, stakeholders may clarify that school calendars shifted due to strikes or weather events. This prevents incorrect assumptions in the final report.
    • This step helps refine analysis, validate insights, and improve the relevance of recommendations.

Use multiple communication formats to share evaluation findings

  • Evaluation reports should not be one-format-fits-all. Different audiences require different levels of detail and styles of presentation.
  • Text-heavy reports often discourage use and learning, especially among stakeholders who prefer visual information.
  • Practical examples:
    • Full Technical Report: For donors, policy analysts, and researchers who need detail.
    • Easy-to-Read Summary (5–10 pages): For program managers and partners.
    • Infographic-Only Version: For community members, advocacy groups, and general audiences. This can show key results using visuals such as charts, icons, timelines, and outcome pathways.
    • Thematic Digests: Short 2–3 page briefs on individual themes (e.g., gender, youth employment, capacity building), derived from the full report.

This approach improves accessibility, strengthens knowledge uptake, and increases the likelihood that the evaluation will inform decision-making.