
From Report to Reality: Making Insights Practical

Published on October 28, 2025

At ImpactEd, and in the evaluation space more generally, we all know that good insights can change things. But in practice, the difference between a report that sparks real action and one that gathers dust often comes down to how recommendations are designed and shared. We believe that the most valuable outputs are those that help partners see what to do next with their findings: clearly, confidently, and collaboratively.

How can you make recommendations specific and practical?

When we write recommendations, we start with one question: What’s the next step this evidence points to? It’s not enough to summarise what the data shows. We need to translate it into something that feels practical.

Strong recommendations are specific and prioritised so partners can make informed choices. For example, “Establish a peer-mentoring programme for Year 9 pupils, starting with a pilot in two schools next academic year” is far more useful than the broad and generic “Schools should focus on pupil engagement.”

Being clear about who should act and when builds ownership. Adding categories such as “quick win,” “high impact,” or “long-term goal” helps teams decide where to start.

Partner example: During our work with Sumdog over the 2024-25 academic year, teachers observed that pupils sometimes became so engaged with the platform that they rushed through exercises without prioritising accuracy. We reported this in our interim report, and Sumdog was able to take immediate action: they introduced a “You are going too fast” prompt, warning pupils that if they keep rushing they will lose points. By the final report at the end of the school year, we were able to tell Sumdog how much teachers loved the new feature, a great example of an agile feedback loop.

How do you link recommendations to evidence?

Every recommendation should have a clear link back to the data. That connection builds trust and keeps the focus on what the evidence actually supports. Briefly summarising the relevant finding alongside the recommendation, for example, “Based on improved attendance in mentoring interventions…”, helps decision-makers see the logic and gives them confidence to act.

It’s also worth being honest about what we don’t know. Acknowledging limitations doesn’t weaken recommendations; it makes them more credible. If the evidence isn’t definitive, frame the recommendation as something to test: “Consider piloting…” or “Trial an approach to…”

Project example: For our Understanding Attendance project, we created a handbook for participating schools and Trusts, designed to be accessible to any staff member, from Trust leads to SENCOs to Heads of Year. It translates data insights into actionable recommendations through a flexible, choose-your-own-path structure. Schools have commented that they have found the handbook particularly effective during Insights meetings, where the partnership team collaborates with schools to synthesise report findings, identify context-specific priority themes, and select relevant strategies from a research-backed menu of real school examples. Each intervention is tagged by theme (e.g., 'sleep', 'sense of belonging', 'parents and carers') to easily match recommendations to identified needs. Schools work through the handbook to engage with the project's full offer, understanding their data at both macro and micro levels, from Insights meetings and Trust-level reports to individual pupil responses and webinars.

How do you co-create recommendations with partners?

We’ve found that recommendations are far more likely to stick when they’re shaped with partners, not just for them. That means testing ideas, refining wording, and making sure actions feel realistic in their context. Using a partner’s own language or framing their priorities within the recommendations can make a big difference in how they’re received.

In some cases, this might mean sitting down together to rank priorities or adjust timelines. In others, it’s about exploring “what this could look like here” before finalising the report. That collaboration turns recommendations into shared plans for improvement rather than external instructions.

How should you present recommendations in evaluation reports?

Finally, how we present recommendations matters. Simple visuals, summary tables, and concise headlines can make it easier for busy teams to engage. A table with columns for “Action, Who, When, and Evidence Link” is often more effective than pages of text.
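
For illustration, here is a hypothetical row such a table might contain, drawing on the mentoring example above:

Action: Pilot a peer-mentoring programme for Year 9 pupils in two schools
Who: Heads of Year, with support from the Trust lead
When: Start of next academic year
Evidence link: Improved attendance observed in mentoring interventions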

And above all, keep the conversation going. Share draft recommendations, invite feedback, and be open to iteration. Evaluations are most powerful when they don’t just describe what’s happening, but help people move forward with confidence.

Because ultimately, if insights don’t get actioned, what are we doing all this for? Data becomes meaningful when it drives change, and every small, evidence-led step brings us closer to lasting impact.

Want to make recommendations PRACTICAL?
- Prioritise: Identify high-impact or quick-win recommendations
- Realistic: Make sure they are relevant and realistic to implement
- Actionable: Consider who should act and when
- Collaborative: Work with key stakeholders to shape recommendations
- Transparent: Be clear about the evidence underlying your recommendation
- Insight-driven: Make sure your recommendation is driven by the evidence
- Clear: Keep it to the point
- Accessible: Ensure insights and recommendations are accessible to all stakeholders
- Lasting: All of the above should help make a recommendation stick and have an impact

Get in touch

To speak to one of our senior team about how we could support your work, please get in touch.