Impact & Reports

Evidence that stays close to community realities.

S N Ath Na Lionta publishes practical reporting on who is reached, what changes over time, and how resources are stewarded. Our impact work is designed to help families, funders, partners, and public bodies see both the scale and the texture of delivery.

Quarterly dashboard reviews · Outcome and finance summaries · Mixed-method evidence

Annual Highlights

2025 results at a glance

The latest reporting cycle shows stronger retention, more consistent referral pathways, and wider geographic coverage while maintaining a disciplined approach to governance and disclosure.

4,860 participants reached across children, parents, carers, and youth cohorts.

87% retention across programmes where sustained engagement is needed for outcomes.

19 counties served through direct partnership agreements and local referral routes.

Impact Story

How reporting translates practice into public accountability

Each programme site collects baseline information, attendance and referral records, and participant feedback. This evidence is reviewed alongside practitioner notes so that published reports reflect lived experience as well as headline numbers.

That means a report from S N Ath Na Lionta does not stop at outputs. We track whether services became easier to reach, whether school attendance stabilised, whether families felt more confident navigating supports, and whether local partners could respond faster because the network became more joined up.

  • Monthly operational reviews compare delivery against programme plans and referral demand.
  • Quarterly outcome dashboards highlight movement in attendance, confidence, and participation indicators.
  • Annual reports combine quantitative measures with case studies from partner communities.

Measurement Framework

Shared indicators across learning, wellbeing, and participation

Our reporting framework is designed to be clear enough for public interpretation and rigorous enough for funder review. Metrics are set at programme level, then reconciled into a common dashboard so trends can be compared across sites.

91% of enrolled learners improved attendance within two terms.

74% of families reported stronger confidence engaging with local services.

63% of youth participants completed a community action project.

What appears in every report

  • Reach, retention, and referral figures by programme strand.
  • Baseline-to-exit changes where participant consent allows longitudinal tracking.
  • Partner commentary on implementation quality, timing, and access barriers.
  • Risk notes covering data quality, delivery constraints, and unmet demand.
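As a sketch of how programme-level figures might be reconciled into a common dashboard row, as described above — illustrative only; the field names, weighting choice, and `reconcile` function are assumptions for this example, not the published methodology:

```python
# Illustrative only: combining per-programme reach, retention, and referral
# figures into one dashboard row. Field names are hypothetical.

def reconcile(programmes):
    """Aggregate per-programme metrics into a single dashboard summary."""
    total_reach = sum(p["reach"] for p in programmes)
    retained = sum(p["reach"] * p["retention"] for p in programmes)
    return {
        "reach": total_reach,
        # Retention is weighted by reach, so larger programmes count more.
        "retention": retained / total_reach if total_reach else 0.0,
        "referrals": sum(p["referrals"] for p in programmes),
    }

dashboard = reconcile([
    {"reach": 1200, "retention": 0.90, "referrals": 340},
    {"reach": 800, "retention": 0.82, "referrals": 150},
])
print(dashboard)  # {'reach': 2000, 'retention': 0.868, 'referrals': 490}
```

Weighting retention by reach keeps a small programme with unusually high (or low) retention from skewing the cross-site trend.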

Community Evidence

Case evidence from local delivery settings

Report pages are strongest when they connect hard numbers to real operating environments. The examples below reflect the kinds of settings and group activity that inform our published evidence base.

Family support outcomes

Reports show where joined-up family support improved attendance planning, reduced missed appointments, and shortened the time between referral and first meaningful contact.

Referral responsiveness · Service uptake

Youth leadership progression

Participant reports combine completion data with confidence, teamwork, and civic participation measures to show whether leadership work is translating into local action.

Project completion · Leadership confidence

Neighbourhood access mapping

Site reports document where transport, food access, or digital exclusion create barriers and which practical interventions helped families remain engaged.

Barrier tracking · Access coordination

Place-based reporting

Local context matters. Reports include narrative snapshots that explain how geography, transport, school relationships, and partner capacity shape results in each area.

Context notes · Site comparison

Financial Reporting

Income and frontline spend are published for comparison

Our reports track revenue growth, direct programme expenditure, and the share of funding that reaches frontline delivery. Variance is reviewed each quarter so annual reporting is consistent with operational records.

2025 financial snapshot

Illustrative totals presented in the same public-facing format used across the site.

€2.8m total income stewarded through the reporting year.

€2.1m in direct programme costs supporting frontline delivery.

75% share of spend directed to programmes and community delivery.

Download Centre

Reports available on request

  • Annual impact report with programme and finance summary.
  • Quarterly dashboard packs for funders and public partners.
  • Monitoring and evaluation framework overview.
  • Procurement and delegated authority disclosures.

Reporting Practice

How to read the data

  • Figures are reviewed against partner records before publication.
  • Outcome reporting uses programme-specific logic models and shared KPI definitions.
  • Qualitative evidence is anonymised and edited to protect participant privacy.
  • Trend data is presented with context notes where delivery conditions changed materially.

Assurance

Board and leadership oversight

  • Quarterly dashboards are reviewed by leadership and board committees.
  • Material risks, variances, and data gaps are logged and tracked to closure.
  • Community feedback informs changes to indicators and reporting language.