Evaluation sits at the centre of the Fulfilling Lives (Multiple Needs) programme, and here the workforce is on the frontline. It is they who source, collate and complete the two key measures – the Homelessness Outcomes Star and the NDT Assessment – with service beneficiaries. Frontline workers say that one thing they enjoy about the programme is the move away from a purely target-driven approach. To make the most of this, we need to make sure that the essential evaluation is not a chore: not tasks completed as an add-on to a day’s work, but ones that are integral to the way of working alongside beneficiaries and peer workers.

Thus, we need to know what it is about the projects that captures the imagination and creativity of practitioners. What is it they like about the work alongside beneficiaries that will secure as full an evaluation picture as possible? My contention is that the reward lies in knowingly being part of a virtuous circle – one which takes theory into practice, evaluates that practice and thereby shapes a better informed and improved theory that can go on to influence wider practice. Pride – for both frontline worker and service user – lies in curiosity satisfied. Across the Fulfilling Lives (Multiple Needs) programme, then, evaluation of just how each individual beneficiary achieves their aspirations should accumulate, lead to system change and improve the lives of many people, families and communities with multiple and complex needs. Evaluation, and the curiosity that fuels it, is vital to generating the positive feedback loop on which the projects depend for their results.


In my experience, curious practitioners have:

  • Clarity of programme and project purpose
  • Detailed understanding of their job role and responsibilities
  • The skills and training required to make good use of the evaluation tools
  • Time to collect the data and information needed
  • An ability to make evaluation an engaging part of their relationship with beneficiaries
  • Involvement in the analysis and assessment of data and information
  • Encouragement to bring their wider knowledge and experience to support and/or challenge data and information
  • Support to ask difficult questions about the reliability of data and information
  • Knowledge and understanding of the common pitfalls in research and evaluation
  • Feedback on what the data and information is revealing across projects


Essentially, the outcomes of evaluations should help rather than hinder, be practical for practitioners and challenge in a way that stimulates further curiosity. Evaluations must be inquisitive about the interventions and approaches: the more we know about what works, and why, the better. In this way the best workforce can be recruited, trained, supported and retained. What follows is not just an effective service but one that is sustainable and replicable, because it is founded on research. For, as we all know, “research is formalised curiosity. It is poking and prying with a purpose.”[1]


What we need to know from project beneficiaries, frontline workers and their managers is what will assist us all to undertake really effective evaluations. How can we make sure we make the best use of the evaluation tools we have? Do drop me a note at vic.citarella@cpea.co.uk or call my mobile on 07947 680 588.

[1] Zora Neale Hurston