Rob Jackson

The wrong metrics can harm your volunteer programme

Last month I wrote about ways you can tell if a chief executive really gets volunteering. One of the issues I highlighted was what leadership asks for by way of reporting data on a volunteer programme. I want to expand on this point.

A while ago I heard a volunteer manager speak proudly about finally getting their senior management team to accept a key performance indicator on volunteering on their management scorecard. The KPI in question looked at how many volunteers the organisation had in total.

To me, that measure is probably worse than no measure at all. Measuring the total number of volunteers risks management falling into two significant traps when considering volunteering:

  • It encourages a focus on numbers, which inevitably results in a mindset that more volunteers is automatically a good thing. If an organisation can achieve the same or a better outcome without taking on more volunteers (and the associated time and cost), isn’t that the more efficient thing to do? Similarly, a focus on volunteer numbers means a focus on always getting more volunteers, with recruitment becoming the overall priority. Attention to other aspects of volunteer management can then wane, resulting in a revolving door where people leave faster than new volunteers join.
  • Just as it risks more volunteers being seen as the goal, such a KPI risks fewer volunteers being seen as failure. So the volunteer manager ends up never wanting to lose a volunteer. This can result in poor-performing or disruptive volunteers being kept on when they should be let go. It can also mean volunteers aren’t given the flexibility to step back when other demands on their time arise, which can result in them never coming back in future.

A much better approach would be to link a volunteering KPI to some kind of strategic goal for the organisation, enabling the senior management team to see how volunteers are contributing to the mission and vision. My current favourite is a KPI that measures what percentage of volunteers would recommend volunteering to their friends and family. It rightly focuses on the experience volunteers have: if it’s good, they will be more inclined to speak favourably about their volunteering; if it’s bad, far less so.

It isn’t perfect, but it’s better than a bums-on-seats approach.

What organisation leaders ask for by way of reporting speaks volumes about the importance they place on volunteers and how well they understand volunteering. Similarly, what volunteer managers propose they measure has major implications for their work and how well volunteering is understood within the organisation.

Tread carefully. Not every KPI is a good KPI.

If you have suggestions or examples for KPIs that your organisation uses on volunteering or that you think organisations should use, please leave a comment below.

  • David Coles

    Hi Rob, great blog.

    At the LSE Volunteer Centre we have recently had similar conversations. Many wish to count hours as a key indicator, something that we don’t do. Like bums on seats, hours are only an input: they don’t show what the volunteer has achieved. I would much prefer someone did a piece of work in one hour than five. Recording hours is a very laborious process as well.

    95% of LSE students who volunteer would recommend it to their friends, a number we’re really pleased with. We also want to know how many students volunteer; at the moment it is 42%.

    We also look to see whether they think their skills, confidence and employability opportunities have increased on the back of their volunteering.

    With our charity partners we ask if they would recommend LSE Volunteer Centre to others and what impact the student volunteer made on their work.

  • Rick Benfield

    Hi Rob, hi David – a good discussion. Whilst both your proposed metrics are valid in terms of measuring the impact on the volunteer (i.e. did they have a good experience or improve their skills, and are therefore likely to either donate or encourage others to volunteer at the organisation), they don’t get to the heart of whether the volunteer had a positive impact on the organisation.

    I would suggest a similar dual measure, but focused on asking staff in the department in which the volunteer worked whether the volunteer made a positive contribution/met objectives. This could be on a scale of, say, 1-10, with 10 = exceeded expectations (went above and beyond their original remit and really made a difference), 5 = met expectations (did everything that was asked of them), and 1 = did not meet expectations (they did not complete or fulfil the tasks adequately). The target would be an average score above 5.

    By tracking both, you get simple and easy KPIs on the impact on the organisation (i.e. are our volunteers actually providing a benefit to us?) and the impact on the volunteer (i.e. their experience and whether they would come back). This simple 1-10 measure can be equally applied to ‘community volunteering’ and ‘skills-based volunteering’ (provided simple objectives are set – which is good practice for any task).

    • David Coles

      Good points, Rick! In our partner survey we ask how satisfied they were with the student volunteer. It might be good to change it to a ‘meeting expectations’ scale next year.