By Edward Beswick, Research Coordinator, Generations For Peace Institute
Generations For Peace (GFP) is a volunteer-led peace-building organisation dedicated to sustainable conflict transformation at the grassroots in communities across Africa, Asia, and Europe. During the last week of November 2015, we at GFP hosted an Advanced Training for 39 volunteers from ten countries. This Training was aimed at further improving volunteers’ peace-building skills and sharing successes, best practices, and lessons learnt from the programmes they have implemented over the past year. The Training included sessions on conflict analysis, programme design, and monitoring and evaluation, as well as the use of sport, art, and advocacy in peace-building.
One of the sessions covered Participatory Evaluation (PE), a technique that volunteers used to assess their programmes. The evaluation model we use at GFP is heavily informed by participatory theory, and achieves a great deal in terms of handing evaluative control over to those who designed and implemented the programme. Now that many of our volunteers have tried PE, we can collect their feedback and assess what works and what can be improved. This post will look at what advantages PE provides and why we chose it, before considering our volunteers’ reflections on the highs and lows of their PEs. The feedback at Advanced Training showed that the PE model has had significant successes. But, at the same time, some changes are needed. To make those changes, GFP needs to keep listening to the people who organised and carried out the PE – our volunteers.
Participatory evaluation processes grew in reaction to traditional, top-down forms of evaluation. Whereas the latter is ‘expert’-led and removed from local contexts, the former seeks to embed itself in the community and aims to produce simpler, more informative results. Participatory methods of evaluation are premised on the idea that it is the community that knows what is best for them – it is their reality, not that of the external ‘expert’, that matters. Based on this theoretical grounding, GFP first adopted its specific PE model in 2013; as a volunteer-led organisation, it made little sense to focus solely on external assessments. PE was chosen so that volunteers could bring together the main groups involved in or connected to their programme to discuss what happened and why. A PE enables volunteers themselves to evaluate the programme that they designed and implemented.
The GFP PE model consists of three parts: ‘PE planning’, the ‘PE day’, and the ‘write up and sharing’. The ‘PE planning’ involves organising attendance, a venue, materials, and logistics. The ‘PE day’ brings together separate focus groups – made up of volunteers, programme participants, the wider beneficiary community, and key stakeholders – followed by a large group discussion where the focus groups’ findings are shared and discussed. Finally, the ‘write up and sharing’ consists of compiling summaries of the findings and then distributing them amongst the volunteers, the wider community, and GFP Headquarters (HQ). Guidance for each of these steps is provided by HQ staff and through GFP’s Programming Framework. While staff provide detailed explanations, technical support, and mentoring as needed, the Programming Framework serves as a detailed reference on how to carry out the PE, guiding volunteers through each step. It provides a background to PE, tips on how to conduct it, and an explanation of what it aims to achieve – an excellent resource that volunteers can consult throughout the process.
Our model achieves a lot in terms of increasing involvement, expanding decision-making power, and building the capacity of the volunteers. Yet the model is new, informed heavily by participatory theory but lacking in practical lessons. However participatory it is on paper, it is what happens in the field that really counts. At GFP we know that it is essential for us to listen to our volunteers and learn from their experiences. They produce a wealth of practical knowledge, and one of our big strengths as an organisation is that we are able to tap into this information so that we can learn, adapt, and improve. This is where Advanced Training comes in.
At the session, volunteers were asked to think about the highs and lows of their PEs. The highs show that the PE achieves many of the goals it sets out to achieve. Volunteers spoke of the PE being a collective learning experience that brought the community together and allowed everybody to find out more about the results of their programme. It was also seen as a great leveller, as it gave everyone a voice to express themselves and speak honestly. In terms of its use, the volunteers highlighted how it provided immediate results that enabled them to plan for the future – as one volunteer put it, after the PE ‘we know where we are’ with our programmes. Finally, the PE was cast as a rewarding experience that demonstrates both the programme’s positive impacts and its areas for improvement. This in turn motivated volunteers and community members to continue with their peace-building work. All in all, these highs show that the PE is a positive and useful process that brings everyone together, giving everyone the opportunity to share and learn. This is backed up by the mentoring of HQ staff and the detailed information provided in the Programming Framework.
Yet, while the highs reveal the model’s many successes, the lows revealed that some tweaking is needed. Low points included poor attendance, difficulty with the questions asked at the PE, and the challenge of sticking to timings, either due to people arriving late or the PE lasting too long. All these points reveal that certain hurdles encountered in the field are preventing the model from reaching its full potential. Without representative attendance, the PE cannot be an inclusive consultation of all those involved. If the questions asked are not being understood, or do not seem relevant to certain groups in attendance at the PE (as some volunteers pointed out), then this harms the usefulness of the knowledge produced by the process. The questions asked at a PE are vital, as they shape what it is able to discover. Finally, there are timings – a difficult area to improve upon, but one that is still essential to a PE’s success. If half the participants show up late, or if the PE runs out of time, then this undermines people’s ability to take part and make a contribution.
Clearly, from the feedback, the GFP PE model achieves a great deal: it is informative, inclusive, and produces useful results that help with the design of future programmes. It is also rewarding for our volunteers, as it demonstrates what they achieved with their programme. Yet the feedback shows that some important changes are in order. Now that it has been collected, GFP can build on the practical experience and knowledge of its volunteers to improve the PE model. This kind of constant improvement and change is essential to who we are as a learning organisation. The process has already begun, and will continue, so that we are able to implement a PE model that is malleable and attuned to local needs, as well as the needs of GFP HQ. The first step is to gather all the feedback we have, sit down, and fine-tune the PE process so that it responds to the practical concerns of our volunteers. As a first attempt, the process has shown important successes, but there is always room for improvement. In keeping with participatory processes, we believe that the more people we consult about what needs to change, the better those improvements will be – and the sessions at Advanced Training 2015 allowed us to carry out that consultation.
*Originally published on Peace Portal.*
Sign up to our e-newsletter to learn more about the impact of our programmes in the Middle East, Africa, Asia and Europe.