
Connecting Voices



Our Challenge: 

 

The COVID-19 pandemic has shaken the world and affected how we operate. Our challenge is:

How can the evaluation community remotely use modern technologies to undertake participatory qualitative evaluations in conflict settings?

 

Project stakeholders:

 

The target group for this project is humanitarian agencies, non-governmental
organisations, civil society organisations, government agencies, UN Agencies,
consultancy firms, evaluators, and other interested individuals. 

Their needs are: 

  1. Being able to conduct evaluations remotely when travel to the field is restricted; 
  2. Having the right knowledge about the technologies that can be used; 
  3. Understanding how these technologies can be applied practically in a conflict setting.

 

What's at stake?  

The COVID-19 pandemic will place economic strains on the funding of humanitarian and development programmes. With limited funding, evaluation findings must both provide useful learning and win the confidence of funding bodies regarding their impact on the ground. It is also key that beneficiaries' voices are heard, especially in vulnerable, conflict-affected settings.

 

Our Solution: 

 

We developed a Most Significant Change (MSC) toolbox which consists of the following:

✅ A How-to-Guide on conducting MSC remotely in conflict settings, with checklists

✅ A guide outlining and comparing different modern technologies

✅ Case studies of success stories

 

The Most Significant Change (MSC) technique involves generating and collecting significant change (SC) stories from the field in response to a guiding question. A panel of stakeholders then discusses the stories and selects the one they consider the most significant.

For the purposes of this challenge, the toolbox focuses on evaluation, but it has the potential to be developed into a holistic Monitoring, Evaluation, Accountability, and Learning (MEAL) offering.

Beneficiaries, implementing organisations, evaluators, and funding organisations benefit from the solution.

The assumptions are that:

  • The implementing organisation has an active team in the field 
  • The qualitative data collected for the evaluation will be a supplement to other traditional techniques 
  • There is a budget line dedicated to technology and related infrastructure

For a detailed explanation of how our solution fits the criteria, please view the pitch video below.

 


Our Challenges:

The project was intense but extremely stimulating! Although the size of our team meant there were more brains to collaboratively devise a sustainable solution to the challenge, the timezone differences made it slightly tricky to get all seven members of the Connecting Voices team on a call. To overcome this, we assigned tasks with deadlines tailored to each member's local time.

Furthermore, some of us had family commitments which limited the time that could be dedicated to the project. We had two superhero moms feeding their babies while concurrently joining an interview or typing up our documents. 

 

Our takeaways and learnings: 

1/ Design thinking

2/ How a Hackathon is organised

3/ Collaboration with team members in different timezones

 

What's next?

While we were unable to schedule an interview with Rick Davies, who developed the MSC technique, as of today (13 July) he has connected with us and offered to review our guide and provide comments for improvement. We hope this will help us circulate the toolbox within the evaluation community.

 

Resources that we used:

  • Interviews conducted via Skype or WhatsApp
  • Published and grey literature
  • Social media (including LinkedIn, to connect with interviewees and promote the Hackathon)

 

The Connecting Voices team:

 

Active members: Taruna Gupta, Lisa Nissdal Olin, Sapna Ullal, Diane Audras, Bakori Marbian, Kazuyoshi Hirohata, Ehtisham ul Hassan 

 

Timeline: 

 

We are only two active members in the group chat, but we’re terribly excited to kick-start the Hackathon. We begin with an issue map for our challenge, and rapidly focus on the use of cell phones as a technological tool for remote data collection. A few other members join the group. The MSC method is proposed as the initial framework that we can use for further work.

07.07.2020 14:12

Our first draft challenge definition. We've onboarded three new members as we continue to navigate our way through Rick Davies' 105-page MSC guide. We look into websites specialising in mobile data collection. We've now got a team name: Connecting Voices!

8.07.2020 14:59

We consider applying the MSC framework to other technological tools, and we start thinking about a more specific context. Conflict-affected areas emerge as an innovative choice for MSC application, particularly because the whole process is performed remotely. Some members have worked in these areas too. We start to reach out to academics for interviews in the hope of getting more information about remote use of MSC techniques…

09.07.2020 10:54

Four interviews today! We first get the chance to speak with Hicham Jaoun, MSC trainer at UNICEF. Some of us then jump on a call with Dr Fiona Kotvojs, MSC trainer, followed by another with Dr Nur Hidayati, MSC trainer at Results in Health. The cherry on the cake for our conflict-setting context: an interview with Dr Katie Kraft, co-author of “Most Significant Change in conflict settings: staff development through monitoring and evaluation”.

10.07.2020 09:59

Intense but productive: we draft a proposal for our MSC toolbox, our prototype for the project. It contains three key elements: a How-to-Guide on conducting MSC remotely in conflict settings, with checklists; a guide outlining and comparing different modern technologies; and case studies of success stories. All seven members of our team jump on a call with Facilitator Nicholas, who is very helpful in eliciting further steps we can take in preparation for our final video. We've each been allocated a task to work on until Sunday, and we start drafting a few ideas for the pitch.

11.07.2020 10:01

The last sprint to the finish line! We're finalising the script and putting together our PowerPoint presentation. The How-to-Guide is being polished to reflect as precisely as possible its applications to remote data collection in conflict settings. We learn how to code in HTML for the project page, and play around with fonts and sizes to make it visually appealing. We practise the voice-over for the presentation, present it to family members to gather their feedback, and correct accordingly.

12.07.2020 18:30

We’re putting in the last few touches to our evalhack page and uploading all the links and video work to our Google Drive. We look forward to receiving feedback on our solution!

13.07.2020 12:53

Toolbox final touches: check. Evalhack page final edits: check. Video uploaded to Google Drive: check... ahh, panic, it still needs to go on YouTube! We will get there, adrenaline is high!!! 5 MINUTES TO GO!!!

Launched at Evaluation Hackathon by

jeremiah_ipdet oleg_lavrovsky felix_stips diane_audras lisa_olin brenda_lia_chavez keith_goldstein kazuyoshi_hirohata tarunagpta2

Maintainer jeremiah_ipdet

Updated 13.07.2020 10:23


Data Collection

Alternative data collection if field visits are not possible in times of Covid-19

Due to the worldwide situation caused by the COVID-19 pandemic, international travel is currently impossible and will remain so over the coming months. On-site evaluations therefore cannot be carried out, and we have to look for different means of gathering the necessary data, e.g. with the help of technology. Similar conditions could also arise for other reasons. How can we use modern technologies and existing data to carry out “off-site” evaluations without access to the field, replacing some methods of data gathering with others? What potential, but also what shortcomings, does such an approach have, especially when evaluators' interaction with site residents cannot be replaced?

