Adaptive Solution Team

How should an evaluator respond to unexpected shocks? We have devised a framework to guide you through the process.



1. The problem

Unexpected crises like the COVID-19 pandemic, wars, and natural disasters pose a challenge to everyone, evaluators included. When facing shocks like these, evaluators must ask themselves questions like the following:

  1. How does this crisis affect the assumptions of my Theory of Change?
  2. Do data access limitations force me to modify the scope of my evaluation?
  3. Does the crisis force me to change my research question or hypothesis? (related to the previous question)
  4. Do I have to change my data collection strategy and the logistics of implementing my pilot?

Revising an evaluation and answering questions like these can be done on intuition alone, but that risks missing crucial steps for lack of a systematic approach. A systematic approach is exactly what our solution offers.

Eval Hack Members – Nqabutho Nyathi, Norman Rodriguez, Paul Maluful, Edward Williams, Michael Stephens

2. The framework

We call our framework 'the four Rs' (4R). These four Rs are:

  1. Rethink – Rethink the evaluation system/process
  2. Remote – Since it will be difficult to physically obtain data, remote data collection must be considered (that does not mean only online forms and surveys)
  3. Review – Evaluators must be able to review and synthesise existing data and knowledge (includes conventional hand-collected data and big data)
  4. Report – Evaluators need to prepare informative reports to their stakeholders (what has changed as a response to the crisis, why, and what to expect next)

The approach can be visualised like this (the second circle corresponds to BetterEvaluation's rainbow framework, which is ideal to use in tandem with the 4Rs):

[Figure: The 4R framework]
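If it helps to make the cycle concrete, here is a minimal sketch in Python (our own illustration; the class, step keys, and usage are hypothetical, not part of the framework materials) that treats one pass through the 4Rs as a checklist an evaluation team can fill in:

```python
# Illustrative sketch only: one pass through the 4R cycle as a checklist.
from dataclasses import dataclass, field

@dataclass
class FourRReview:
    """Records how a single evaluation was adapted at each of the four Rs."""
    evaluation: str
    notes: dict = field(default_factory=dict)

    # Plain class attribute (unannotated), so the dataclass ignores it as a field.
    STEPS = {
        "rethink": "Rethink the evaluation system/process",
        "remote": "Consider remote data collection, not only online forms",
        "review": "Review and synthesise existing data and knowledge",
        "report": "Report changes, reasons, and outlook to stakeholders",
    }

    def record(self, step: str, note: str) -> None:
        """Note the adaptation made at one of the four steps."""
        if step not in self.STEPS:
            raise ValueError(f"Unknown step: {step!r}")
        self.notes[step] = note

    def pending(self) -> list:
        """Steps of the cycle not yet addressed in this review."""
        return [s for s in self.STEPS if s not in self.notes]

# Hypothetical usage: record one adaptation, then list what remains.
review = FourRReview("Pilot programme evaluation")
review.record("rethink", "ToC assumption on mobility no longer holds")
print("Still to do:", review.pending())
```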

3. Data gathering alternatives

As part of your adaptive process, you might want to reach populations that are geographically dispersed or not very interconnected. Several remote data-gathering channels can work here, each with its own trade-offs.
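One practical concern with any remote channel is that well-connected groups tend to answer far more often than hard-to-reach ones. As a hedged illustration (our own example; the group names and shares are hypothetical), this minimal Python sketch applies post-stratification weighting, re-scaling each response by the ratio of a group's known population share to its share of the sample:

```python
# Illustrative sketch: post-stratification weighting to reduce bias when a
# remote survey over-represents well-connected respondents.

# Known shares from, e.g., census or programme records (hypothetical numbers).
population_share = {"urban": 0.40, "rural": 0.60}

# Remote survey responses: rural respondents are under-represented here.
responses = [
    {"group": "urban", "answer": 1},
    {"group": "urban", "answer": 0},
    {"group": "urban", "answer": 1},
    {"group": "rural", "answer": 0},
]

# Observed share of each group in the sample.
n = len(responses)
sample_share = {
    g: sum(r["group"] == g for r in responses) / n for g in population_share
}

# Weight = population share / sample share, so the weighted sample
# matches the known population structure.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

weighted_mean = sum(r["answer"] * weights[r["group"]] for r in responses) / sum(
    weights[r["group"]] for r in responses
)
print(f"Unweighted mean: {sum(r['answer'] for r in responses) / n:.2f}")  # 0.50
print(f"Weighted mean:   {weighted_mean:.2f}")                            # 0.27
```

The same idea extends to more groups and richer estimators; the key input is a reliable external source for the population shares.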



Adaptive Evaluation

Inclusive and adaptive evaluations in times of COVID-19

During crises such as COVID-19, evaluation teams need to rely on remote data collection methods. This entails intrinsic potential biases against hard-to-reach populations that need to be mitigated through innovative methods and tools. What methods and tools can evaluators use to ensure that hard-to-reach populations are not left behind in evaluations undertaken during crises?

COVID-19 unveiled an invisible thread linking methodological challenges, the need for equity, and "do no harm". To be effective, evaluations have to adapt and respond to each of these challenges. What concrete and practical tools can we develop by looking at the intersection of methodological challenges, "do no harm", equity, and innovation? What can we learn and immediately apply from evaluation methods designed to operate in fluid and uncertain conditions and with imperfect information?
