How to plan and prepare evaluations?

The international initiative BetterEvaluation (betterevaluation.org)1 has defined a framework that structures general guidance about the evaluation process into seven colour-coded clusters: the so-called Rainbow Framework, shown in Figure 3 below.

Figure 3. Rainbow framework for evaluation (source: BetterEvaluation 2014).

For each cluster, the framework explains the different options (methods or processes) that can be used for each task in an evaluation. It was designed as a planning tool that can be used to: “commission and manage an evaluation; plan an evaluation; check the quality of an ongoing evaluation; embed participation thoughtfully in evaluation; develop evaluation capacity”.

1 BetterEvaluation is an “international collaboration to improve the practice and theory of evaluation by creating and curating information on choosing and using evaluation methods and processes, including managing evaluations and strengthening evaluation capacity”. Its founders are public bodies from Australia and New Zealand. For more details, see https://www.betterevaluation.org/en/about

Table 2 below presents a brief description of each cluster. For more details, see: https://www.betterevaluation.org/en/rainbow_framework

Table 2. Clusters of the Rainbow Framework (source: BetterEvaluation 2014).

Cluster Brief description
1. MANAGE an evaluation or evaluation system Decide how the evaluation will be managed, including clarifying stakeholders, roles and decision-making processes, and ensure processes for these are transparent and well-managed.
2. DEFINE what is to be evaluated Develop a description (or access an existing version) of what is to be evaluated and how it is understood to work.
3. FRAME the boundaries for an evaluation Set the parameters of the evaluation – its purposes, key evaluation questions and the criteria and standards to be used.
4. DESCRIBE activities, outcomes, impacts and context Collect and retrieve data to answer descriptive questions about the activities of the policy measure, the various results it has had, and the context in which it has been implemented.
5. UNDERSTAND CAUSES of outcomes and impacts Collect and analyse data to answer causal questions about what has produced outcomes and impacts that have been observed.
6. SYNTHESISE data from one or more evaluations Combine data to form an overall assessment of the merit or worth of the intervention, or to summarise evidence across several evaluations.
7. REPORT AND SUPPORT USE of findings Develop and present findings in ways that are useful for the intended users of the evaluation, and support them to make use of them.

The UK department in charge of energy (now BEIS, formerly DECC) has developed an evaluation guide that provides a complementary view of an evaluation plan in eight steps, of which the first six deal with planning and preparing the evaluation, as shown in Figure 4 below.

Figure 4. Steps to plan and undertake evaluations (source: DECC 2011).

The EPATEE toolbox will complement both general sources of guidance (BetterEvaluation’s Rainbow framework and DECC evaluation guide) with practical examples specific to the evaluation of energy efficiency policies, based on the EPATEE case studies and other sources.

Feedback collected through the EPATEE case studies shows that the preparation of the evaluation is key to its success.

Example: quote from the presentation by Michael Aumer on the evaluation of the Austrian Environment Support Scheme (Thenius and Böck 2018, pp. 5-7):

To ensure that the evaluator has a good knowledge of the scheme, the evaluation customer has to gather all the relevant information, which may be spread across various sources and over time. Checking and sorting this information is often needed to make it clear and usable for the evaluator. This should then be complemented by discussions with the evaluator during the evaluation, when further clarifications might be needed. This work on the side of the evaluation customer can be time-consuming. But experience shows that it is essential for the analysis to be consistent with the actual implementation of the scheme (and not disconnected from the ground). Moreover, Michael Aumer emphasised that this also provides policy officers with a better understanding of the programme.