Checklists, Boxes and Tables

Checklists

2.1 Checklist for assessing study quality. 56
3.1 Checklists for evidence reviews and systematic maps. 80
3.2 Checklist of quality criteria for meta-analyses. 89
11.1 Checklist for assessing the extent of evidence use by organisations. 337
12.1 Checklist of components of evidence-based decision making. 373
12.2 Checklist of evidence use by organisations. 374
12.3 Checklist of eight ‘easy wins’ for leaders to consider enacting. 375
12.4 Checklist for evidence use by knowledge brokers. 377
12.5 Checklist for evidence use by practitioners and decision makers. 378
12.6 Checklist for ensuring reports are evidence-based. 379
12.7 Checklist for philanthropists and funders to encourage evidence use. 381
12.8 Checklist for researchers and educators to support evidence use. 384

Boxes

Box 1.1 Examples of large-scale weak delivery in policy and practice. 6
Box 1.2 A brief history of evidence use. 12
Box 1.3 Advantages of including evidence. 14
Box 2.1 Examples of evidence with associated ISR scores. 41
Box 4.1 General principles for presenting evidence. 98
Box 4.2 Evidence sources for plant reintroduction argument map. 117
Box 4.3 Evidence sources for mind map on snipe management. 120
Box 4.4 A simple example of a Bayesian network. 124
Box 5.1 The serious challenge of relying on experts: three examples. 136
Box 5.2 Process for running a Delphi technique. 155
Box 5.3 Process for running a Nominal Group technique. 156
Box 5.4 Process for running an IDEA group. 157
Box 5.5 Process for running a Prediction Market. 159
Box 5.6 Process for running a Superforecasting group. 160
Box 5.7 A simple process for judging the veracity of a statement. 161
Box 5.8 A simple process for selecting options. 162
Box 5.9 A simple process for estimating numeric values. 163
Box 6.1 Principles and methods for working with local and Indigenous communities. 182
Box 6.2 Stakeholder mapping and analysis. 186
Box 6.3 Examples of community engagement. 191
Box 7.1 Diagnosing declines: vultures on the Indian subcontinent. 207
Box 7.2 A widely used approach for horizon scanning. 210
Box 7.3 Typical process for scenario planning. 213
Box 7.4 Typical process for solution scanning. 216
Box 7.5 Creating a research agenda of questions for policy and practice. 225
Box 7.6 Designing PICO (population, intervention, control, outcome) questions. 226
Box 8.1 The decision-making process. 238
Box 8.2 Clarifying objectives. 246
Box 9.1 General principles for presenting evidence. 274
Box 9.2 Creating a learning agenda. 277
Box 9.3 Preparing an evidence-based plan. 279
Box 9.4 Means by which funders ask applicants about the evidence underpinning the proposed actions. 290
Box 10.1 Details to include in publications to enable data to be included in evidence collations. 325
Box 11.1 Possible elements of an evidence-use plan. 338

Tables

Table 1.1 Summary of studies looking at inefficiencies or potential gains in investment from using evidence. 16
Table 1.2 How the components of decision making shown in Figure 1.4 become more precise as thinking moves inwards around the hexagon towards making a decision. 18
Table 1.3 Suggested questions for determining the extent of good practice. 21
Table 2.1 Some common distinguishing features that can be used to classify different types of evidence. 34
Table 2.2 Some examples of evidence and their suggested classification based on Table 2.1. 35
Table 2.3 Criteria for classifying evidence weight scores, as shown in Figure 2.1. 37
Table 2.4 Types of financial costs and benefits of conservation interventions. 44
Table 2.5 Comparison of the effectiveness of six experimental and quasi-experimental methods. 58
Table 2.6 A classification, with examples, of the elements of most statements. 61
Table 4.1 Suggested possible content, with examples, for presenting different means of searching for evidence. 98
Table 4.2 The terms used to describe study designs in Conservation Evidence summaries. 101
Table 4.3 Conversion of weights of single pieces of evidence (from multiplying three axes) into descriptions of evidence strengths. 107
Table 4.4 Converting the combined evidence into statements of the strength of evidence. 108
Table 4.5 The Strategic Evidence Assessment model. 109
Table 4.6 Terms suggested by the IPCC (2005) for referring to probabilities. 110
Table 4.7 Examples of ‘weasel’ terms whose likelihood is ambiguous. 110
Table 4.8 The main elements of summarising evidence described in this chapter, each with an illustrative sentence. 113
Table 4.9 Example of tabular presentation of evidence for a proposed project that plans to introduce natural grazing with ponies to the montado habitat in Iberia to increase biodiversity. 115
Table 4.10 Summary of current evidence for analytical questions relating to the theory of change shown in Figure 4.7. 123
Table 4.11 A conditional probability table for the Sprinkler node given the three states of the Weather node. 125
Table 5.1 Some of the most common sources of bias. 140
Table 5.2 Summary of strategies for improving individual experts’ judgements. 145
Table 5.3 Summary of strategies for improving group judgements. 152
Table 6.1 Types of interactions with communities. 184
Table 6.2 Groups that may be impacted by interventions and other key figures. 185
Table 6.3 An example of a stakeholder analysis. 188
Table 7.1 Potential actions concerning wide-scale and local changes. 202
Table 7.2 Examples of diversity within environmental horizon scanning. 209
Table 8.1 Summary of tools described in this chapter. 240
Table 8.2 A list of fundamental questions that can be used to quickly sketch a decision. 244
Table 8.3 A consequences table for seven different river management options (including no change, with the river continuing to deteriorate) assessed under six criteria. Evidence is shown in relation to the current status. 252
Table 8.4 As for Table 8.3 but with options C (expensive), F (flooding) and the tourism criterion (not important) all removed. 254
Table 8.5 As for Table 8.4 but with dominated options B and E removed, along with water quality (as it no longer differs). 254
Table 8.6 As for Table 8.5 but with D’s moderate gain in fish swapped for a 2% reduction in flood risk and G’s considerable benefits in fish swapped for a 5% reduction in flood risk. 256
Table 8.7 As for Table 8.6 but with A’s increase in flood risk considered equivalent to a $5 cost, D’s reduction in flood risk considered equivalent to $290 savings and G’s equivalent to $75 savings. 256
Table 8.8 The consequences table with the preferred option D but with a new option (H) added, which is now considered the overall preferred option. 257
Table 8.9 Strategy table for an imaginary series of programmes. 258
Table 9.1 Examples of wording to describe different evidence support when omitting evidence sources. 275
Table 9.2 Evidence Use Capability Maturity Model. 276
Table 9.3 Evidence used during different stages of creating agricultural schemes to benefit biodiversity, including whether the necessary information is already collated and easily available (YES, PARTLY, NO) and how evidence gaps were filled. 281
Table 9.4 Example text describing a range of approaches for evidence checking. 292
Table 10.1 The different stages of a project life cycle (see Figure 10.1) at which data can be collected, with examples of the type of data at each stage. 310
Table 11.1 A taxonomy of reasons for project failure. 344
Table 11.2 Summary of learning from failure methods. 348
Table 11.3 Identifying significant species within Ingleby Farms using the IUCN Red List, national lists and the status on farms. 352
Table 11.4 The Whitley Fund for Nature evidence summary table used for shortlisted projects. 361