Evidence review and synthesis methodology (hereafter referred to as ‘evidence synthesis’) is now in widespread use in sectors of society where science can inform decision making, and has become a recognised standard for accessing, appraising and synthesising scientific information. The need for rigour, objectivity and transparency in reaching conclusions from a body of scientific information is evident in many areas of policy and practice, from clinical medicine to social justice. Our environment and the way we manage it are no exception, and there are many urgent problems for which we need a reliable source of evidence on which to base actions. Many of these actions will be controversial and/or expensive, and it is vital that they are informed by the best available evidence and not simply by the assertions or beliefs of special interest groups. For evidence synthesis to be credible, legitimate and reliable, standards regarding its conduct need to be clearly defined. Such standards include examining possible sources of bias both in the evidence and in the way the review and synthesis are conducted. In so doing, the goal is to provide the end-user with an explicit level of confidence in the findings. Here we present the latest guidelines for the planning and conduct of CEE Evidence Syntheses (separated into Systematic Reviews and Systematic Maps; see below) in environmental management.
The guidelines and standards for CEE Evidence Syntheses (including the planning and review stages) have been adapted from methodologies first developed and established over more than two decades in the health sciences (Higgins & Green 2011). They have been further informed by developments in other sectors such as social sciences and education (Gough et al. 2012) and tested through practice in developing the CEE Library of Evidence Syntheses and the CEE Journal Environmental Evidence. Through undertaking and peer reviewing CEE Evidence Syntheses, researching and adapting existing methodologies, and through analysis of procedures and outcomes, CEE contributors have developed specific guidelines for application to environmental management and the types of data and study designs that are prevalent in environmental research. Whilst past CEE Systematic Reviews and Systematic Maps may provide some guidance, our advice is not to assume that past practices are sufficient for future CEE Standards. This document refers to examples of best practice and CEE is constantly trying to improve standards of evidence review and synthesis.
Although the basic ethos of evidence synthesis is generic, environmental methodologies are often different in nature and application from those in other fields, and this is reflected in these guidelines. At first glance, many of the approaches may seem routine and common sense, but the rigour and objectivity applied at key stages, and the underlying philosophy of transparency and independence, set them apart from the majority of traditional reviews published in the field of applied ecology (Roberts et al. 2006, O’Leary et al. 2016). Evidence syntheses are being commissioned by a wide range of organisations in the environmental sector, and the need for common guidelines and standards, and collaborative development of the methodology, is critical to the formation of an openly accessible and credible evidence base that functions as a public good. We argue that, once more widely established, CEE methodology will significantly improve the identification and provision of evidence to inform practice and policy in environmental management. For this methodology to have an impact on the effectiveness of our actions, more environmental scientists and other stakeholders need to get involved in the conduct of CEE Evidence Syntheses. For those intending to conduct evidence syntheses, these guidelines are provided in the spirit of collaboration, and we encourage you to contribute your work to the CEE, use and improve these guidelines, and help establish an evidence-based framework for our discipline.
Who are these guidelines and standards for?
These guidelines are primarily aimed at those teams intending to conduct a CEE Evidence Synthesis. The structure of the document takes the reader through the key stages from first consideration of the need for an evidence synthesis to the dissemination of the findings. Novice Review Teams should not expect that these guidelines alone will be sufficient support to conduct an evidence synthesis to CEE standards. They are guidelines only and do not replace formal training in CEE methodology.
We hope that these guidelines and standards will also be of use to those considering commissioning a CEE Evidence Synthesis or using its findings, and to stakeholders who may become involved in its planning. In this context the Guidelines provide standards for conducting and reporting syntheses that commissioners and stakeholders can expect to be demonstrated by their authors.
Finally, these guidelines set a standard for the conduct of evidence syntheses and are therefore relevant for decision makers using evidence from CEE and wishing to understand the nature of the CEE process and how it provides a reliable assessment of the evidence.
For clarity of process, the guidelines are split into separate sections. There is obviously considerable overlap between planning, conducting and reporting; we cross-reference as much as possible to avoid undue repetition, but some is unavoidable. We use examples of completed Systematic Reviews and Systematic Maps in the CEE library to illustrate each stage of the process and to highlight key issues. A glossary is provided on the CEE website, but here are some key definitions.
The differences between Systematic Reviews and Systematic Maps, and guidance on how to decide whether to conduct a Systematic Review or a Map, are explained in more detail in the following sections. In essence, both approaches review evidence and start out in the same way, being protocol-based and requiring systematic searches and systematic evidence selection techniques, but their mode of synthesis, analyses and outputs differ. A Systematic Review provides an aggregate answer to a specific question, whereas the output of a Systematic Map is a configurative, descriptive characterisation of the evidence base. Systematic Reviews may be confirmatory and hypothesis-testing, whereas Systematic Maps may be more exploratory and hypothesis-generating, although this is not a rigid distinction.