Systematic reviews of qualitative evidence for environmental policy and management: an overview of different methodological options
Qualitative research related to the human dimensions of conservation and environment is growing in quantity. Rigorous syntheses of such studies can help develop understanding and inform decision-making. They can combine findings from studies in varied or similar contexts to address questions relating to, for example, the lived experience of those affected by environmental phenomena or interventions, or to intervention implementation. Researchers in environmental management have adapted methodology for systematic reviews of quantitative research so as to address questions about the magnitude of intervention effects or the impacts of human activities or exposure. However, guidance for the synthesis of qualitative evidence in this field does not yet exist. The objective of this paper is to present a brief overview of different methods for the synthesis of qualitative research and to explore why and how reviewers might select between these. The paper discusses synthesis methods developed in other fields but applicable to environmental management and policy. These methods include thematic synthesis, framework synthesis, realist synthesis, critical interpretive synthesis and meta-ethnography. We briefly describe each of these approaches, give recommendations for the selection between them, and provide a selection of sources for further reading.
Keywords: Critical interpretive synthesis; Framework synthesis; Meta-ethnography; Mixed methods reviews; Qualitative evidence synthesis; Realist synthesis; Thematic synthesis
Qualitative research related to the human dimensions of conservation and environment is growing in quantity [1, 2] and robust syntheses of such research are necessary. Systematic reviews, where researchers use explicit methods for identifying, appraising, analysing and synthesising the findings of studies relevant to a research question, have long been considered a valuable means for informing research, policy and practice across various sectors, from health to international development and conservation [3, 4, 5, 6, 7].
The methodological development of systematic reviews took off in the 1980s, initially with a strong focus on the synthesis of quantitative data. The exploration of specific methods for qualitative synthesis started to grow a decade or so later [8, 9]. Early examples addressed questions related to the lived experience of those affected by, and the contextual nuances of, given interventions. The methodology for the synthesis of quantitative research appears to have been adapted for environmental management for the first time in 2006 and has been developing since [10, 11]. However, the field still lacks guidance for those producing, or interested in working with, qualitative evidence synthesis.
To date, the vast majority of systematic reviews in environmental management are syntheses of quantitative research evidence that evaluate the effectiveness of an intervention or the impact of an activity or exposure, here called systematic reviews of quantitative evidence. These typically aggregate relatively homogeneous outcome measures from similar interventions or exposures to create a more precise and accurate summary estimate of an overall effect [13, 14].
Current debates about systematic reviews of quantitative evidence in other fields point out that such reviews, while they address essential questions about the magnitude of effects or impacts, cannot help us answer other policy- and practice-relevant questions [15, 16]. In addition, the complexity within studies on the impacts of environmental actions or exposures, and in studies of environmental management initiatives, means that a simple aggregation of study findings may mask important differences and enable us to predict very little about what might happen to whom (human or otherwise) in any given set of circumstances. Here we argue that qualitative evidence syntheses can add value to environmental research and decision-making. Systematic reviews that make use of qualitative research can provide a rigorous evidence base for a deeper understanding of the context of environmental management. They can give useful input to policy and practice on (1) intervention feasibility and appropriateness (e.g., how might a management strategy best be implemented? What are people's beliefs and attitudes towards a conservation intervention?); (2) intervention adoption or acceptability (e.g., what is the extent of adoption of a conservation intervention? What are facilitators of, and barriers to, its acceptability?); (3) subjective experience (e.g., what are the priorities and challenges for local communities?); and (4) heterogeneity in outcomes (e.g., what values do people attach to different outcomes? For whom, and why, did an intervention not work?) [8, 15, 17, 18].
In common with individual quantitative studies, individual qualitative studies may be subject to limitations in terms of their breadth of inquiry, conceptual reach and/or methodology or conduct. Projects that systematically find, describe, appraise and synthesise qualitative evidence can provide findings that are more broadly applicable to new contexts, or explanations that are more complete. Such qualitative evidence syntheses (QES) may stand alone, be directly related to a systematic review of quantitative evidence on one or more related questions, or form part of mixed methods, multi-component reviews that aim to bring two distinct syntheses of evidence together. Broadly, QES approaches enable reviewers to:
conduct syntheses of evidence so as to go beyond questions of effectiveness or impact;
use synthesis to identify explanations for and produce higher levels of interpretation of the phenomena under study;
include rich descriptive and often heterogeneous evidence from different research domains; and
combine and link qualitative and quantitative evidence.
The objective of this paper is to present a brief overview of different methodological options for the synthesis of qualitative research developed in other fields (such as health, education and social sciences) and applicable to environmental management practice and policy. A selection of sources for further reading, including those that expand on how to identify, describe and appraise evidence for QES is also included. Before describing the different synthesis options, we briefly explore the nature of environmental problems and management to explain the context for QES in this field.
The context of environmental policy, management, and research
Environmental and conservation problems are wicked, highly complex, and embedded in ecological as well as social systems [21, 22, 23, 24]. The complexity stems from several sources: (1) a high level of uncertainty; (2) large temporal and spatial scale; (3) cross-sectoral and multi-level spanning; and (4) the irreversibility of potential damages [25, 26]. The loss of global biodiversity or changes in the global climate system [27, 28] can illustrate this complexity: our knowledge about these systems is imperfect, a multiplicity of actors is associated with them (see, e.g., [22, 25]), their impacts span from local to global levels, and the damage potentially cannot be repaired [29, 30, 31]. On top of this, interventions to address these challenges are themselves often complex, in that they are made up of many interacting components and are introduced into, and rely upon, social systems for their implementation.
The dynamic nature and complexity of environmental problems, and of their possible solutions, call for the use and integration of scientific knowledge from several different disciplinary domains. This need is already reflected in the interdisciplinary nature of environmental research, which occurs at the level of theory, methods and/or data [33, 34, 35, 36]. Environmental research is frequently based on observational studies. Studies are commonly developed around well-defined theoretical and geographical boundaries, with the aim of developing a comprehensive understanding of the chosen phenomena. However, this means that such research produces highly heterogeneous evidence scattered across different contexts.
These issues related to the type and nature of environmental evidence imply that systematic review methods need to include a plurality of different approaches. Adding qualitative and mixed methods evidence synthesis to the systematic review toolbox may be vital in cases where context is very important, complexity and heterogeneity are the norm, and a more in-depth understanding of the views and experiences of various actors can help to explain how, why and for whom an intervention does or does not work. These methods can further aid the understanding of the success and failure of environmental interventions through the analysis of implementation factors. Furthermore, they can also help in describing the range and nature of impacts, and in understanding unintended or unanticipated impacts.
What is qualitative evidence synthesis (QES)?
Qualitative evidence synthesis refers to a set of methodological approaches for systematically identifying, screening, appraising and synthesising primary qualitative research evidence. Various labelling terms have been used (see Box 1).
It should be noted here that QES is distinct from two other categories of reviews that have been labelled 'qualitative'. The first category contains narrative summaries of findings from studies with quantitative data, where the original intention was to use quantitative synthesis methods (e.g., meta-analysis) but this was not possible due to, for example, heterogeneity between studies. Review authors in the second category intend from the outset to use a narrative approach to the synthesis of quantitative data. Neither of these two review categories is discussed further here.
Box 1 Definitions and labels
Qualitative research refers to a wide range of different kinds of research studies that tend to collect and analyse qualitative data, organise and interpret the results, and produce findings that are largely narrative in form.
Qualitative data typically refers to textual data (although other types of data, such as visual data, can be produced during the research process). Data are obtained through the recording of, for example, individual or group interviews, or observations of behaviour.
Qualitative evidence synthesis (QES) is an umbrella term that encompasses a set of various methodological approaches for systematically identifying, screening, quality appraising and synthesising primary qualitative research evidence.
Other labels in use include:
Systematic review of qualitative research
Qualitative systematic review
Qualitative research synthesis
An overview of QES approaches
In common with methods for systematic reviews of quantitative evidence, there are a number of stages of the systematic review process which are followed in most QES approaches, including (1) question formulation, (2) searching for literature, (3) eligibility screening, (4) quality appraisal, (5) synthesis and (6) reporting of findings. However, the methods used within each of these stages vary, depending on the specific review approach adopted, with its epistemology and relation to theory.
QES approaches may also vary in the way they address and understand the importance of context, and so can be multi-context or context-specific. Multi-context reviews aim at an exhaustive sampling of the literature to include diverse contexts, e.g., different geographical, socio-cultural, political, historical, economic or ecological settings. Such reviews are currently common in systematic reviews of quantitative evidence. Context-specific QES use selective sampling and focus on only one context, to provide specific understanding to a targeted audience and to develop theories that are specific to the local setting.
Table 1 Overview of the QES approaches discussed in this paper (reconstructed; key references and types of evidence are discussed in the text)

Approach | Key reference | Synthesis aim
Framework synthesis | (see text) | Adapts and/or develops a pre-existing theoretical framework
Thematic synthesis | Thomas and Harden | Develops new explanatory theories and/or conceptualisations
Meta-ethnography | Noblit and Hare | Develops new explanatory theories and/or conceptualisations
Critical interpretive synthesis | Dixon-Woods et al. | Develops new explanatory theories and/or conceptualisations
Realist synthesis | Pawson et al. | Explores the mechanisms which cause interventions to result in specific outcomes in specified contexts
Framework synthesis
Framework synthesis uses a deductive approach and has been used for syntheses of qualitative data alone, as well as by those undertaking mixed methods syntheses [50, 51]. Framework synthesis has been grouped with other approaches that are less suitable for developing explanatory theory through interpretation or for making use of rich reports in study findings. The approach can be seen as one means of exploring existing theories. Framework synthesis begins with an explicit conceptual framework. Reviewers start their synthesis by using the theoretical and empirical background literature to shape their understanding of the issue under study. The initial framework that results might take the form of a table of themes and sub-themes and/or a diagram showing relationships between themes. Coding is initially based on this framework. The framework is then developed further during the synthesis as new data from study findings are incorporated and themes are modified or new themes derived. The findings of a framework synthesis usually consist of a final, revised framework, illustrated by a narrative description that refers to the included studies. The initial conceptual framework is seen as providing a "scaffold against which findings from the different components of an assessment may be brought together and organised" (p. 29). The approach builds upon framework analysis, a method of analysing primary research data that has often been applied to address policy concerns.
Six stages of framework synthesis are generally identified: familiarisation, framework selection, indexing, charting, mapping and interpretation. In the familiarisation stage reviewers aim to become acquainted with current issues and ideas about the topic under study. The involvement of subject experts in the team can be particularly helpful at this stage. The next stage, framework selection, sees reviewers finalising their initial conceptual framework. Here some argue for the value of quickly selecting a 'good enough' existing framework, rather than developing one from a variety of sources. An indexing stage then sees reviewers characterising each included study according to the a priori framework. In the charting stage reviewers analyse the main characteristics of each research paper, by grouping characteristics into categories related to the framework and deriving themes directly from those data. During the mapping stage of a framework synthesis, derived themes are considered in the light of the original research questions and the reviewer draws up a presentation of the review's findings. The interpretation stage, as with much research, is the point at which the findings are considered in relation to the wider research literature and the context in which the review was originally undertaken.
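Purely as an illustration, the indexing and charting stages can be thought of as tabulating study findings against the a priori framework. The sketch below assumes a tiny hypothetical framework and two hypothetical studies (none of these themes, studies or findings come from any cited review); it also shows how framework gaps, areas with no supporting studies, fall out of the tabulation.

```python
from collections import defaultdict

# Hypothetical a priori framework: themes and their sub-themes
# (output of the framework selection stage).
framework = {
    "acceptance": ["trust in authorities", "perceived fairness"],
    "implementation": ["funding", "local participation"],
}

# Hypothetical studies, indexed against framework sub-themes:
# each study's findings are coded to the sub-themes they address.
studies = [
    {"id": "Study A", "codes": {"trust in authorities": "low trust reported",
                                "funding": "short-term grants only"}},
    {"id": "Study B", "codes": {"perceived fairness": "benefit sharing seen as unfair"}},
]

def chart(framework, studies):
    """Charting stage: group study findings by (theme, sub-theme)."""
    table = defaultdict(dict)
    for theme, subthemes in framework.items():
        for sub in subthemes:
            for study in studies:
                if sub in study["codes"]:
                    table[(theme, sub)][study["id"]] = study["codes"][sub]
    return dict(table)

charted = chart(framework, studies)

# Sub-themes with no supporting studies become visible as framework gaps.
gaps = [(t, s) for t in framework for s in framework[t] if (t, s) not in charted]
```

The final mapping and interpretation stages remain interpretative work for the reviewers; the point of the sketch is only that the a priori framework fixes the structure into which findings are charted.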
Framework synthesis is relatively structured and is therefore able to accommodate quite large amounts of data. Like thematic synthesis (see below), researchers using this method often seek to provide review output that is directly applicable to policy and practice. The method can be suitable for understanding the feasibility and acceptance of conservation interventions. A variation of the method, the 'best fit' framework synthesis approach, might help if funder timescales are extremely tight. A review by Belluco and colleagues of the potential benefits and challenges of nanotechnology in the meat food chain is a recent example of framework synthesis. Here reviewers coded studies to describe the area of the meat supply chain using a pre-specified framework. Belluco's team interrogated their set of 79 studies to derive common themes as well as gaps: areas of the framework where studies appeared not to have been conducted.
Thematic synthesis
Thematic synthesis draws on methods of thematic analysis for primary qualitative research and is a common approach to qualitative evidence synthesis in health and other disciplines. Examples in the literature range from more descriptive to more interpretative approaches. Findings from the included studies are either extracted and then coded or, increasingly, full texts of the eligible studies are uploaded into appropriate software (e.g., NVivo or EPPI-Reviewer) and coded there. These codes are used to identify patterns and themes in the data. Often the codes are descriptive at first but can then be built up into more conceptual or theory-driven codes. Initial line-by-line descriptive coding groups together ideas from pieces of text within and across the included papers. Similarities and differences are then grouped into hierarchical codes. These are revisited, and new codes are developed to capture the meaning of groups of the initial codes. A narrative summary of the findings, describing these themes, is then written. Finally, the findings can be interpreted to explore their implications for the specific policy or practice question that has framed the review. The method is therefore suitable for addressing questions related to the effectiveness, need, appropriateness and acceptability of an intervention, usually from the point of view of the targeted groups (e.g., local communities, conservation managers). Similar to systematic reviews of quantitative research, this method attempts to retain an explicit and transparent link between review conclusions and the included primary studies. There are only a few examples of reviews in the environmental management field that have explicitly applied thematic synthesis. For instance, Schirmer and colleagues use "thematic coding" (within the approach they call qualitative meta-synthesis) to analyse the role of Australia's natural resource management programs in farmers' wellbeing. Haddaway and colleagues use thematic synthesis to define the term "ecotechnology".
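As an illustrative sketch only (the codes, themes and studies below are hypothetical, not drawn from any cited review), building descriptive codes up into higher-order analytical themes, while preserving the explicit link back to the primary studies, might look like this:

```python
# Line-by-line descriptive codes extracted from (hypothetical) study findings,
# mapped to the studies in which each code occurs.
descriptive_codes = {
    "distrust of park rangers": ["Study A", "Study C"],
    "fear of losing grazing land": ["Study A", "Study B"],
    "pride in local wildlife": ["Study B"],
}

# Reviewer-defined grouping of descriptive codes into analytical themes,
# produced by revisiting and comparing the initial codes.
theme_map = {
    "distrust of park rangers": "relationship with authorities",
    "fear of losing grazing land": "livelihood insecurity",
    "pride in local wildlife": "place attachment",
}

def build_themes(codes, mapping):
    """Aggregate supporting studies under each analytical theme, so every
    theme in the narrative summary remains traceable to primary studies."""
    themes = {}
    for code, studies in codes.items():
        themes.setdefault(mapping[code], set()).update(studies)
    return themes

themes = build_themes(descriptive_codes, theme_map)
```

The interpretative step, deciding which descriptive codes belong to which analytical theme, is of course the reviewers' judgement; the data structure simply keeps the conclusions-to-studies link transparent.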
Meta-ethnography
This method was developed by Noblit and Hare and originally applied in the field of education. It was further developed in the early 2000s by Britten and colleagues, who applied it to health services research, and has since been used for a growing number of evidence syntheses, particularly in health research but also in other topic areas.
Meta-ethnography is an explicitly interpretative approach to synthesis and aims to create new understandings and theories from a body of work. It uses authors’ interpretations (sometimes called second-order constructs, where the quotes from study participants are first-order constructs) and looks for similarities and differences at this conceptual level. It uses the idea of “translation” between constructs in the included studies. This involves juxtaposing ideas from studies and examining them in relation to each other, in order to identify where they are describing similar or different ideas.
This method includes seven stages: (1) identification of the intellectual interest that the review might inform; (2) deciding what is relevant to the initial interest; (3) reading the studies and noting the concepts and themes; (4) determining how the studies are related; (5) translating studies into one another; (6) synthesising translations; and (7) communicating review findings. There are three main types of synthesis (stages 5 and 6): reciprocal translation, refutational translation, and line of argument. The findings within a single meta-ethnography may contain examples of one or all of these approaches, depending on the nature of the findings within the included studies. Reciprocal translation is used where concepts from different studies are judged to be about similar ideas, and so can be "translated into each other". Refutational translation refers to discordant findings, where differences cannot be explained by differences in participants or within a theoretical construct. A line of argument can be constructed to identify how translated concepts are related to each other and can be joined together to create an understanding of the findings as a whole. The method is therefore very well suited to producing new interpretations, theories or conceptual models [61, 62]. In conservation, it could be used to understand, for example, how local communities experience conservation interventions and how this influences their acceptance of those interventions. Head and colleagues used meta-ethnography to understand dimensions of household-level everyday life that have implications for climate change mitigation and adaptation strategies.
Critical interpretive synthesis
Critical interpretive synthesis was originally developed by Dixon-Woods and colleagues. Review authors using this approach are interested in theory generation while being able to integrate findings from a range of study types, including both empirical and theoretical papers. The method can integrate a variety of different types of evidence from quantitative, qualitative and mixed methods studies. We include critical interpretive synthesis in this paper because the method is often used for the synthesis of qualitative evidence.
In the overall synthesis, a coherent framework is usually presented, showing a network of interrelated theoretical constructs and the relationships between them. The framework partly builds on existing constructs as reported in the different studies and introduces newly derived, synthetic constructs generated through the synthesis procedure itself. Reported themes are gradually mapped against each other to create an overall understanding of the phenomenon of interest. This is similar to developing a line of argument in a meta-ethnography (see above). Critical interpretive synthesis distinguishes itself from other approaches, such as formal grounded theory [65, 66] and meta-ethnography, by adopting a critical stance towards the findings reported in the primary studies, the assumptions involved, and the recommendations proposed. Rather than taking the findings for granted, review authors involved in critical interpretive synthesis "critically question the entire construction of the story the primary-level authors told in their research reports". They might, for example, critique recommendations on ethical or moral grounds, such as the desirability of a particular rollout of an intervention. The method is therefore very well suited to understanding what may have influenced proposed solutions to a problem and to examining the constructions of concepts. In the environmental field, it could, for example, be applied to understand how different narratives influence environmental practice and policy, or to critically assess new forms of conservation governance and management. Explicit examples of critical interpretive synthesis applied to the broad area of environmental sciences are, to our knowledge, currently non-existent. However, there are related examples from health studies, such as a review on environmentally responsible nursing, in which the authors justify the use of critical interpretive synthesis mainly by its ability to synthesise primary studies that are diverse in both topic and methodology.
Realist synthesis
Realist synthesis is a theory-driven approach to combining evidence from various study types. Originally developed in 2005 by Ray Pawson and colleagues, it aims to unpack the mechanisms by which particular interventions work, for whom, and in which particular contexts and settings. It is included here because it is increasingly used to synthesise qualitative data, although the data can be both qualitative and quantitative.
Realist synthesis has been developed to evaluate the integrity of theories (does a program work as predicted?) and for theory adjudication (which intervention fits best?). In addition, it allows for a comparison of interventions across settings or target groups, and can explain how the policy intent of a particular intervention translates into practice.
The realist synthesis approach is highly iterative, so it is difficult to identify a distinct synthesis stage as such. The synthesis process usually starts by identifying the theories that underpin the specific interventions of interest. The theoretical assumptions about how an intervention is supposed to work and what impact it is supposed to generate are made explicit from the start. Depending on the exact purpose of the review, various types of evidence related to the interventions under evaluation (potentially both quantitative and qualitative) are then consulted and appraised for quality. In evaluating what works for whom in which circumstances, contradictory evidence is used to generate insights about the influence of context, and so to link various configurations of context, mechanism and outcome. Conclusions are usually presented as a series of contextualised decision points. An example of a realist synthesis in the environmental context is that of McLain, Lawry and Ojanen, which synthesises evidence from 31 articles examining the environmental outcomes of marine protected areas governed under different types of property regimes. The realist synthesis approach allowed the review authors to gain a deeper understanding of the ways in which mechanisms such as perceptions of legitimacy, perceptions of the likelihood of benefits, and perceptions of enforcement capacity interact under different socio-ecological contexts to trigger behavioural changes that affect environmental conditions. Another example from the environmental domain is the review by Nilsson and colleagues, who applied realist synthesis to 17 community-based conservation programs in developing countries that measured behavioural changes linked to conservation outcomes. The RAMESES I project (http://www.ramesesproject.org) offers methodological guidance, publication standards and training resources for realist synthesis.
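Purely as an illustration, the context-mechanism-outcome (CMO) configurations at the heart of realist synthesis can be represented as simple records, which makes it easy to ask "which mechanisms fire in which contexts?". All content below is hypothetical and not taken from the reviews cited above.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CMO:
    """One context-mechanism-outcome configuration (hypothetical content)."""
    context: str
    mechanism: str
    outcome: str

# Hypothetical configurations sketched for a protected-area example.
configurations = [
    CMO("strong local property rights", "perceived legitimacy of rules",
        "compliance improves"),
    CMO("weak enforcement capacity", "low perceived risk of sanctions",
        "illegal extraction persists"),
    CMO("strong local property rights", "expectation of tourism benefits",
        "community patrols emerge"),
]

def mechanisms_in_context(configs, context):
    """Collect the mechanisms recorded as firing in a given context."""
    return [c.mechanism for c in configs if c.context == context]

mechanisms = mechanisms_in_context(configurations, "strong local property rights")
```

In a real realist synthesis the configurations would of course be derived iteratively from the evidence base and refined against programme theory; the record structure simply keeps the context-mechanism-outcome links explicit.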
Choosing the appropriate QES method
Here we explain the criteria for selecting between the different QES methods presented in this paper. There are several aspects to consider when choosing the right evidence synthesis approach [42, 67, 72]. These include the type of review question, epistemology, the purpose of the review, the type of data, and the available expertise, including the background of the research team and resource requirements. Here, we briefly discuss the more pragmatic aspects. For a detailed discussion of other criteria we refer the reader to the work of Hannes and Lockwood, and of Booth and colleagues [42, 72].
Particularities of the evidence
As noted above, environmental problems are complex and involve a high degree of uncertainty. Environmental research is often inter- and transdisciplinary and involves, for example, the use of contested and/or diverse concepts and terms, as well as heterogeneous datasets. It is therefore very important to consider whether a QES method is fit for purpose and whether it will produce the expected and desired synthesis outcomes. More complex and contextual outcomes can be expected from the idealist methods (such as critical interpretive synthesis or meta-ethnography) (Fig. 1 and Table 1), which offer insights to policy or practice only after further interpretation. In contrast, more concrete and definitive outcomes can be expected from the more realist methods (such as thematic synthesis). The type of evidence to be synthesised (e.g., qualitative or mixed; see Table 1) is yet another aspect to consider when choosing a synthesis method.
Background of the researchers and the review team
Researchers should consider their methodological backgrounds and epistemological viewpoints, to make sure the review team has the appropriate expertise and experience for the chosen method. Some more complex methods (such as realist synthesis) may require specific skills (e.g., familiarity with the realist perspective) and larger teams of researchers with different disciplinary backgrounds. Such methods may also require that the researchers are more familiar with the content of the research they review. Other methods (such as thematic or framework synthesis) can be conducted by a smaller team of researchers who do not necessarily have deep subject expertise.
Resource requirements
Requirements for review funding will depend on the resources required, i.e., the number of researchers involved, the time needed to conduct the review, the costs associated with access to specific data analysis or review management software, and access to literature. Some methods are more resource-demanding than others. Multi-component mixed methods reviews, for example, require expertise in both qualitative and quantitative synthesis methods, as well as the allocation of time for producing more than one parallel and/or consecutive synthesis. Other methods, such as framework synthesis, may be less resource-intensive (needing comparatively fewer people over less time), provided that initial frameworks have already been developed and are uncontentious. The time spent on a review also depends on the breadth of the research question and the extent of the literature.
Challenges and points of contestation
Whilst QES can be valuable for environmental practice and policy, readers should be aware of several well-known challenges that may also prove problematic when QES approaches are used for the synthesis of environmental qualitative research. Here we summarise some of the most important ones, including conceptual and methodological heterogeneity in primary research studies, issues with quality appraisal, and transparency in reporting.
Qualitative evidence is likely to be situated in different disciplines, theoretical assumptions and general philosophical orientations. For aggregative, less interpretative methods (such as framework synthesis), this poses a challenge in terms of comparability during the synthesis stage of the review process. In the case of more interpretive approaches (e.g., meta-ethnography), such diversity is often seen as an asset rather than a problem, as the translation of one study into another allows for the comparison of studies with different theoretical backgrounds.
As with systematic reviews of quantitative evidence, critical appraisal of study validity is perhaps one of the most contested stages of the QES review process. Quality appraisal (and the extent to which it matters) likely depends on the methodological approach. For example, framework and thematic syntheses assess the reliability and methodological rigour of individual study findings and may exclude methodologically flawed studies from the synthesis. Meta-ethnography and critical interpretive synthesis assess included studies in terms of the content and utility of their findings and the extent to which they inform theory, and include all studies in the synthesis.
Finally, reviews can often be criticised for a lack of transparency and unclear or incomplete reporting. To ensure that all important decisions related to the conduct of a review are reported at a sufficient level of detail, reporting standards applicable to QES are available, such as ENTREQ and ROSES. Additionally, RAMESES provides reporting standards developed specifically for realist syntheses, and the eMERGe project has developed reporting standards for meta-ethnographies (http://emergeproject.org). These standards aim to increase transparency and, hopefully, to drive up the quality of review conduct.
Additional methodological options: Linking quantitative and qualitative evidence together
In the following paragraphs, we briefly present an additional methodological option that could be useful, for example, for the synthesis of complex conservation interventions, and that is suited to addressing some of the above challenges (such as methodological heterogeneity).
In some cases, synthesis of only one type of study finding (either qualitative or quantitative) might not be sufficient to understand the multi-layered or complex interventions and programs typical of the environmental sector. The mixed methods review approach has been developed to link qualitative, quantitative and mixed methods study findings in a way that enhances the breadth and depth of understanding of phenomena, problems and/or study topics [81, 82]. A mixed methods review is a systematic review in which quantitative, qualitative and mixed methods primary studies are synthesised using both quantitative and qualitative methods. The data included in such a review are the findings or results extracted from quantitative, qualitative or mixed methods primary studies. These findings are then integrated using a mixed methods analytical approach.
This approach allows us to study how different (intervention) components are related and how they interact with each other. Apart from studying the effectiveness of interventions, these reviews include qualitative evidence on contextual influences, applicability and barriers to implementation. Topics covered by reviews that link qualitative and quantitative data include, for example, the impact of urban design, land use and transport policies and practices on physical activity; the socio-economic effects of agricultural certification schemes; and the impact of outdoor spaces on wellbeing for people with dementia. Qualitative and quantitative bodies of evidence can point to different facets of the same phenomenon and enrich understanding of it. A review of protected area impacts on human wellbeing revealed that the qualitative findings had not been studied quantitatively, and only once combined in a synthesis could these two evidence bases provide a complete picture of protected area impacts.
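To make the integration step concrete, the juxtaposition of the two evidence bases is sometimes tabulated as a "convergence matrix" that places quantitative effect estimates alongside qualitative themes for the same interventions. The sketch below is purely illustrative and not from the paper: the interventions, effect values and themes are invented placeholders, and real mixed methods reviews use far richer coding than this.

```python
# Illustrative sketch only: a minimal "convergence matrix" juxtaposing
# quantitative effect directions with qualitative implementation themes.
# All interventions, numbers and themes below are invented placeholders.

quantitative = {  # intervention -> quantitative synthesis result
    "community forestry": {"effect": 0.42, "direction": "positive"},
    "payment for ecosystem services": {"effect": -0.05, "direction": "null"},
}

qualitative = {  # intervention -> themes on context and implementation
    "community forestry": ["local ownership", "clear tenure rights"],
    "payment for ecosystem services": ["distrust of payments", "elite capture"],
}

def convergence_matrix(quant, qual):
    """Join the two evidence bases by intervention, so that qualitative
    themes can be read against the quantitative result they may explain."""
    rows = []
    for intervention in sorted(set(quant) | set(qual)):
        rows.append({
            "intervention": intervention,
            "effect_direction": quant.get(intervention, {}).get("direction", "no data"),
            "contextual_themes": qual.get(intervention, []),
        })
    return rows

for row in convergence_matrix(quantitative, qualitative):
    print(row["intervention"], "->", row["effect_direction"],
          "| themes:", ", ".join(row["contextual_themes"]) or "none")
```

The point of such a table is not computation but juxtaposition: rows where a null or negative effect sits next to themes like "distrust of payments" suggest where qualitative evidence may explain quantitative heterogeneity.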
Synthesis of qualitative research is crucial for addressing wicked environmental problems and for producing reliable support for decisions in both policy and practice. We have provided an overview of methodological approaches to the synthesis of qualitative research, each characterised by a different way of problematising the literature and a different level of interpretation. We have also explained what needs to be considered when choosing among these methods.
Environmental and conservation social science has witnessed an accumulation of primary research over the past decades. However, social scientists argue that there is little integration of qualitative evidence into conservation policy and practice, which suggests a ‘synthesis gap’ (sensu ). This paper, with its overview of different methodological tools, provides the first guidance for environmental researchers on conducting syntheses of qualitative evidence, so that they can begin to bridge the synthesis gap between environmental social science, policy and practice. Furthermore, the examples introduced here may inspire reviewers to adapt existing methods to their specific subjects and, where necessary, help develop new methods that better fit the field of environmental evidence. This is especially important because the methods currently used in the synthesis of environmental evidence fall short of realising the potential of qualitative research, which translates into a lack of deeper contextual understanding around the implementation and effectiveness of environmental management interventions, and a disregard for the diversity of perspectives and voices (e.g., indigenous peoples, farmers, park managers) fundamental to tackling wicked environmental issues.
We thank the BONUS Secretariat for covering article processing fees. BM thanks the Mistra Council for Evidence-based Environmental Management (EviEM) and BONUS RETURN for the time allocated to draft this manuscript. RG is partially supported by the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care South West Peninsula.
BM and MS developed the framework for this paper and edited its final version. All authors (BM, MS, RG, KH, R. Rees and R. Rodela) wrote substantial pieces of the manuscript. RG, KH, R. Rees and R. Rodela commented on previous versions of the manuscript. All authors read and approved the final manuscript.
Article processing fees were covered by BONUS RETURN. BONUS RETURN project is supported by BONUS (Art 185), funded jointly by the EU and Swedish Foundation for Strategic Environmental Research FORMAS, Sweden’s innovation agency VINNOVA, Academy of Finland and National Centre for Research and Development in Poland.
Ethics approval and consent to participate
Consent for publication
The authors declare that they have no competing interests.
- 6. The Steering Group of the Campbell Collaboration: Campbell collaboration systematic reviews: policies and guidelines. Campbell systematic reviews (supplement 1), p. 46; 2015.
- 12. Collaboration for Environmental Evidence. Guidelines and standards for evidence synthesis in environmental management. Version 5.0; Eds. Pullin AS, Frampton GK, Livoreil B, Petrokofsky G. 2018. Available from: http://www.environmentalevidence.org/information-for-authors. Accessed 1 Oct 2018.
- 13. Gough D, Oliver S, Thomas J. An introduction to systematic reviews. London: SAGE Publications Ltd; 2012.
- 28. Steffen W, Grinevald J, Crutzen P, McNeill J. The anthropocene: conceptual and historical perspectives. Philos Trans R Soc A Math Phys Eng Sci. 2011;369:842–67.
- 42. Booth A, Noyes J, Flemming K, Gerhardus A, Wahlster P, Van Der Wilt GJ, Mozygemba K, Refolo P, Sacchini D, Tummers M, Rehfuess E. Guidance on choosing qualitative evidence synthesis methods for use in health technology assessments of complex interventions [Online]. 2016. Available from: http://www.integrate-hta.eu/downloads/. Accessed 1 Oct 2018.
- 44. Thomas J, O’Mara-Eves A, Harden A, Newman M. Chapter 8: Synthesis methods for combining and configuring textual or mixed methods data. In: Gough D, Oliver S, Thomas J, editors. An introduction to systematic reviews. 2nd ed. London: Sage; 2017.
- 45. Andrews T. What is social constructionism? Grounded Theory Rev. 2012;11:39–46.
- 53. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Huberman AM, Miles MB, editors. The qualitative researcher’s companion. Thousand Oaks: SAGE Publications, Inc.; 2002.
- 55. Belluco S, Gallocchio F, Losasso C, Ricci A. State of art of nanotechnology applications in the meat chain: a qualitative synthesis. Crit Rev Food Sci Nutr. 2017;3:1084–96.
- 67. Hannes K, Lockwood M, editors. Synthesizing qualitative research: choosing the right approach. Oxford: Wiley-Blackwell; 2012.
- 83. Sandelowski M, Voils CI, Barroso J. Defining and designing mixed research synthesis studies. Res Sch. 2006;13:29.
- 84. Heath G, Brownson R, Kruger J, Miles R, Powell K, Ramsey L. Task Force on Community Preventive Services: the effectiveness of urban design and land use and transport policies and practices to increase physical activity: a systematic review. J Phys Activity Health. 2006;3:S55–76.
- 86. Whear R, Thompson Coon J, Bethel A, Abbott R, Stein K, Garside R. What is the impact of using outdoor spaces such as gardens on the physical and mental well-being of those with dementia? A systematic review of quantitative and qualitative evidence. J Post-Acute Long-Term Care Med. 2014;15:697–705.
- 89. Garside R. A comparison of methods for the systematic review of qualitative research: two examples using meta-ethnography and meta-study. Doctoral dissertation. Exeter: Universities of Exeter and Plymouth; 2008.
- 90. Brunton G, Oliver S, Oliver K, Lorenc T. A synthesis of research addressing children’s, young people’s and parents’ views of walking and cycling for transport. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London; 2006.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.