Ambio. 2020 Sep 3;50(2):393–399. doi: 10.1007/s13280-020-01385-x

On the uncertainty and confidence in decision support tools (DSTs) with insights from the Baltic Sea ecosystem

Floris M van Beest 1, Henrik Nygård 2, Vivi Fleming 2, Jacob Carstensen 1
PMCID: PMC7782639  PMID: 32885402

Abstract

Ecosystems around the world are increasingly exposed to multiple, often interacting human activities, leading to pressures and possibly environmental state changes. Decision support tools (DSTs) can assist environmental managers and policy makers to evaluate the current status of ecosystems (i.e. assessment tools) and the consequences of alternative policies or management scenarios (i.e. planning tools) to make the best possible decision based on prevailing knowledge and uncertainties. However, to be confident in DST outcomes it is imperative that known sources of uncertainty such as sampling and measurement error, model structure, and parameter use are quantified, documented, and addressed throughout the DST set-up, calibration, and validation processes. Here we provide a brief overview of the main sources of uncertainty and methods currently available to quantify uncertainty in DST input and output. We then review 42 existing DSTs that were designed to manage anthropogenic pressures in the Baltic Sea to summarise how and which sources of uncertainty were addressed within planning and assessment tools. Based on our findings, we recommend that future DST development adhere to good modelling practice principles, and that uncertainty be better documented and communicated among stakeholders.

Electronic supplementary material

The online version of this article (doi:10.1007/s13280-020-01385-x) contains supplementary material, which is available to authorized users.

Keywords: Baltic Sea, Confidence, Decision support tools, Uncertainty

Introduction

Most ecosystems on Earth are affected by multiple pressures originating from anthropogenic activities, often with negative environmental and socio-economic consequences (Halpern et al. 2015; Steffen et al. 2015). To effectively manage human activities affecting environmental systems, policy makers frequently rely on decision support tools (DSTs) to guide the decision-making process (Milner-Gulland and Shea 2017). In the broadest sense, a DST can be defined as any guidance, procedure, or analysis tool that can be used to help support a decision (Sullivan 2002). The optimal DST should provide a structured process in which all assumptions, model parameters, and predicted outcomes are tested, reviewed, and documented. This allows environmental managers to make the best possible decision based on prevailing knowledge and uncertainties. Uncertainty is unavoidable and all scientists and policy-makers are familiar with it. Nonetheless, many people working with DSTs shy away from quantifying, documenting, and communicating the uncertainty within the tool, the output, and the decision-making process.

Our main aim was to evaluate how uncertainty has been handled in DSTs that have been developed specifically to address pressures in the Baltic Sea and its catchment area. The ecosystem services that the Baltic Sea provides have been severely compromised over recent decades by deteriorated water quality, primarily due to eutrophication, exploitation of coastal areas, overfishing, and contamination by toxic pollutants (Bonsdorff et al. 2015; Reusch et al. 2018). Together these pressures have resulted in what is called the largest human-induced dead zone in the world (Conley 2012), leading to the recent collapse of cod fisheries in the Baltic Sea (Eero et al. 2015; ICES 2019). Fortunately, some negative trends in ecosystem function have levelled off or have even partly reversed, owing to increased scientific understanding of the ecological processes within the Baltic area combined with changed management strategies and policy interventions (Reusch et al. 2018). The development and application of DSTs to mitigate a variety of pressures have played a key role in this initial success, and knowledge generated from different DSTs has been used for managing pressures (e.g. nutrient reductions in the HELCOM Baltic Sea Action Plan (HELCOM 2007)). Combined, this makes the Baltic Sea an ideal system to assess how uncertainty has been dealt with in various DSTs. Based on our review of 42 DSTs, including planning and assessment tools, we highlight challenges but also opportunities to better incorporate and communicate uncertainty and confidence assessments in DSTs and environmental decision-making.

A synopsis of uncertainty, its sources, and how to quantify it in DSTs

Uncertainty is typically divided into three categories: (i) aleatory uncertainty, which includes inherent randomness and natural variability, (ii) epistemic uncertainty, which results from imperfect knowledge, and (iii) linguistic uncertainty, which arises from language issues (Regan et al. 2002; Walker et al. 2003; Refsgaard et al. 2007). Epistemic uncertainty, in particular, is an integral part of every stage of the scientific process: from the assumptions made, to the observations, to extrapolations over time and space, and to the generalizations drawn. Aleatory and epistemic uncertainty can originate from numerous sources, which most commonly include:

  • Inherent randomness and natural variation are types of uncertainty that are omnipresent in nature. It means that even though the processes and the initial conditions that make up a natural system are well known, we cannot be completely certain of what the outcome will be following some disturbance or management intervention (Regan et al. 2002; Uusitalo et al. 2015).

  • Measurement error causes uncertainty about the value of the measured quantity for the variable of interest. When several samples are taken, the measurement error can be quantified (Regan et al. 2002; Refsgaard et al. 2007).

  • Systematic error refers to a bias in the sampling over time and space. If systematic error goes unnoticed, it may have cumulative effects in the end-product that is generated by the data (Regan et al. 2002; Uusitalo et al. 2015).

  • Subjective judgement-based uncertainty occurs due to interpretation of data, especially when data are scarce or error prone (Chatfield 1995).

  • Model structure and parameter uncertainty arises due to simplifications of the natural system being modelled as not all variables and interactions influencing the system are known (Chatfield 1995; Regan et al. 2002).

A wide variety of methodologies exist to assess uncertainty in DST input data and its effect on the confidence in DST output. Below, we highlight the more common methods used to address uncertainty and confidence in DSTs but refer to other relevant literature for more exhaustive and detailed reviews (e.g. Refsgaard et al. 2007; Bennett et al. 2013; Uusitalo et al. 2015).

  • Expert elicitation is a structured process intended to extract judgements or scores from experts and is often applied in situations where empirical data are lacking or insufficient for a direct quantification of uncertainty but there is an urgent need for management decisions (Morgan 2014). Limitations of this technique are linked to the subjectivity of the results, which are sensitive to the selection of experts, and, when experts disagree, it may be difficult to quantify the uncertainties reliably (Refsgaard et al. 2007; Morgan 2014).

  • Sensitivity analysis aims to assess how model outputs respond to changes in parameter values, with an emphasis on identifying the input parameters to which outputs are most sensitive (Borgonovo 2013). Sensitivity analyses can vary greatly in their level of complexity, ranging from simple change-one-parameter-at-a-time methods to more comprehensive approaches that take error explicitly into account, such as Approximate Bayesian Computation (van der Vaart et al. 2018). A minimal illustrative sketch of a simple sensitivity analysis, combined with probabilistic error propagation, is given after this list.

  • Scenario analysis is used to assess possible state changes of a system by quantifying the impacts of alternative mitigation measures under different assumptions (Tapinos 2012; Khosravi and Jha-Thakur 2019). Because the future is inherently uncertain, all predictions into the future need to cope with uncertainties. Scenario analysis is therefore ideally suited for planning tools, which are a group of DSTs designed specifically to forecast state changes of an environmental system.

  • Multi-model analysis, also termed ensemble modelling, is a strategy that can estimate model structure and parameter uncertainty by combining multiple plausible and independent models developed to describe the same domain. If the output of different models or DSTs produces similar estimates for the same assessment, it is often assumed that the structural and parameter uncertainty is low, though caution is advised in making this conclusion (Uusitalo et al. 2015).

  • Probabilistic modelling is a quantitative approach to assess uncertainty that is based on probability distribution functions. A suite of statistical methods now exists, including both classical frequentist (Carstensen and Lindegarth 2016) and Bayesian (Davies and Hope 2015; Laurila-Pant et al. 2019) approaches. The latter also work for small sample sizes, as Bayesian statistics provide a probabilistic expression of uncertainty about the parameters of interest for the given sample size (McNeish 2016). When probability distribution functions can be estimated reliably, they allow for describing inter-dependence between different sources of uncertainty and for tracing the sources of uncertainty in the data and their impact on the output throughout the DST development phase (Heuvelink et al. 2007).
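To make the sensitivity-analysis and probabilistic-modelling approaches concrete, the following is a minimal, hypothetical sketch in Python of a change-one-parameter-at-a-time sensitivity analysis combined with Monte Carlo propagation of measurement error for a toy eutrophication indicator. The indicator formula, reference conditions, error magnitudes, and status threshold are illustrative assumptions only and are not taken from any of the reviewed DSTs.

```python
import numpy as np

rng = np.random.default_rng(42)

def eutrophication_index(chl, secchi, w_chl=0.6, w_secchi=0.4):
    """Toy indicator: weighted deviation of chlorophyll-a and Secchi depth
    from assumed reference conditions (higher values = worse status)."""
    chl_ref, secchi_ref = 2.0, 8.0  # assumed reference conditions
    return w_chl * (chl / chl_ref) + w_secchi * (secchi_ref / secchi)

# Nominal inputs (assumed summer means for one assessment unit)
nominal = {"chl": 3.5, "secchi": 5.0, "w_chl": 0.6, "w_secchi": 0.4}
base = eutrophication_index(**nominal)

# One-at-a-time sensitivity: perturb each input by +/-10 %
for name, value in nominal.items():
    low = eutrophication_index(**{**nominal, name: value * 0.9})
    high = eutrophication_index(**{**nominal, name: value * 1.1})
    print(f"{name:9s} +/-10% -> index in "
          f"[{min(low, high):.3f}, {max(low, high):.3f}] (base {base:.3f})")

# Monte Carlo propagation of assumed measurement/sampling errors
chl_obs = rng.normal(nominal["chl"], 0.4, size=10_000)       # assumed SE of 0.4
secchi_obs = rng.normal(nominal["secchi"], 0.5, size=10_000)  # assumed SE of 0.5
index = eutrophication_index(chl_obs, secchi_obs)

threshold = 1.3  # assumed good/moderate status boundary
print(f"Mean index {index.mean():.3f}, 95% interval "
      f"[{np.percentile(index, 2.5):.3f}, {np.percentile(index, 97.5):.3f}]")
print(f"Probability of failing 'good' status: {np.mean(index > threshold):.2f}")
```

In this sketch the one-at-a-time loop identifies which inputs the indicator reacts to most strongly, while the Monte Carlo step turns assumed measurement errors into a probabilistic confidence statement about the status classification rather than a single point value.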

Methods

We define a DST as: “an interactive virtual tool developed with the purpose of supporting decision making in response to degradation of the Baltic Sea and its catchment area. The tool may assist in distinguishing changes in the state, links to pressures caused by human activities and drivers behind them, or showing ecological impacts and effects on social welfare, thus facilitating information exchange between decision-makers and scientific experts” (see also Nygård et al. 2020).

Following this definition, we only considered tools that were developed for application in the Baltic Sea and its drainage basin. DSTs did not have to be used in the whole Baltic Sea region as national or regional tools were also considered. To identify DSTs suited for review, we first defined relevant management problems (e.g. eutrophication, biodiversity loss, contaminants). These problem areas were then screened for DSTs using personal knowledge, literature and web searches as well as interviews with stakeholders likely to be using or developing DSTs. When a DST fulfilled our definition, any documentation, including reports, manuals and scientific literature, that referenced the tool was studied to assess if and how uncertainty was dealt with, and to what extent end-users (i.e. policy-makers, environmental managers) could do so without any further adjustments of the DST. However, DSTs were only considered if documentation was available in English, German, Swedish, Danish or Finnish, which was necessary for the evaluators of the tools to assess if and how uncertainty was incorporated into the DST. Although any source of uncertainty described in the DST documentation was considered for our review, we focused specifically on how uncertainty in the measurement, sampling and model output was estimated and whether the DST accounted for all or a subset of potential sources of uncertainty.

Relevant DSTs were categorized into assessment tools or planning tools. Assessment tools are primarily developed to assess the current state of the system related to one or more pressures (e.g. eutrophication or contamination by pollutants). Planning tools on the other hand are designed specifically to forecast possible future states of the system by quantifying the impacts of possible mitigation measures related to a specific pressure.

To evaluate what sources of uncertainty were dealt with (or not) in the DSTs and, moreover, whether uncertainty was dealt with more in planning or assessment tools, we report qualitative summaries in percentages.

Results

How is uncertainty dealt with in DSTs developed for the Baltic Sea and catchment area?

The most common environmental issues addressed in the 42 Baltic-based DSTs included: eutrophication (26%), impact evaluation of coastal areas (16%), biodiversity and conservation (14%), and contamination by toxic pollutants (12%), which align with the most important pressures in the region (Reusch et al. 2018; Nygård et al. 2020). A complete list of all DSTs considered in our review is provided in Table S1 in the Supplementary Information.

We found that 48% (n = 20/42) of the DSTs do not allow the user to incorporate or quantify any of the known sources of uncertainty (i.e. measurement, sampling, and model/parameter structure) directly in the modelling or calculation process and, as such, no confidence in the output is expressed. In fact, in 12 out of these 20 DSTs (60%) uncertainty or confidence was not even mentioned in the documentation, while for the remaining 8 DSTs (40%) the need to quantify uncertainty associated with the output was highlighted, though this was not possible as part of the DST version reviewed.

DSTs that allowed for uncertainty to be quantified within the modelling process (n = 22/42) employed a variety of approaches, ranging from purely qualitative heuristic scoring of the input or output data to completely data-driven, multifaceted confidence assessments within a statistical probability framework. Most often, assessment of uncertainty in measurement and sampling error (i.e. input data) was possible through simple sensitivity analyses (27%: n = 6/22) or through qualitative scoring based on expert judgement (9%: n = 2/22). Yet, in these cases any uncertainty in the input data did not directly influence or express confidence in the DST output. We also found the reverse, where confidence in the output data, which is influenced by sources of model structure and parameter uncertainty, could be estimated using alternative scenario modelling (5%: n = 1/22) or through qualitative scoring based on expert judgement (9%: n = 2/22). However, in these cases the confidence assessment was independent of any uncertainty in the input data. A more comprehensive confidence assessment of the output was possible in 27% of the DSTs (n = 6/22), where either spatial, temporal or methodological sources of uncertainty could be estimated within the DST.

Only in a few cases (23%: n = 5/22) did we find that input uncertainty or confidence in the DST output could be assessed through a multifaceted approach where spatial, temporal and methodological sources of uncertainty are considered concomitantly. This was made possible by calculating confidence in the end-result using a well-described statistical theory (probability framework) within either a completely data-driven approach (n = 2/22) or by supplementing data gaps with expert judgement (n = 3/22).

Is uncertainty dealt with in some DST-types more than in others?

The majority of the Baltic-focussed DSTs reviewed were assessment tools (64%: n = 27/42), while 36% (n = 15/42) were planning tools. Within the group of assessment tools, 44% did not allow for uncertainty to be quantified or incorporated within the modelling process (n = 12/27). The remaining assessment tools employed either simple sensitivity analyses (27%, n = 4/15), purely qualitative heuristic scoring/expressions of the input or output (13%, n = 2/15), or uncertainty and confidence assessments based on either spatial, temporal or methodological sources of uncertainty (47%, n = 7/15). Completely data-driven, multifaceted confidence assessments within a statistical probability framework were only used in 13% of the assessment DSTs (n = 2/15). Uncertainty was generally considered less in planning tools than in assessment tools, as 53% (n = 8/15) of the planning DSTs did not provide the user with an option to directly quantify uncertainty in the input/output. For the remaining planning tools (n = 7), uncertainty could be quantified only using sensitivity analyses of model parameters (20%, n = 3), purely qualitative heuristic scoring/expressions of the input/output (13%, n = 2) or through a non-comprehensive confidence assessment where either spatial, temporal or methodological sources of uncertainty could be considered independently (13%, n = 2).

Discussion and Recommendations

Based on our review, we conclude that uncertainty and confidence assessments are rarely integrated as an automated procedure in tool development and are thus not systematically available to end-users of Baltic-based DSTs. Indeed, in approximately half of the DSTs, the main sources of uncertainty (i.e. measurement, sampling, and model or parameter structure) were not incorporated in the modelling or calculation process, and thus no confidence in the output could be expressed. Only rarely did we find that input uncertainty or confidence in the DST output could be assessed through a multifaceted approach where spatial, temporal and methodological sources of uncertainty are considered concomitantly. Thus, uncertainty in DST input and output was typically quantified separately and independently through an ‘end of pipe’ analysis once model set-up, calibration, and validation had been completed. Such an approach ignores much of the uncertainty associated with model structure and may overestimate confidence in DST outcomes (Refsgaard et al. 2007). Indeed, confidence assessments of predictive output, such as through scenario analyses, were scarce in the reviewed planning tools, which is worrying as these tools are often used to assist environmental decision making (Milner-Gulland and Shea 2017). Management of the Baltic Sea is extremely complex due to the multiple processes and pressures acting on the system and the competing interests and involvement of numerous stakeholders. Rapidly progressing global pressures, particularly warming of Baltic waters and the surrounding drainage area, are now jeopardizing initial management achievements. As such, for current and future Baltic-focused DSTs to be of greatest value in designing and identifying the most effective management strategies, it is imperative that uncertainty is adequately quantified, documented and communicated in a transparent manner. Here we provide some recommendations that follow from our results and experiences gained while reviewing the DSTs and discuss how they can facilitate future tool development:

  • Uphold good modelling practice It is imperative that any modelling exercise or development is accompanied by transparent and comprehensive tests and analyses of input and output data to ensure the highest possible utility and value of the tool (Harwood and Stokes 2003; Schmolke et al. 2010). While we acknowledge that DSTs vary in complexity and that quantitative analyses of uncertainty and confidence throughout the entire tool development process are not always straightforward, a basic assessment of uncertainty should always be performed, irrespective of data availability or model complexity. In the reviewed DST documentation, we often found that computational capacity was considered a limiting factor to quantify uncertainty or confidence in the model output. While this may be complicated for some DSTs that employ data-heavy simulation models or tools that consider multiple pressures operating across large spatiotemporal scales, most contemporary desktop computers contain sufficient memory to perform simple sensitivity analyses. Low sample size of empirical data was also frequently raised as a constraining factor for thorough uncertainty analyses in the reviewed DSTs. Yet statistical methods to tackle this problem are readily available (McNeish 2016), and uncertainty assessment through qualitative expert judgement provides an independent methodology to assess confidence in scarce empirical data as well as DST output. Ultimately, the choice of method to assess the different sources of uncertainty in the DST development process varies on a case-by-case basis and depends largely on the available data and the decision problem at hand. We advocate employing a combination of quantitative and qualitative approaches for a thorough and robust uncertainty assessment.

  • Communicate uncertainty transparently A lack of openly communicated and clearly documented uncertainty about data and knowledge is a well-known and general issue in environmental science that may be driven by fear of the audience’s response (van der Bles et al. 2019). Consequently, it is possible that sources of uncertainty in DST input and output are quantified and incorporated more routinely during the DST development phase than our review suggests, but that documentation was not easily retrieved. To overcome this challenge, we highlight a need for adopting a generic and standardized protocol for quantifying and documenting uncertainty in environmental DSTs, which can also bridge the difficulty of communicating uncertainty among stakeholders. Such standardized documentation formats already exist for other types of environmental models (Grimm et al. 2010; Schmolke et al. 2010; Zurell et al. 2020). For example, the TRACE documentation structure proposed by Schmolke et al. (2010) provides a clear overview of all essential phases in the model development process, including problem formulation, model testing and analyses, and model application, the latter of which includes details about uncertainty analysis. Adopting a similar documentation format for DSTs will benefit overall transparency, facilitate communication of uncertainty, and increase the utility of DSTs in environmental management. Doing so will also improve the feasibility of multi-DST approaches, which are progressively being used to manage multi-pressure systems, as linking several DSTs into one framework will be a more transparent process if the uncertainty assessment of each individual DST is documented and communicated in a standardized manner.

  • Adopt a bottom-up approach to determine the desired level of confidence Deciding a priori on a confidence threshold to be attained (e.g. the probability of X happening with approach Y should be > 0.9) before DST output can be considered in the decision-making process is, in theory, a good idea as it forces confidence to be quantified and reported (a minimal sketch of such a threshold check is given after this list). Because confidence thresholds should be based on prevailing knowledge and evidence as well as consensus in the field, a possible approach would be to employ a bottom-up process that allows decision makers, scientists, managers, and other stakeholders involved to jointly determine the level of confidence required through a self-organizing process on a case-by-case basis. Although such a bottom-up process is likely to be time demanding and iterative, especially if multiple stakeholders with divergent interests and expertise are involved, a case-by-case approach would be beneficial as many environmental pressures lead to site-specific problems. For example, for assessment tools, different levels of data resolution might be required to achieve a sufficient level of confidence in state classification, depending on the type of environment and pressure addressed. Importantly, DSTs often turn out to be incomplete or to lack sufficient confidence, and this requires revising the formulation of the DST, the underlying conceptual model, or even the original problem formulation (Jakeman et al. 2006). During this process, all stakeholders involved should get the chance to discuss and review DST objectives, outputs and acceptance criteria repeatedly (Harwood and Stokes 2003).
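As a minimal, hypothetical illustration of such an a priori threshold check, the sketch below assumes a planning DST that returns an ensemble of simulated outcomes per management scenario; the scenario names, outcome distributions, target value, and the 0.9 confidence level are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

TARGET = 7.0               # assumed target, e.g. summer Secchi depth in metres
CONFIDENCE_REQUIRED = 0.9  # confidence level agreed by stakeholders beforehand

# Hypothetical DST output: ensembles of predicted Secchi depth per scenario
scenarios = {
    "business as usual": rng.normal(5.8, 0.6, size=5_000),
    "moderate nutrient reduction": rng.normal(6.9, 0.5, size=5_000),
    "strong nutrient reduction": rng.normal(7.6, 0.5, size=5_000),
}

for name, outcomes in scenarios.items():
    p_target = np.mean(outcomes >= TARGET)  # probability that the target is met
    verdict = ("meets required confidence" if p_target >= CONFIDENCE_REQUIRED
               else "insufficient confidence")
    print(f"{name:28s} P(target met) = {p_target:.2f} -> {verdict}")
```

The point of the sketch is not the particular numbers but the workflow: the confidence requirement is fixed before the scenarios are compared, so only scenarios whose probability of meeting the target exceeds the agreed threshold are carried forward into the decision-making process.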

In summary, we advocate that quantifying, documenting, and communicating known sources of uncertainty in DSTs is crucial for successful environmental management. We urge environmental managers and decision makers to take uncertainty into account when interpreting DST estimates, and to be alert when information on uncertainty is missing. DST development, testing, and validation are by nature an iterative process, and embracing the existence of uncertainty, instead of shying away from it, can only improve the utility and value of DSTs, as it can highlight knowledge gaps and therefore inform further research. Only by continuously aiming to close such knowledge gaps will DSTs confidently assist scientists, environmental managers, and policy makers to evaluate the current and future status of ecosystems.

Electronic supplementary material

Below is the link to the electronic supplementary material.

13280_2020_1385_MOESM1_ESM.pdf (563.1KB, pdf)

Electronic supplementary material 1 (PDF 564 kb)

Acknowledgements

This review is a contribution from the BONUS DESTONY project. BONUS DESTONY has received funding from BONUS (Art. 185), funded jointly by the EU and the Swedish Research Council FORMAS. We wish to thank all participants of DESTONY, all DST developers and end users for all the hard work and knowledge produced when compiling and reviewing the DST list. We also wish to thank three anonymous reviewers for their constructive feedback on a previous manuscript draft.

Biographies

Floris M. van Beest

is a senior researcher at Aarhus University. His research interests include statistical and simulation-based modelling of spatial data and assessments of animal responses to human pressure and environmental change.

Henrik Nygård

is a senior research scientist at the Finnish Environmental Institute. His research interests include monitoring and assessment of the marine environment, with a focus on benthic habitats.

Vivi Fleming

is a researcher at the Finnish Environment Institute. Her research interests include eutrophication, indicator development and long-term changes in the Baltic Sea.

Jacob Carstensen

is a Professor at Aarhus University. His research interests include statistical modelling of monitoring data, indicator development and assessment of ecosystem responses to human pressures.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Floris M. van Beest, Email: flbe@bios.au.dk

Henrik Nygård, Email: henrik.nygard@ymparisto.fi.

Vivi Fleming, Email: Vivi.Fleming-Lehtinen@environment.fi.

Jacob Carstensen, Email: jac@bios.au.dk.

References

  1. Bennett ND, Croke BFW, Guariso G, Guillaume JHA, Hamilton SH, Jakeman AJ, Marsili-Libelli S, Newham LTH, et al. Characterising performance of environmental models. Environmental Modelling & Software. 2013;40:1–20. doi: 10.1016/J.ENVSOFT.2012.09.011. [DOI] [Google Scholar]
  2. Bonsdorff E, Andersson A, Elmgren R. Baltic Sea ecosystem-based management under climate change: Integrating social and ecological perspectives. Ambio. 2015;44:333–334. doi: 10.1007/s13280-015-0669-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Borgonovo E. Sensitivity analysis in decision making. Wiley Encyclopedia of Operations Research and Management Science. 2013 doi: 10.1002/9780470400531.eorms1076. [DOI] [Google Scholar]
  4. Carstensen J, Lindegarth M. Confidence in ecological indicators: A framework for quantifying uncertainty components from monitoring data. Ecological Indicators. 2016;67:306–317. doi: 10.1016/J.ECOLIND.2016.03.002. [DOI] [Google Scholar]
  5. Chatfield C. Model uncertainty, data mining and statistical inference. Journal of the Royal Statistical Society Series A. 1995;158:419. doi: 10.2307/2983440. [DOI] [Google Scholar]
  6. Conley DJ. Save the Baltic Sea. Nature. 2012;486:463–464. doi: 10.1038/486463a. [DOI] [PubMed] [Google Scholar]
  7. Davies AJ, Hope MJ. Bayesian inference-based environmental decision support systems for oil spill response strategy selection. Marine Pollution Bulletin. 2015;96:87–102. doi: 10.1016/j.marpolbul.2015.05.041. [DOI] [PubMed] [Google Scholar]
  8. Eero M, Hjelm J, Behrens J, Buchmann K, Cardinale M, Casini M, Gasyukov P, Holmgren N, et al. Eastern Baltic cod in distress: Biological changes and challenges for stock assessment. ICES Journal of Marine Science. 2015;72:2180–2186. doi: 10.1093/icesjms/fsv109. [DOI] [Google Scholar]
  9. Grimm V, Berger U, DeAngelis DL, Polhill JG, Giske J, Railsback SF. The ODD protocol: A review and first update. Ecological Modelling. 2010;221:2760–2768. doi: 10.1016/j.ecolmodel.2010.08.019. [DOI] [Google Scholar]
  10. Halpern BS, Frazier M, Potapenko J, Casey KS, Koenig K, Longo C, Lowndes JS, Rockwood RC, et al. Spatial and temporal changes in cumulative human impacts on the world’s ocean. Nature Communications. 2015;6:7615. doi: 10.1038/ncomms8615. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Harwood J, Stokes K. Coping with uncertainty in ecological advice: Lessons from fisheries. Trends in Ecology & Evolution. 2003;18:617–622. doi: 10.1016/J.TREE.2003.08.001. [DOI] [Google Scholar]
  12. HELCOM. HELCOM Baltic Sea Action Plan. Krakow, Poland; 2007. [Google Scholar]
  13. Heuvelink GBM, Brown JD, van Loon EE. A probabilistic framework for representing and simulating uncertain environmental variables. International Journal of Geographical Information Science. 2007;21:497–513. doi: 10.1080/13658810601063951. [DOI] [Google Scholar]
  14. ICES. Cod (Gadus morhua) in subdivisions 24–32, eastern Baltic stock (eastern Baltic Sea). Report of the ICES Advisory Committee. 2019;27:24–32. doi: 10.17895/ices.advice.4747. [DOI] [Google Scholar]
  15. Jakeman AJ, Letcher RA, Norton JP. Ten iterative steps in development and evaluation of environmental models. Environmental Modelling & Software. 2006;21:602–614. doi: 10.1016/J.ENVSOFT.2006.01.004. [DOI] [Google Scholar]
  16. Khosravi F, Jha-Thakur U. Managing uncertainties through scenario analysis in strategic environmental assessment. Journal of Environmental Planning and Management. 2019;62:979–1000. doi: 10.1080/09640568.2018.1456913. [DOI] [Google Scholar]
  17. Laurila-Pant M, Mäntyniemi S, Venesjärvi R, Lehikoinen A. Incorporating stakeholders’ values into environmental decision support: A Bayesian Belief Network approach. Science of the Total Environment. 2019 doi: 10.1016/j.scitotenv.2019.134026. [DOI] [PubMed] [Google Scholar]
  18. McNeish D. On using Bayesian methods to address small sample problems. Structural Equation Modeling: A Multidisciplinary Journal. 2016;23:750–773. doi: 10.1080/10705511.2016.1186549. [DOI] [Google Scholar]
  19. Milner-Gulland EJ, Shea K. Embracing uncertainty in applied ecology. Edited by Andre Punt. Journal of Applied Ecology. 2017;54:2063–2068. doi: 10.1111/1365-2664.12887. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Morgan MG. Use (and abuse) of expert elicitation in support of decision making for public policy. Proceedings of the National Academy of Sciences of the United States of America. 2014 doi: 10.1073/pnas.1319946111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Nygård H, van Beest FM, Bergqvist L, Carstensen J, Gustafsson BG, Hasler B, Schumacher J, Schernewski G, et al. Decision support tools used in the Baltic Sea area: Performance and end-user preferences. Environmental Management. 2020 doi: 10.1007/s00267-020-01356-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Refsgaard JC, van der Sluijs JP, Højberg AL, Vanrolleghem PA. Uncertainty in the environmental modelling process: A framework and guidance. Environmental Modelling & Software. 2007;22:1543–1556. doi: 10.1016/J.ENVSOFT.2007.02.004. [DOI] [Google Scholar]
  23. Regan HM, Colyvan M, Burgman MA. A taxonomy and treatment of uncertainty for ecology and conservation biology. Ecological Applications. 2002;12:618–628. doi: 10.1890/1051-0761(2002)012[0618:ATATOU]2.0.CO;2.
  24. Reusch TBH, Dierking J, Andersson HC, Bonsdorff E, Carstensen J, Casini M, Czajkowski M, Hasler B, et al. The Baltic Sea as a time machine for the future coastal ocean. Science Advances. 2018;4:8195. doi: 10.1126/sciadv.aar8195. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Schmolke A, Thorbek P, DeAngelis DL, Grimm V. Ecological models supporting environmental decision making: A strategy for the future. Trends in Ecology & Evolution. 2010;25:479–486. doi: 10.1016/j.tree.2010.05.001. [DOI] [PubMed] [Google Scholar]
  26. Steffen W, Richardson K, Rockstrom J, Cornell SE, Fetzer I, Bennett EM, Biggs R, Carpenter SR, et al. Planetary boundaries: Guiding human development on a changing planet. Science. 2015;347:1259855. doi: 10.1126/science.1259855. [DOI] [PubMed] [Google Scholar]
  27. Sullivan T. Evaluating environmental decision support tools. Upton, New York; 2002. [Google Scholar]
  28. Tapinos E. Perceived environmental uncertainty in scenario planning. Futures. 2012;44:338–345. doi: 10.1016/j.futures.2011.11.002. [DOI] [Google Scholar]
  29. Uusitalo L, Lehikoinen A, Helle I, Myrberg K. An overview of methods to evaluate uncertainty of deterministic models in decision support. Environmental Modelling & Software. 2015;63:24–31. doi: 10.1016/J.ENVSOFT.2014.09.017. [DOI] [Google Scholar]
  30. van der Bles AM, van der Linden S, Freeman ALJ, Mitchell J, Galvao AB, Zaval L, Spiegelhalter DJ. Communicating uncertainty about facts, numbers and science. Royal Society Open Science. 2019;6:181870. doi: 10.1098/rsos.181870. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. van der Vaart E, Prangle D, Sibly RM. Taking error into account when fitting models using Approximate Bayesian Computation. Ecological Applications. 2018;28:267–274. doi: 10.1002/eap.1656. [DOI] [PubMed] [Google Scholar]
  32. Walker WE, Harremoës P, Rotmans J, van der Sluijs JP, van Asselt MBA, Janssen P, Krayer von Krauss MP. Defining uncertainty: A conceptual basis for uncertainty management in model-based decision support. Integrated Assessment. 2003;4:5–17. doi: 10.1076/iaij.4.1.5.16466. [DOI] [Google Scholar]
  33. Zurell D, Franklin J, König C, Bouchet PJ, Dormann CF, Elith J, Fandos G, Feng X, et al. A standard protocol for reporting species distribution models. Ecography. 2020 doi: 10.1111/ecog.04960. [DOI] [Google Scholar]
