Cochrane Database of Systematic Reviews. 2021 Mar 25;2021(3):CD013717. doi: 10.1002/14651858.CD013717.pub2
Aspect: Model structure
Source: Philips 2006

Question 1. Are the structural assumptions transparent and justified?
Application in this review: Assess whether all structural model assumptions are explicitly stated and whether the authors substantiate these assumptions either through theoretical reasoning or through prior knowledge from the literature.
Examples:
• Description of the model type and defining equations (an illustrative sketch follows this entry)
• Comprehensible explanation of the model variables and equations
• Description of the features of the disease captured by the model, e.g. a randomly distributed incubation time
• Explanation of the implications of the model structure, in text or through graphical representations visualising the simulation pathway, e.g. a scheme of the context being modelled
• Description of model limitations and simplifying assumptions

Question 2. Are the structural assumptions reasonable given the overall objective, perspective and scope of the model?
Application in this review: Consider whether the structural assumptions are consistent with what is known about the phenomenon of interest in the literature. In case of disagreement, assess to what extent these discrepancies undermine the overall validity of the results and conclusions.
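As a hypothetical illustration of what a "model type and defining equations" description can look like, consider a standard deterministic SEIR compartmental model (a generic textbook formulation used here only as an example, not the equations of any included study):

  dS/dt = −β·S·I/N
  dE/dt = β·S·I/N − σ·E
  dI/dt = σ·E − γ·I
  dR/dt = γ·I

where N = S + E + I + R, β is the transmission rate, 1/σ the mean incubation (latent) period and 1/γ the mean infectious period. This formulation implicitly assumes an exponentially distributed incubation time; a model assuming, say, a gamma-distributed incubation time would need additional latent compartments or a different structure, which is exactly the kind of structural assumption Questions 1 and 2 ask authors to state and justify.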
Aspect: Input data
Source: Caro 2014

Question 3. Are the input parameters transparent and justified?
Application in this review: Assess whether the values of all input parameters are explicitly stated and whether the authors substantiate these values either through theoretical reasoning or through prior knowledge from the literature.
Examples:
• Epidemiological characteristics known from other studies
• Inputs to data calibration algorithms
• Table with input parameters and the probability distributions used for probabilistic modelling (a minimal illustration follows this entry)
• Explanation and discussion of the choice of parameter values, with appropriate citations

Question 4. Are the input parameters reasonable?
Application in this review: Consider whether the input parameter values are consistent with what is known about the phenomenon of interest in the literature. In case of disagreement, assess to what extent these discrepancies undermine the overall validity of the results and conclusions.
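Such an input-parameter table can be as simple as a structured listing of each parameter's point value, the distribution used for probabilistic modelling and its source. The Python sketch below is purely illustrative; the parameter names, values, distributions and sources are placeholders, not data from any included study.

# Hypothetical input-parameter table for an SEIR-type quarantine model;
# all values, distributions and citations are placeholders for illustration only.
input_parameters = {
    "basic_reproduction_number": {"value": 2.5, "distribution": "Normal(2.5, 0.3)",
                                  "source": "[placeholder citation]"},
    "incubation_period_days":    {"value": 5.0, "distribution": "Gamma(mean 5, sd 1)",
                                  "source": "[placeholder citation]"},
    "infectious_period_days":    {"value": 7.0, "distribution": "Uniform(6, 9)",
                                  "source": "[placeholder citation]"},
    "quarantine_adherence":      {"value": 0.7, "distribution": "Beta(7, 3)",
                                  "source": "[assumption, varied in sensitivity analysis]"},
}

for name, p in input_parameters.items():
    print(f"{name:28s} value={p['value']:<6} {p['distribution']:22s} {p['source']}")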
Aspect: Validation (external)
Source: Caro 2014

Question 5. Has the external validation process been described?
Application in this review: Assess whether there was a formal process of comparing the predictions of the model with (1) the data source that was used to build the model (dependent validation), (2) a data source that was not used to build the model, e.g. an independent country (independent validation), or (3) future values that were not used in model building (predictive validation).
Examples:
• Calibration of an SEIR model to case data (dependent validation)
• Prediction of a subset of observed data points based on a training data set and comparison with a validation data set (dependent validation; a sketch of such a holdout comparison follows this entry)
• Prediction of data points for a country/region that was not part of the model fitting and calibration process, and comparison with the observed data (independent validation)
• Prediction of future values that were not used in model building (predictive validation)

Question 6. Has the model been shown to be externally valid?
Application in this review: Consider the extent to which the model predictions agree with the data sources that were selected for the external validation process.
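A minimal sketch in Python of the holdout form of dependent validation mentioned above, assuming a simple SEIR model fitted to synthetic case counts; the parameter values, noise model and error metric are illustrative assumptions, not the methods of any included study.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def seir(t, y, beta, sigma, gamma, N):
    # standard SEIR right-hand side
    S, E, I, R = y
    return [-beta*S*I/N, beta*S*I/N - sigma*E, sigma*E - gamma*I, gamma*I]

def infectious(beta, gamma, days, N=1_000_000, sigma=1/5):
    # number infectious on each day, for given transmission and recovery rates
    sol = solve_ivp(seir, (0, days), [N - 10, 0, 10, 0],
                    args=(beta, sigma, gamma, N), t_eval=np.arange(days + 1))
    return np.clip(sol.y[2], 0, None)

days = 60
observed = np.random.default_rng(1).poisson(infectious(0.4, 1/7, days))  # synthetic "case" data
train, holdout = observed[:40], observed[40:]   # first 40 days used for fitting

def loss(params):
    # sum of squared errors on the training window only
    beta, gamma = params
    return np.sum((infectious(beta, gamma, 39) - train) ** 2)

fit = minimize(loss, x0=[0.3, 0.1], bounds=[(0.05, 1.5), (0.02, 1.0)])
beta_hat, gamma_hat = fit.x

pred = infectious(beta_hat, gamma_hat, days)[40:]        # prediction for the held-out days
mape = np.mean(np.abs(pred - holdout) / np.maximum(holdout, 1))
print(f"fitted beta={beta_hat:.3f}, gamma={gamma_hat:.3f}, holdout MAPE={mape:.1%}")

Close agreement on the held-out window supports (but does not prove) external validity; independent and predictive validation follow the same pattern with a comparison data set from another country/region or from later time points.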
Aspect: Validation (internal)
Source: Caro 2014

Question 7. Has the internal validation process been described?
Application in this review: Assess whether there was a formal process of verifying the extent to which the mathematical calculations are consistent with the model's specifications, e.g. in the form of a simulation study in which the mathematical calculations are applied to data that were simulated according to the model with known parameter values.
Examples:
• Application of the model to simulated data to establish that the analyses work as intended (a parameter-recovery sketch follows this entry)
• Code review process, conducted by the authors or by an independent source, to ensure correct implementation of the mathematical structure
• Independent replication of the model

Question 8. Has the model been shown to be internally valid?
Application in this review: Consider the extent to which the results of the internal validation process indicate that the mathematical calculations are consistent with the model's specifications.
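A minimal sketch in Python of the simulation-study form of internal validation described above, assuming the same simple SEIR structure; all values are illustrative.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def seir(t, y, beta, sigma, gamma, N):
    S, E, I, R = y
    return [-beta*S*I/N, beta*S*I/N - sigma*E, sigma*E - gamma*I, gamma*I]

def infectious_curve(beta, gamma, N=100_000, sigma=1/5, days=50):
    sol = solve_ivp(seir, (0, days), [N - 5, 0, 5, 0],
                    args=(beta, sigma, gamma, N), t_eval=np.arange(days + 1))
    return np.clip(sol.y[2], 0, None)

# 1) simulate data from the model with KNOWN parameter values
true_beta, true_gamma = 0.45, 1/6
simulated = np.random.default_rng(7).poisson(infectious_curve(true_beta, true_gamma))

# 2) refit the model to the simulated data
def loss(params):
    return np.sum((infectious_curve(*params) - simulated) ** 2)

fit = minimize(loss, x0=[0.3, 0.1], bounds=[(0.05, 1.5), (0.02, 1.0)])

# 3) check that the known values are recovered
print("true:", (true_beta, round(true_gamma, 3)), " recovered:", np.round(fit.x, 3))

Recovering the known parameter values indicates that the fitting calculations behave as specified; systematic discrepancies would point to an implementation error or an identifiability problem.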
Aspect: Uncertainty
Source: Caro 2014

Question 9. Was there an adequate assessment of the effects of uncertainty?
Application in this review: Consider whether the robustness of the results to alternative input parameter values or model assumptions was assessed, either by reporting the results of specific sensitivity analyses or through an app in which readers can themselves explore the effects of varying these model assumptions and input parameter values.
Examples:
• Structural and parameter sensitivity analyses (a probabilistic sensitivity-analysis sketch follows this entry)
• Inherent stochasticity due to the simulation nature of the model
• Reporting of an app in which the effects of input changes can be tracked
• Propagation of the present uncertainties to the outcomes
• Was the model probabilistic, i.e. were parameter values fixed or sampled from a distribution?
• Is uncertainty transparently reported, described and justified?
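A minimal sketch in Python of a probabilistic (parameter) sensitivity analysis as described above; the distributions and the outcome (peak number infectious) are illustrative assumptions only.

import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma, N):
    S, E, I, R = y
    return [-beta*S*I/N, beta*S*I/N - sigma*E, sigma*E - gamma*I, gamma*I]

def peak_infectious(beta, sigma, gamma, N=1_000_000, days=200):
    sol = solve_ivp(seir, (0, days), [N - 10, 0, 10, 0],
                    args=(beta, sigma, gamma, N), t_eval=np.arange(days + 1))
    return sol.y[2].max()

# sample the input parameters from assumed distributions ...
rng = np.random.default_rng(0)
draws = 500
betas  = rng.normal(0.40, 0.05, draws)        # transmission rate
sigmas = 1 / rng.uniform(4, 6, draws)         # incubation period of 4-6 days
gammas = 1 / rng.uniform(6, 9, draws)         # infectious period of 6-9 days

# ... run the model for each draw and summarise the induced outcome uncertainty
peaks = np.array([peak_infectious(b, s, g) for b, s, g in zip(betas, sigmas, gammas)])
lo, med, hi = np.percentile(peaks, [2.5, 50, 97.5])
print(f"peak infectious: median {med:,.0f} (95% interval {lo:,.0f} to {hi:,.0f})")

Reporting such an interval alongside the base-case estimate is one concrete way of propagating the input uncertainties to the outcomes.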

Aspect: Transparency
Source: Caro 2014

Question 10. Was technical documentation, in sufficient detail to allow (potentially) for replication, made available openly or under agreements that protect intellectual property?
Application in this review: Assess whether the description of the analyses (including model structure, input parameters, data sources and methods) is sufficiently detailed to allow for the replication of the results. In particular, consider whether the code that was used to obtain the results is freely available and well documented.
Examples:
• Description of the model that is qualitatively extensive enough to allow scrutiny by other researchers (e.g. in supplementary material)
• Do the authors encourage replication by describing a procedure for obtaining the code?
• Do the authors only refer to other, similar models for justification and detailed methodological description, or do they provide their own documentation?