
Innovation and forward-thinking are needed to improve traditional synthesis methods: a response to Pescott & Stewart

Accepted version
Peer-reviewed

Type

Article

Authors

Amano, Tatsuya 
Martin, Philip 
Shackelford, Gorm 
Simmons, Benno 

Abstract

  1. In Christie et al. (2019), we used simulations to quantitatively compare the bias of commonly used study designs in ecology and conservation. Based on these simulations, we proposed ‘accuracy weights’ as a potential way to account for study design validity in meta-analytic weighting methods. Pescott & Stewart (2021) raised concerns that these weights may not be generalisable and still lead to biased meta-estimates. Here we respond to their concerns and demonstrate why developing alternative weighting methods is key to the future of evidence synthesis.
  2. We acknowledge that our simple simulation unfairly penalised Randomised Controlled Trial (RCT) designs relative to Before-After Control-Impact (BACI) designs, as we assumed that the parallel trends assumption held for BACI designs. We point to an empirical follow-up study in which we more fairly quantify differences in biases between different study designs. However, we stand by our main findings that Before-After (BA), Control-Impact (CI), and After designs are quantifiably more biased than BACI and RCT designs. We also emphasise that our ‘accuracy weighting’ method was preliminary and welcome future research to incorporate more dimensions of study quality.
  3. We further show that over a decade of advances in quality effect modelling, which Pescott & Stewart (2021) omit, highlights the importance of research such as ours in better understanding how to quantitatively integrate data on study quality directly into meta-analyses. We further argue that the traditional methods advocated for by Pescott & Stewart (2021) (e.g., manual risk-of-bias assessments and inverse-variance weighting) are subjective, wasteful, and potentially biased themselves. They also lack scalability for use in large syntheses that keep up-to-date with the rapidly growing scientific literature.
  4. Synthesis and applications. We suggest, contrary to Pescott & Stewart’s narrative, that moving towards alternative weighting methods is key to future-proofing evidence synthesis through greater automation, flexibility, and updating to respond to decision-makers’ needs – particularly in crisis disciplines such as conservation science, where problematic biases and variability exist in the study designs, contexts, and metrics used. Whilst we must be cautious to avoid misinforming decision-makers, this should not stop us investigating alternative weighting methods that integrate study quality data directly into meta-analyses. To reliably and pragmatically inform decision-makers with science, we need efficient, scalable, readily automated, and feasible methods to appraise and weight studies to produce the large-scale living syntheses of the future.
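To make the contrast in the abstract concrete, the traditional inverse-variance weighting that Pescott & Stewart advocate, and the idea of additionally down-weighting less valid designs, can be sketched as below. This is a minimal illustration only: the function names, example numbers, and the multiplicative way the design weight enters are our assumptions, not the exact scheme proposed in Christie et al. (2019).

```python
# Minimal sketch of fixed-effect meta-analytic weighting.
# All names and numbers are illustrative, not from the paper.

def inverse_variance_meta(effects, variances):
    """Traditional pooling: each study weighted by 1 / sampling variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)  # variance of the pooled estimate
    return pooled, pooled_var

def design_weighted_meta(effects, variances, design_weights):
    """One hypothetical way a study-design 'accuracy weight' (0-1) could
    scale the inverse-variance weight; not the authors' exact method."""
    weights = [a / v for a, v in zip(design_weights, variances)]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled

# Hypothetical studies: effect sizes, sampling variances,
# and design weights (e.g. RCT/BACI near 1, After-only much lower).
effects = [0.4, 0.1, 0.3]
variances = [0.04, 0.01, 0.09]
design_weights = [1.0, 1.0, 0.2]

pooled_iv, pooled_iv_var = inverse_variance_meta(effects, variances)
pooled_dw = design_weighted_meta(effects, variances, design_weights)
```

Under these made-up numbers, the third (weaker-design) study pulls the design-weighted estimate less strongly than it does the plain inverse-variance estimate, which is the behaviour the proposed accuracy weights aim for.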

Keywords

automation, bias adjustment, critical appraisal, dynamic meta-analyses, evidence synthesis, living reviews, quality effects modelling, risk of bias

Journal Title

Journal of Applied Ecology

Journal ISSN

0021-8901
1365-2664

Publisher

Wiley

Sponsorship

NERC (1945942)
Natural Environment Research Council (1945942)
NERC (NE/L002507/1)
Australian Research Council Future Fellowship
Royal Commission for the Exhibition of 1851 Research Fellowship
Grantham Foundation for the Protection of the Environment
Natural Environment Research Council
The David and Claudia Harding Foundation
Kenneth Miller Trust
Arcadia Fund