11 Session 11: Quantitative synthesis workflow reporting
11.1 Learning outcomes
- Develop a checklist for reading systematic reviews and meta-analyses.
- Document a quantitative synthesis process.
- Consolidate understanding of key synthesis elements from the literature.
11.2 Context
Evidence implementation and reuse are non-trivial processes in most disciplines. It is crucial that experts are able to use, reuse, and implement synthesis findings. This process of critical appraisal advances a novel, big-picture view of the scientific findings reported in single studies. It can take the form of a perspective that routinely weights relative evidence by purpose, reasoning, and other findings to improve decision making by stakeholders, the public, and other scientists. Application of this process to published, peer-reviewed evidence can be limited by transparency, level of reporting, missing data, meta-data articulation, and limited moderator reporting. Hence, ten simple rules for evidence reuse were proposed, broadly, to highlight these and other challenges in synthesis science (Lortie and Owen 2020). Critical reading of meta-analyses is also a powerful skill for all experts (Lortie et al. 2013). This thinking and evaluation process has also been developed into a more prescriptive set of ten questions to apply to any published meta-analysis or systematic review (Nakagawa et al. 2017). These questions strengthen the reuse of current meta-analyses and provide a checklist for reporting in future systematic reviews and meta-analyses.
11.2.1 Checklist reporting
List from (Nakagawa et al. 2017); a template for recording answers to each question is sketched after this list:
- Is the search systematic and transparently documented?
- What question and what effect size?
- Is non-independence taken into account?
- Which meta-analytic model?
- Is the level of consistency among studies reported?
- Are the causes of variation among studies investigated?
- Are effects interpreted in terms of biological importance?
- Has publication bias been considered?
- Are results really robust and unbiased?
- Is the current state (and lack) of knowledge summarized?
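One lightweight way to apply this checklist is to record an answer and a short note for each question and each appraised synthesis. The sketch below assumes a simple tabular structure in R; the column names and the yes/partial/no coding are illustrative assumptions, not part of the published checklist.

```r
# Minimal sketch: the checklist as a reusable appraisal table in R.
# Column names and coding are assumptions for illustration only.
library(tibble)

appraisal_checklist <- tibble(
  question = c(
    "Is the search systematic and transparently documented?",
    "What question and what effect size?",
    "Is non-independence taken into account?",
    "Which meta-analytic model?",
    "Is the level of consistency among studies reported?",
    "Are the causes of variation among studies investigated?",
    "Are effects interpreted in terms of biological importance?",
    "Has publication bias been considered?",
    "Are results really robust and unbiased?",
    "Is the current state (and lack) of knowledge summarized?"
  ),
  # One pair of columns per appraised synthesis, e.g. "yes", "partial", "no".
  answer = NA_character_,
  note   = NA_character_
)

# readr::write_csv(appraisal_checklist, "appraisal-checklist.csv")
```

Duplicating the answer and note columns for each paper keeps the appraisals of an older and a more recent synthesis side by side for comparison.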
11.2.2 Purpose
Good reporting is supported by good thinking. Purpose prevails. A primer for systematic reviews and meta-analyses in sports science articulates a clear and direct purpose-delineation process for syntheses (Impellizzeri and Bizzini 2021).
11.2.3 Goals checklist
List from (Impellizzeri and Bizzini 2021):
- Identifying treatments that are not effective.
- Summarizing the likely magnitude of benefits of effective treatments.
- Identifying unanticipated risks of apparently effective treatments.
- Identifying gaps of knowledge.
- Auditing the quality of existing trials.
Collectively, these how-to papers suggest that it would be ideal if synthesis reporting exceeded the norms and standards associated with primary research reporting to enable next-level synthesis and reproducibility. Nonetheless, it is easy to get lost in the technical details. Consequently, purpose, audience, and reuse should be kept at the forefront of reviewing, reporting, and doing scientific syntheses. These ideas can be mobilized for knowledge mining even in the early steps of evidence retrieval and reviewing for inclusion in a synthesis project.
11.3 Challenge
- Apply the checklist to a systematic review and meta-analysis in your discipline published several years ago, and then again to a more recent synthesis study.
- Check whether the derived data were also published for each synthesis paper.
- Check the main text and supplements of each and note whether a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist or flow diagram was provided.
- Do a primary publication search using Google Scholar and/or Web of Science for your specific topic or process. Document the relative frequency of reporting by search terms (a template is sketched after this list).
- Review a small subset of the papers (screening only) and test the PRISMAstatement R package for this scoping review (see the sketch after this list).
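One possible structure for the search-documentation template is sketched below; the column names are assumptions, and the counts are left as NA to be filled in by hand from the Google Scholar and Web of Science result pages.

```r
# Sketch of a search log for documenting relative frequency of reporting.
# Column names are assumptions; counts are filled in manually per query.
library(tibble)

search_log <- tribble(
  ~source,          ~search_terms,                                  ~hits, ~screened, ~relevant,
  "Web of Science", "\"meta-analysis\" AND <your topic terms>",        NA,        NA,        NA,
  "Google Scholar", "\"systematic review\" AND <your topic terms>",    NA,        NA,        NA
)

# Relative frequency of relevant records per search, once counts are recorded.
search_log$relative_frequency <- search_log$relevant / search_log$hits
```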
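For the flow diagram, the PRISMAstatement package draws a PRISMA chart from the counts recorded at each screening step. The sketch below uses placeholder counts that must be replaced with the numbers from your own pilot screening; the argument names reflect the package's prisma() function, but confirm them with ?prisma in your installed version.

```r
# Minimal sketch of a retrospective PRISMA flow chart with PRISMAstatement.
# All counts are placeholders; replace them with your own screening numbers.
# install.packages("PRISMAstatement")
library(PRISMAstatement)

prisma(
  found = 100,               # records identified through database searching
  found_other = 10,          # records identified through other sources
  no_dupes = 95,             # records after duplicates removed
  screened = 95,             # records screened at title/abstract
  screen_exclusions = 60,    # records excluded at screening
  full_text = 35,            # full-text articles assessed for eligibility
  full_text_exclusions = 20, # full-text articles excluded
  qualitative = 15,          # studies included in qualitative synthesis
  quantitative = 10          # studies included in quantitative synthesis
)
```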
11.4 Products
- A checklist for review and reporting of these specific syntheses.
- A pilot dataset of the literature returned by the search(es) for a synthesis.
- A PRISMA statement flow chart for use in retrospective pilot reporting.
11.5 Resources
11.6 Reflection questions
- Was there evidence of a shift toward better synthesis reporting practices in your discipline?
- Were the derived data reported in the syntheses, and would it be possible to update or repeat a published synthesis in your discipline?
- Does the flow-chart approach to reviewing and reusing research align with your cognitive style and critical thinking approach to evidence?