A flexible online tool for the management of complex evidence-synthesis projects




Poster session 3 Friday: Evidence Tools / Evidence synthesis - creation, publication and updating in the digital age


Friday 15 September 2017 - 12:30 to 14:00


All authors in correct order:

Pérez Rada D1, Zhang Y2, Etxeandia-Ikobaltzeta I2, Wiercioch W2, Canepa A1, Nieuwlaat R2, Couban R3, Schünemann H2, Rada G4
1 Epistemonikos Foundation, Chile
2 Department of Health Research Methods, Evidence, and Impact, McMaster University, Canada
3 McMaster University, Canada
4 Epistemonikos Foundation; Centro Evidencia UC, Pontificia Universidad Católica de Chile, Chile
Presenting author and contact person

Presenting author:

Daniel Pérez Rada

Contact person:

Abstract text
Background: One possible solution to improve timeliness in the production of systematic reviews is parallelisation of tasks. However, for large reviews, tight timelines, or both, the capacity for parallelisation is limited. Even when more people can be recruited, it becomes difficult to manage the team centrally, to train new members, and to maintain quality control.
In the context of the development of a systematic review on patient values and preferences for the venous thromboembolism (VTE) guidelines of the American Society of Hematology, we tested Collaboratron™, a new tool created to facilitate these tasks.

Objectives: To describe the experience of using an online tool designed to coordinate the work of multiple reviewers with different levels of expertise on a large systematic review.

Methods: The systematic review addressed several questions, which were to be screened simultaneously.
Both teams designed workflows and iterated on them using agile methods, frequent prototyping, and rapid testing; the technology team then transferred these workflows to the tool.
It was also agreed that multiple screeners with different levels of expertise would participate, so a calibration exercise was needed, along with a way of pairing experienced and less experienced reviewers.

Results: The designed workflow consisted of four questions: Is the record relevant? Is it about VTE? Is it about values and preferences? Is it about acceptability or implementation? (See Figure 1.)
Any record could be allocated to one of five mutually exclusive folders, and each question could be answered yes, no, or unclear.
All the screeners completed a calibration sample consisting of 50 records, followed by feedback from the central team.
Twenty-two people screened 10,193 records in duplicate, divided into two asymmetric groups (10/13 people) of experienced and less experienced screeners. The total screening time was 19 days. Discrepancies arose in 807 records and were resolved by two arbiters.
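The four-question triage above can be modeled as a sequential decision: the first "no" routes a record to that question's exclusion folder, while passing all four questions lands it in an inclusion folder. The sketch below illustrates this logic only; the question keys and folder names are assumptions, not the labels used in Collaboratron™, and it conservatively treats "unclear" like "yes" so that records proceed to the next question, which keeps the folders at the five mutually exclusive categories described above.

```python
# Illustrative sketch of a four-question sequential triage.
# Question keys and folder names are hypothetical.
QUESTIONS = [
    ("is_relevant", "excluded_not_relevant"),
    ("is_vte", "excluded_not_vte"),
    ("is_values_preferences", "excluded_not_values_preferences"),
    ("is_acceptability_implementation", "excluded_not_acceptability"),
]

def allocate(answers):
    """Return the folder for a record given yes/no/unclear answers.

    `answers` maps each question key to "yes", "no", or "unclear".
    The first "no" sends the record to that question's exclusion
    folder; "yes" and "unclear" both pass the record to the next
    question (conservative screening); passing all four questions
    allocates the record to "included".
    """
    for key, exclusion_folder in QUESTIONS:
        if answers.get(key, "unclear") == "no":
            return exclusion_folder
    return "included"

# Example: a record judged relevant and about VTE, but not about
# values and preferences, goes to that exclusion folder.
record = {"is_relevant": "yes", "is_vte": "yes",
          "is_values_preferences": "no",
          "is_acceptability_implementation": "yes"}
print(allocate(record))  # excluded_not_values_preferences
```

In duplicate screening, two reviewers' folder allocations for the same record would be compared, and any mismatch flagged as a discrepancy for arbitration, as was done for the 807 discrepant records.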

Conclusions: A flexible online tool and close collaboration between the technology team and the guideline development team made it possible to fulfil the needs of a large and complex project within a short timeframe.