The Embase project 2: crowdsourcing citation screening

ID: 

O 10.1

Session: 

Oral session 10: Filtering the information overload for better decisions

Date: 

Wednesday 7 October 2015 - 11:00 to 12:30

Location: 

All authors in correct order:

Noel-Storr A1, Dooley G2, Glanville J3, Foxlee R4
1 Cochrane Dementia and Cognitive Improvement Group, Oxford University, United Kingdom
2 Metaxis Ltd, United Kingdom
3 York Health Economics Consortium, United Kingdom
4 Cochrane Editorial Unit, United Kingdom
Presenting author and contact person

Presenting author:

Anna Noel-Storr

Contact person:

Abstract text
Background: The Embase project has been managed since April 2013 by a consortium made up of Metaxis Ltd, the Cochrane Dementia and Cognitive Improvement Group and the York Health Economics Consortium. It uses a novel crowdsourcing method to screen citations.
Objectives: To evaluate the effectiveness of using crowdsourcing to identify unique reports of randomised trials in Embase and to submit those reports to Cochrane’s Central Register of Controlled Trials (CENTRAL).
Methods: We recruited a crowd to screen the search results identified from the monthly sensitive searches run in Embase (via Ovid SP). Using a bespoke online citation screening tool, the crowd classifies citations as 'RCT/CCT', 'Reject', or 'Unsure'. The main outcome measures are crowd performance in terms of collective classification accuracy, quantity screened, timeliness, and screener recruitment, retention and engagement.
Results: To date (March 2015), over 950 people have signed up to take part in the project and over 120,000 citations have been collectively screened by the crowd. The results of four independent validation studies (two completed; two ongoing) assessing crowd accuracy will be presented. The two completed validations show crowd sensitivity of 99.8% and 99.9%, and specificity of 99.8% and 99.7%. The reference standard used in both cases was determined by expert screeners.
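For readers less familiar with these accuracy measures, sensitivity and specificity can be derived from a comparison of crowd decisions against the reference standard. The sketch below uses illustrative counts only (they are not the project's actual validation data):

```python
# Crowd screening accuracy versus an expert reference standard.
# All counts below are illustrative, not figures from the Embase project.

def sensitivity(tp, fn):
    # Proportion of true RCT/CCT reports the crowd correctly identified.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Proportion of non-trial citations the crowd correctly rejected.
    return tn / (tn + fp)

# Example: 998 of 1,000 true trials identified; 9,970 of 10,000
# non-trial citations rejected.
print(round(100 * sensitivity(tp=998, fn=2), 1))    # 99.8
print(round(100 * specificity(tn=9970, fp=30), 1))  # 99.7
```

In a screening context, high sensitivity matters most: a missed trial is lost from CENTRAL, whereas a wrongly retained citation is simply removed at a later checking stage.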
Conclusions: This new approach to screening has brought significant efficiencies to trial identification. Crowdsourcing has proved both feasible in terms of recruitment and methodologically robust. As a result, more trials have been identified more quickly, making CENTRAL a more valued repository of trial reports.