What's the accuracy of crowdsourcing the screening of search results? Help Cochrane find out!


Cochrane Crowd is a citizen science platform where a global community of volunteers helps to classify the research needed to support informed decision-making about healthcare. Cochrane Crowd volunteers review descriptions of research studies to identify and classify clinical trials.

A new task has just gone live on Cochrane Crowd: a citation screening task that we are running in partnership with The Healthcare Improvement Studies Institute (THIS Institute).

It forms part of a methodological study that aims to assess the accuracy of crowdsourcing the screening of search results. Unlike some of our previous studies, this one is a little different. Instead of asking you to assess a record for possible relevance, we want you to assess it for irrelevance! Our hypothesis is that a crowd can still make a big difference in weeding out the obviously irrelevant records, and that framing the task this way will reduce the chances of possibly relevant records being rejected.

New Year, New Task

Are you up for joining this task? If so, head to crowd.cochrane.org and log in. On your tasks page you should see a task called: Training for healthcare professionals in electronic fetal monitoring using cardiotocograph.



We are going to run this as a randomised study. When you click on the training module, you will be randomised to one of three tasks. The three tasks will look exactly the same; the difference between them is the agreement algorithm running in the background. This algorithm assigns a 'final' classification to a record based on a certain number and order of individual classifications made by contributors. We are testing three different agreement algorithms as part of this methodological study.
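To make the idea of an agreement algorithm concrete, here is a minimal illustrative sketch in Python. The actual algorithms being compared in this study are not described in this post, so the rule below (a record is resolved once a set number of consecutive matching classifications is reached, otherwise it goes to expert review) and all names in the code are purely hypothetical.

```python
def resolve(classifications, required_agreement=3):
    """Hypothetical agreement rule: return a final label once
    `required_agreement` consecutive identical classifications are seen;
    otherwise flag the record for expert review.

    `classifications` is the ordered list of labels submitted by
    individual contributors for one record (e.g. "Reject", "Possible").
    """
    streak_label, streak_len = None, 0
    for label in classifications:
        if label == streak_label:
            streak_len += 1
        else:
            # A differing classification resets the streak.
            streak_label, streak_len = label, 1
        if streak_len >= required_agreement:
            return streak_label
    return "needs expert review"


# Three contributors agree the record is irrelevant: final label "Reject".
print(resolve(["Reject", "Reject", "Reject"]))
# A disagreement breaks the streak, so this record is escalated.
print(resolve(["Reject", "Possible", "Reject", "Reject"]))
```

Different choices of threshold and ordering rule change how many contributor classifications each record needs and how often records are escalated, which is exactly the kind of trade-off a study like this can measure.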

There is, of course, a training module. It should take only around 10-15 minutes to complete. Once done, you will be able to screen some 'real' records. Do as many as you like. If you manage 250 or more, you will receive named acknowledgement in any write-ups of this methods study and be able to download a certificate.


As always, this kind of work would not be possible without the help of this fantastic community. If you are able to take part, then thank you very much indeed from the teams at THIS Institute and Cochrane Crowd.

If you have any questions, please don’t hesitate to get in touch with me: anna.noel-storr@rdm.ox.ac.uk

With best wishes to all and happy citation screening!

Anna and Sarah


Friday, January 14, 2022