We presented an excellent piece of work demonstrating the effectiveness of peer review in our reports from the Chicago congress (September 2013). We were awaiting the publication, and it was posted online by The BMJ on 1 July 2014 under the title below. Congratulations to Isabelle Boutron for her collaboration on this work. The article contains somewhat more data than the oral presentation in Chicago, as one would expect. The method is original, as it involved comparing manuscripts before and after peer review.
Impact of peer review on reports of randomised trials published in open peer review journals: retrospective before and after study
What is already known on this topic
- Despite the widespread use of peer review, little is known about its impact on the quality of reporting of published research articles
- Inadequacies in the methodology and reporting of research are widely recognised
- Substantial uncertainty exists about the peer review process as a mechanism to improve reporting of the scientific literature
What this study adds
- Peer reviewers often fail to detect important deficiencies in the reporting of the methods and results of randomised trials
- Peer reviewers requested relatively few changes for reporting of trial methods and results
- Most requests had a positive impact on reporting but in some instances the requested changes could have a negative impact
And below, the results from the abstract:
Of the 93 trial reports, 38% (n=35) did not describe the method of random sequence generation, 54% (n=50) concealment of allocation sequence, 50% (n=46) whether the study was blinded, 34% (n=32) the sample size calculation, 35% (n=33) specification of primary and secondary outcomes, 55% (n=51) results for the primary outcome, and 90% (n=84) details of the trial protocol. The number of changes between manuscript versions was relatively small; most involved adding new information or altering existing information. Most changes requested by peer reviewers had a positive impact on the reporting of the final manuscript—for example, adding or clarifying randomisation and blinding (n=27), sample size (n=15), primary and secondary outcomes (n=16), results for primary or secondary outcomes (n=14), and toning down conclusions to reflect the results (n=27). Some changes requested by peer reviewers, however, had a negative impact, such as adding additional unplanned analyses (n=15).