Background. Preventable diagnostic errors place a large burden on healthcare. Cognitive reasoning tools, that is, tools that aim to improve clinical reasoning, are commonly suggested interventions. However, quantitative estimates of tool effectiveness have been aggregated over both workplace-oriented and education-oriented tools, leaving the impact of workplace-oriented cognitive reasoning tools alone unclear. This systematic review and meta-analysis aims to estimate the effect of cognitive reasoning tools on the diagnostic performance of medical professionals and students, and to identify factors associated with larger improvements.

Methods. Controlled experimental studies assessing whether cognitive reasoning tools improved the diagnostic accuracy of individual medical students or professionals in a workplace setting were included. Embase.com, Medline ALL via Ovid, Web of Science Core Collection, the Cochrane Central Register of Controlled Trials and Google Scholar were searched from inception to 15 October 2021, supplemented with handsearching. Meta-analysis was performed using a random-effects model.

Results. The literature search yielded 4546 articles, of which 29 studies with data from 2732 participants were included in the meta-analysis. The pooled estimate showed considerable heterogeneity (I² = 70%). This was reduced to I² = 38% by removing three studies that offered training with the tool before the intervention effect was measured. After removing these studies, the pooled estimate indicated that cognitive reasoning tools led to a small improvement in diagnostic accuracy (Hedges' g = 0.20, 95% CI 0.10 to 0.29, p < 0.001). There were no significant subgroup differences.

Conclusion. Cognitive reasoning tools resulted in small but clinically important improvements in the diagnostic accuracy of medical students and professionals, although no factors associated with larger improvements could be identified. Cognitive reasoning tools could be routinely implemented to improve diagnosis in practice, but going forward, more large-scale studies and evaluations of these tools in practice are needed to determine how they can be effectively implemented.
Commentary by Dr Marius Laurent (PAQS)
- The article evaluates interventions that seek to improve diagnostic performance by enhancing the practitioner's cognitive faculties (checklists, computerised decision aids, training in structured reasoning, etc.), rather than the content of their knowledge. The merit of this systematic review and its accompanying meta-analysis is that it excludes cognitive interventions that aim to teach students or novice diagnosticians good diagnostic practice, and evaluates only those tools that can be used at the patient's bedside in everyday clinical practice, unlike the review by Lambe, for example [1]. The result is not spectacular, but it is modestly and significantly positive. The study does not identify populations for which these measures would be more effective (according to clinical experience, for example), nor which types of intervention work best. The result is both reassuring and disappointing: reassuring insofar as it reconciles proponents of cognitive interventions with those who stress the importance of a store of clinical knowledge and its use; disappointing in the size of the effect, its heterogeneity and the quality of many of the studies.
Staal J, Hooftman J, Gunput STG, et al. Effect on diagnostic accuracy of cognitive reasoning tools for the workplace setting: systematic review and meta-analysis. BMJ Qual Saf 2022. doi:10.1136/bmjqs-2022-014865.
Note:
1. Lambe KA, O'Reilly G, Kelly BD, et al. Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf 2016;25(10):808-820.