TY - JOUR
T1 - Utility of an Automated Radiology-Pathology Feedback Tool
AU - Doshi, Ankur M.
AU - Huang, Chenchan
AU - Melamud, Kira
AU - Shanbhogue, Krishna
AU - Slywotsky, Chrystia
AU - Taffel, Myles
AU - Moore, William
AU - Recht, Michael
AU - Kim, Danny
N1 - Publisher Copyright:
© 2019 American College of Radiology
PY - 2019/9
Y1 - 2019/9
N2 - Purpose: To determine the utility of an automated radiology-pathology feedback tool. Methods: We previously developed a tool that automatically provides radiologists with pathology results related to imaging examinations they interpreted. The tool also allows radiologists to mark the results as concordant or discordant. Five abdominal radiologists prospectively scored their own discordant results related to their previously interpreted abdominal ultrasound, CT, and MR interpretations between August 2017 and June 2018. Radiologists recorded whether they would have followed up on the case if there was no automated alert, reason for the discordance, whether the result required further action, prompted imaging rereview, influenced future interpretations, enhanced teaching files, or inspired a research idea. Results: There were 234 total discordances (range 30-66 per radiologist), and 70.5% (165 of 234) of discordances would not have been manually followed up in the absence of the automated tool. Reasons for discordances included missed findings (10.7%; 25 of 234), misinterpreted findings (29.1%; 68 of 234), possible biopsy sampling error (13.2%; 31 of 234), and limitations of imaging techniques (32.1%; 75 of 234). In addition, 4.7% (11 of 234) required further radiologist action, including report addenda or discussion with referrer or pathologist, and 93.2% (218 of 234) prompted radiologists to rereview the images. Radiologists reported that they learned from 88% (206 of 234) of discordances, 38.6% (90 of 233) of discordances probably or definitely influenced future interpretations, 55.6% (130 of 234) of discordances prompted the radiologist to add the case to his or her teaching files, and 13.7% (32 of 233) inspired a research idea. Conclusion: Automated pathology feedback provides a valuable opportunity for radiologists across experience levels to learn, increase their skill, and improve patient care.
KW - Quality
KW - peer learning tools
KW - radiology-pathology correlation
KW - radiology-pathology discordance
UR - http://www.scopus.com/inward/record.url?scp=85065014770&partnerID=8YFLogxK
U2 - 10.1016/j.jacr.2019.03.001
DO - 10.1016/j.jacr.2019.03.001
M3 - Article
C2 - 31072775
AN - SCOPUS:85065014770
SN - 1558-349X
VL - 16
SP - 1211
EP - 1217
JO - Journal of the American College of Radiology
JF - Journal of the American College of Radiology
IS - 9
ER -