TY - JOUR
T1 - Quantitating and assessing interoperability between electronic health records
AU - Bernstam, Elmer V.
AU - Warner, Jeremy L.
AU - Krauss, John C.
AU - Ambinder, Edward
AU - Rubinstein, Wendy S.
AU - Komatsoulis, George
AU - Miller, Robert S.
AU - Chen, James L.
N1 - Publisher Copyright:
© The Author(s) 2021. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For permissions, please email: [email protected].
PY - 2022/5/1
Y1 - 2022/5/1
AB - Objectives: Electronic health records (EHRs) contain a large quantity of machine-readable data. However, institutions choose different EHR vendors, and the same product may be implemented differently at different sites. Our goal was to quantify the interoperability of real-world EHR implementations with respect to clinically relevant structured data. Materials and Methods: We analyzed de-identified and aggregated data from 68 oncology sites that implemented 1 of 5 EHR vendor products. Using 6 medications and 6 laboratory tests for which well-accepted standards exist, we calculated inter- and intra-EHR vendor interoperability scores. Results: The mean intra-EHR vendor interoperability score was 0.68 as compared to a mean of 0.22 for inter-system interoperability, when weighted by number of systems of each type, and 0.57 and 0.20 when not weighting by number of systems of each type. Discussion: In contrast to data elements required for successful billing, clinically relevant data elements are rarely standardized, even though applicable standards exist. We chose a representative sample of laboratory tests and medications for oncology practices, but our set of data elements should be seen as an example, rather than a definitive list. Conclusions: We defined and demonstrated a quantitative measure of interoperability between site EHR systems and within/between implemented vendor systems. Two sites that share the same vendor are, on average, more interoperable. However, even for implementation of the same EHR product, interoperability is not guaranteed. Our results can inform institutional EHR selection, analysis, and optimization for interoperability.
KW - common data elements
KW - data aggregation
KW - data management
KW - data warehousing
KW - electronic health records
KW - information storage and retrieval
UR - http://www.scopus.com/inward/record.url?scp=85128489072&partnerID=8YFLogxK
U2 - 10.1093/jamia/ocab289
DO - 10.1093/jamia/ocab289
M3 - Article
C2 - 35015861
AN - SCOPUS:85128489072
SN - 1067-5027
VL - 29
SP - 753
EP - 760
JO - Journal of the American Medical Informatics Association: JAMIA
JF - Journal of the American Medical Informatics Association: JAMIA
IS - 5
ER -