Is computer interpretation of “Normal ECG” reliable?

Semi-automated ECG processing: A simple method to improve efficiency in ECG laboratories

Journal of Electrocardiology, 1975

A method to expedite processing of electrocardiograms (ECGs) is described. The hardware configuration utilizes conventional equipment, and the ECG data are stored on magnetic data cards. The electrocardiographer's interpretation is made using a specially developed code of 253 diagnostic statements of 2 to 9 words each. A minicomputer converts the two-character codes into full alphanumeric English statements. The diagnostic print-out appears on the same page as the reproduction of the original ECG data. This system has significantly reduced ECG processing time, freed manpower to increase the availability of technicians, and decreased the size of permanent files.
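The code-to-statement expansion described above amounts to a table lookup. A minimal sketch in Python, with invented two-character codes and statements (the actual 253-statement dictionary from the paper is not reproduced here):

```python
# Hypothetical dictionary mapping two-character diagnostic codes to
# full English statements; the codes and wording are illustrative only.
DIAGNOSTIC_CODES = {
    "NS": "Normal sinus rhythm",
    "AF": "Atrial fibrillation",
    "LV": "Left ventricular hypertrophy",
}

def expand_codes(codes):
    """Convert a list of two-character codes into full English statements."""
    return [DIAGNOSTIC_CODES.get(c, f"Unknown code: {c}") for c in codes]
```

For example, `expand_codes(["NS", "LV"])` yields the two corresponding full statements, ready to be printed alongside the ECG trace.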

Comparative evaluation of the IBM (12-lead) and Royal Infirmary (orthogonal three-lead) ECG computer programs

2010

A comparison of two computer programs for ECG interpretation was undertaken. Twelve-lead ECGs from 300 patients with various clinical abnormalities were interpreted at the National Institutes of Health using version 1 of the IBM program and corresponding orthogonal three-lead ECGs were analyzed by the Glasgow Royal Infirmary (GRI) program. Interpretations were compared with respect to the clinical documentation, wherever possible, and with each other directly in the case of diagnostic statements for which non-ECG documentation was not available. The two programs had a similar performance in determining abnormalities such as myocardial infarction and ventricular hypertrophy. However, with respect to conduction defects and ST-T-wave statements, certain discrepancies between the two program performances were revealed. There were 222 disagreements between various diagnostic statements. GRI was judged correct in 119 of these disagreements and IBM in 70. In these 189 cases the disagreement could most often be accounted for by different criteria and/or algorithms in the two programs or by the use of different ECG lead sets. The remaining 33 disagreements had to be classified as inconclusive.

Debatable issues in automated ECG Reporting

Journal of Electrocardiology

Four areas of topical interest in automated ECG reporting were debated at the 2017 International Society for Computerized Electrocardiology (ISCE) conference. These included the value of automated ECG interpretation, real-time ambulatory ECG monitoring, ECG imaging, and single-channel ECG rhythm interpretation. Advantages and disadvantages were elaborated for each of the four topics.

Assessment of the performance of electrocardiographic computer programs with the use of a reference data base

Circulation, 1985

To allow an exchange of measurements and criteria between different electrocardiographic (ECG) computer programs, an international cooperative project has been initiated aimed at standardization of computer-derived ECG measurements. To this end an ECG reference library of 250 ECGs with selected abnormalities was established and a comprehensive reviewing scheme was devised for the visual determination of the onsets and offsets of P, QRS, and T waves. This task was performed by a group of cardiologists on highly amplified, selected complexes from the library of ECGs. With use of a modified Delphi approach, individual outlying point estimates were eliminated in four successive rounds. In this way final referee estimates were obtained that proved to be highly reproducible and precise. This reference data base was used to study measurement results obtained with nine vectorcardiographic and 10 standard 12-lead ECG analysis programs. The medians of program determinations of P, QRS, and T w...
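The modified Delphi approach described above, where outlying point estimates are eliminated over successive rounds until a stable referee estimate remains, can be sketched as follows. This is a minimal illustration using a median/MAD rule; the actual elimination criteria, number of reviewers, and round structure of the project are not specified here and the tolerance factor is an assumption:

```python
from statistics import median

def delphi_estimate(values, rounds=4, tol_factor=1.5):
    """Iteratively drop estimates far from the group median over several
    rounds, mimicking the elimination of outlying point estimates.
    tol_factor is a hypothetical tolerance, not taken from the paper."""
    vals = list(values)
    for _ in range(rounds):
        m = median(vals)
        # Median absolute deviation as a robust measure of spread.
        spread = median(abs(v - m) for v in vals) or 1.0
        kept = [v for v in vals if abs(v - m) <= tol_factor * spread]
        if len(kept) == len(vals) or len(kept) < 3:
            break  # converged, or too few estimates left to prune further
        vals = kept
    return median(vals)
```

Applied to a set of wave-onset estimates in milliseconds, e.g. `delphi_estimate([100, 102, 101, 99, 140])`, the clear outlier 140 is discarded before the final median is taken.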

Diagnostic interpretation of electrocardiograms in population-based research: Computer program research physicians, or cardiologists?

Journal of Clinical Epidemiology, 1997

We assessed the performance of diagnostic electrocardiogram (ECG) interpretation by the computer program MEANS and by research physicians, compared to cardiologists, in a population-based study. To establish a strategy for ECG interpretation in health surveys, we also studied the diagnostic capacity of three scenarios: use of the computer program alone (A), computer program and cardiologist (B), and computer program, research physician, and cardiologist (C). A stratified random sample of 381 ECGs was drawn from ECGs collected in the Rotterdam Study (n = 3057), which were interpreted both by a trained research physician using a form for structured clinical evaluation and by MEANS. All ECGs were interpreted independently by two cardiologists; if they disagreed (n = 175), the ECG was judged by a third cardiologist. Five ECG diagnoses were considered: anterior and inferior myocardial infarction (MI), left and right bundle branch block (LBBB and RBBB), and left ventricular hypertrophy (LVH). Overall, sensitivities and specificities of MEANS and the research physicians were high. The sensitivity of MEANS ranged from 73.8% to 92.9% and that of the research physicians from 71.8% to 96.9%. The specificity of MEANS ranged from 97.5% to 99.8% and that of the research physicians from 96.3% to 99.6%. To diagnose LVH, LBBB, and RBBB, use of the computer program alone gives satisfactory results. Preferably, all positive findings of anterior and inferior MI by the program should be verified by a cardiologist. We conclude that diagnostic ECG interpretation by computer can be very helpful in population-based research, being at least as good as ECG interpretation by a trained research physician, but much more efficient and therefore less expensive.
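The sensitivity and specificity figures quoted above come from comparing each reader's diagnoses against the cardiologists' consensus. A minimal sketch of that computation, assuming diagnoses are encoded as paired boolean labels (the function name and encoding are illustrative, not from the study):

```python
def sensitivity_specificity(predicted, actual):
    """Compute (sensitivity, specificity) from paired boolean labels,
    e.g. program diagnoses vs. the cardiologists' consensus standard."""
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    tn = sum(1 for p, a in zip(predicted, actual) if not p and not a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    return tp / (tp + fn), tn / (tn + fp)
```

In the study's terms, one such pair of values would be computed per diagnosis (anterior MI, inferior MI, LBBB, RBBB, LVH) and per reader (MEANS or research physician).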

Automated ECG Interpretation—A Brief History from High Expectations to Deepest Networks

Hearts

This article traces the development of automated electrocardiography from its beginnings in Washington, DC around 1960 through to its current widespread application worldwide. Changes in the methodology of recording ECGs in analogue form using sizeable equipment through to digital recording, even in wearables, are included. Methods of analysis are considered from single lead to three leads to twelve leads. Some of the influential figures are mentioned while work undertaken locally is used to outline the progress of the technique mirrored in other centres. Applications of artificial intelligence are also considered so that the reader can find out how the field has been constantly evolving over the past 50 years.