Diagnostic errors are estimated to account for 40,000–80,000 deaths annually in U.S. hospitals alone [1]. These figures only partially account for patients whose ambulatory misdiagnoses lead to death, and they do not include nonlethal disability, which may be just as common as death [2]. Tort claims for negligent diagnostic errors result in billions of dollars in payouts annually [2]. Nearly 75% of all medical malpractice claims against radiologists are related to diagnostic errors [3]. Every radiologist worries about missing a diagnosis or erring too heavily on the side of caution and giving a false-positive reading [4].
Definition, Prevalence, and Impact of Diagnostic Errors
Diagnostic error has been defined as a diagnosis that is missed, wrong, or delayed, as detected by some subsequent definitive test or finding [5]. Here we use the terms “diagnostic error” and “misdiagnosis” interchangeably. In radiology, the most common problem leading to medical malpractice lawsuits is failure to diagnose [6], meaning oversight of abnormalities or misinterpretation of radiologic images [3].
Errors in diagnostic radiology have long been recognized, beginning with the pioneering work of Garland [8] in 1949. Multiple studies have identified suboptimal radiology processes as contributors to the overwhelming number of medical errors and escalating economic costs, estimated at more than $38 billion annually [9]. Overall, approximately 30% of abnormal radiographic studies are missed, and approximately 4% of radiologic interpretations rendered by radiologists in daily practice contain errors [11]. Quekel et al. [12] found that 19% of lung cancers presenting as a nodule with a median diameter of 16 mm on chest radiographs were missed, and even higher miss rates, between 25% and 90%, have been reported in the literature [13].
Mammography has been the standard of care for the detection of breast carcinoma. However, breast cancer is missed in 4–30% of screening mammography studies according to multiple randomized controlled trials [16]. Given that 38,294,403 mammography studies were performed annually in the United States as of 2013, it is evident why radiologic misdiagnosis is an important public health issue [18]. Moreover, screening mammography also results in overdiagnosis in 1–54% of cases; overdiagnosis refers to screen-detected cancers that would not have become symptomatic during a woman's lifetime if no screening had taken place [19].
With radiologic diagnostic testing, as in laboratory medicine [20], diagnostic errors may result from failures related to test ordering before a radiologist is ever involved or in the ordering clinician's use of the results after the radiologist's work is complete. Diagnostic errors attributed to radiologists have been grouped as failures in detection, interpretation, communication of results, or suggestion of an appropriate follow-up test [6].
Despite their high prevalence and serious consequences, diagnostic errors have until recently received relatively little attention. For example, a text search of the 1999 Institute of Medicine (IOM) report To Err Is Human, which focused on the importance of medical error, found the term “diagnostic errors” mentioned only twice, compared with 70 times for “medication errors” [21].
The Cause of Error in Radiology: System-Related Causal Factors and Cognitive-Perceptual Causal Factors
Diagnostic error in internal medicine is commonly multifactorial in origin, typically secondary to a mix of cognitive and system factors [22]. In radiology, cognitive errors (e.g., a missed lung nodule when interpreting a chest radiograph) are usually linked to problems of visual perception (scanning, recognition, interpretation), whereas system errors (e.g., failure to communicate the presence of a nodule to the ordering physician) are usually linked to problems with the health system or the context of care delivery. As with general medical diagnosis, errors often result from a combination of, or interaction between, the two (e.g., overnight preliminary reports by resident radiologists that are revised in the final report but not fully communicated to caregivers) [23]. As described later in this article, certain system factors (e.g., lighting conditions, shift length, pace of reading required) have a profound effect on the likelihood of cognitive diagnostic errors in radiology.
Cognitive Errors in Radiology
The dual-process theory of reasoning has emerged as the dominant theoretic model for cognitive processing during human decision making in real-world settings [24]. This model proposes two general classes of cognitive operations and suggests causal explanations of where and how diagnostic errors occur in clinical reasoning [25]. Early in diagnosis, radiologists must assess the features of an imaging finding for pattern recognition. If the condition is recognized, so-called type 1 (automatic) processes rapidly and effortlessly make the diagnosis, and nothing further may be required. If it is not, linear, analytic, deliberate, and effortful type 2 processes are engaged instead. Dynamic oscillation may occur between the two systems throughout the decision-making process. Certain types of errors are prone to occur when type 1 processes are used because the mental shortcuts (heuristics) underlying this type of cognitive processing are particularly susceptible to human biases [25], as described further later. Errors occurring during type 2 processes are believed to be less frequent in everyday practice but may be no less consequential [25]. These cognitive processes are also affected by internal (e.g., fatigue, stress) and external (e.g., lighting) factors.
Graber et al. [22] showed that cognitive factors contribute to diagnostic error in 74% of cases. Cognitive errors include faulty perception, failed heuristics, and biases; we rely on these shortcuts in reasoning to minimize delay, cost, and anxiety in our clinical decision making. Over the past three decades, the cognitive revolution in psychology has given rise to an extensive literature on cognitive bias in decision making. Cognitive bias is best defined as a replicable pattern of perceptual distortion, inaccurate judgment, and illogical interpretation [26]. Cognitive biases are the result of psychologic distortions in the human mind that persistently lead to the same pattern of poor judgment, often triggered by a particular situation. Some authors suggest that metacognition (thinking about thinking) may enable us to avoid being trapped by these cognitive biases through deliberate type 2 cognitive forcing strategies [27]. Rather than eliminating these cognitive shortcuts, which serve us well most of the time, we might be better served by recognizing the potential diagnostic dangers that arise from specific shortcuts and overriding them when appropriate.
Dozens of cognitive biases have been described [29]. Some of these biases likely play only a small role in radiologic diagnostic error (e.g., certain emotional biases associated with direct patient interaction) [30]. On the basis of a review of the recent literature, we identified five cognitive biases particularly likely to lead to diagnostic errors in radiology (anchoring, framing, availability, search satisficing, and premature closure) and potential metacognitive strategies to reduce them [27].
Anchoring Bias
Anchoring is the tendency to rely on an initial impression and to fail to adjust that impression in light of subsequent information [32]. For example, in a patient with multiple sclerosis who develops a new enhancing brain lesion on MRI, the most appropriate diagnosis might be another demyelinating plaque. A repeat examination a week later showing additional enhancing lesions might then be dismissed as more demyelinating plaques during a multiple sclerosis exacerbation, without the closer inspection that might identify features suggestive of CNS lymphoma. Anchoring is particularly dangerous when combined with confirmation bias, in which the radiologist seeks evidence to support the hypothesis rather than contradictory evidence to refute it [29].
Corrective strategy: Avoid early guesses; seek disconfirming rather than confirmatory information to test the initial diagnosis; when findings are worsening, reconsider the diagnosis or obtain a second opinion.
Framing Bias or Effect
Framing is the tendency to be strongly influenced by subtle ways in which a problem is worded or framed [32]. For example, a radiologist detects multiple foci of abnormal activity in bilateral ribs on a bone scan of a frail elderly patient. If a truncated clinical indication states, “history of weight loss, chest pain,” this finding might be erroneously interpreted as strongly suggesting metastatic lesions. If the full indication, “history of weight loss, chest pain after recent fall down stairs,” were provided, a diagnosis of multiple rib fractures would be made. Because radiologists often rely on abridged clinical details, framing may be a major contributor to diagnostic error in radiology.
Corrective strategy: Perform a masked read before reviewing the clinical indication; seek additional clinical information from treating physicians when image interpretation is tightly coupled with clinical context or when abnormal findings are likely to alter management.
Availability Bias
Availability is the tendency to consider diagnoses more likely when they come readily to mind. For example, a radiologist who has recently missed a lung cancer on a chest radiograph is more likely to overcall suspected lung nodules on subsequent chest radiographs despite their low likelihood. Radiologists should be aware of the tendency to overestimate the frequency of previously missed, unusual, or otherwise memorable cases.
Corrective strategy: Obtain and use objective information to estimate the true base rate of a diagnosis; benchmark diagnostic performance against peers (e.g., for screening mammography, radiologists should enroll in the American College of Radiology National Mammography Data Registry to compare their recall rate, cancer detection rate, and positive predictive value with the established local, regional, and national benchmarks) [33].
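The audit metrics named above follow standard arithmetic definitions, which can be sketched as a minimal example. The function name and the counts below are hypothetical and for illustration only; they are not drawn from any registry data, and real audits use formal BI-RADS definitions.

```python
def screening_audit(total_exams: int, recalls: int, cancers_among_recalled: int):
    """Compute basic screening-mammography audit metrics (illustrative only).

    recall rate           = recalled exams / total screening exams
    cancer detection rate = screen-detected cancers per 1000 exams
    PPV1                  = cancers among recalled patients / recalls
    """
    recall_rate = recalls / total_exams
    cancer_detection_rate = 1000 * cancers_among_recalled / total_exams
    ppv1 = cancers_among_recalled / recalls
    return recall_rate, cancer_detection_rate, ppv1


# Hypothetical counts: 10,000 screens, 900 recalls, 40 screen-detected cancers
rr, cdr, ppv = screening_audit(10_000, 900, 40)
print(f"recall rate {rr:.1%}, CDR {cdr:.1f}/1000, PPV1 {ppv:.1%}")
# prints: recall rate 9.0%, CDR 4.0/1000, PPV1 4.4%
```

Comparing such personal figures against local, regional, and national benchmarks is what allows a radiologist to detect an availability-driven drift, such as a rising recall rate with a flat cancer detection rate.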
Search Satisficing (Satisfaction of Search)
Search satisficing is the tendency to call off a search once one likely diagnosis has been found, thereby missing additional abnormalities. For example, when a brain mass is identified on CT in a patient with headache, a radiologist might miss ethmoid or sphenoid sinus consolidation (especially if the radiologist does not know that the brain tumor diagnosis is old or that the patient has a fever).
Corrective strategy: Use a checklist or algorithmic approach to ensure a systematic search, particularly for “do-not-miss” diagnoses [35]; always commence a secondary search after the first search has been completed; be mindful of known combinations (e.g., multiple foreign bodies, multiple fractures or contusions, or infarction with vascular occlusion).
Premature Closure
Premature closure is the tendency to accept a diagnosis before full verification. For example, in a patient with myasthenia gravis and a homogeneous mediastinal mass seen on chest CT, a diagnosis of thymoma might be made, even though thymic hyperplasia, lymphoma, and germ cell tumors remain on the differential diagnosis. A general limitation of imaging diagnoses is that pathologic diagnoses are inferred and not confirmed until tissue pathology is obtained.
Corrective strategy: Always generate a differential diagnosis (use checklists for common lesion differentials); never convert a working diagnosis to a final diagnosis before full (pathologic) verification.
System-Related Error in Radiology
In internal medicine, system-related factors contribute to diagnostic error in 65% of cases [22]. The vast majority of system-related errors in these cases involve problems with policies and procedures, inefficient processes, teamwork, communication, and technical and equipment failures [22]. From the patient's perspective, factors such as equipment failures and the methods of communicating dangerous radiographic findings to treating clinicians influence the likelihood of diagnostic error. Our focus here, however, is primarily on the factors that affect the likelihood of diagnostic error by the radiology provider. System issues such as lighting conditions, shift length and timing, task repetitiveness, pace of reading images, and environmental distractions all may affect the psychophysical process of visual diagnosis. Many of these issues ultimately exert their effects through visual and mental fatigue for radiologists [36].
Fatigue is treated here as a subcategory of system-related error in radiology because health care providers are constantly required to deliver quality patient care while under the stress of disrupted circadian rhythms. Although many other system issues coexist and contribute to misdiagnosis, we choose fatigue as the primary example for this discussion because it is well studied. As medical reimbursement continues to trend downward, radiologists attempt to compensate by undertaking additional responsibilities and increasing organizational productivity. Increased workload and rising quality expectations, poor communication, cognitive biases, and imperfect information systems are major sources of fatigue and often lead to diagnostic errors [37]. Despite continuous technologic refinement and development, the current medical imaging system has developed as a relatively inflexible, one-size-fits-all model, which can impede workflow and productivity as well as cause end-user fatigue [36]. As imaging volume and complexity continue to grow, the impact of visual fatigue on diagnostic accuracy is becoming increasingly important [38].
Krupinski et al. [39] studied the direct impact of fatigue using fracture detection on skeletal radiographs as the task. Diagnostic accuracy was significantly reduced after a day of work (p < 0.05), with associated increasing myopia. As expected, subjective ratings of physical discomfort, eye strain, and lack of motivation also increased by the end of the workday. Interestingly, residents suffered greater effects of fatigue on all measures than attending radiologists [39]. The effects of visual fatigue seen with static radiographs also appear to apply to cross-sectional imaging examinations, which are displayed dynamically [40]. After a work shift, radiologists show increased variability in their ocular convergence capabilities, indicating increased oculomotor strain and visual fatigue. Detection accuracy for pulmonary nodules on dynamically displayed CT images was reduced in the resident group only, with no significant effect on attending physicians; this difference between residents and attending physicians mirrors the earlier fracture detection study. Accommodative relaxation (shifting the focal point from near to far or vice versa) is effective in reducing visual fatigue, and a radiologist can even become more resistant to visual fatigue by undergoing automated accommodative training [36].
Decision (Mental) Fatigue
Radiologists also experience decision fatigue as a consequence of continuous and prolonged decision making [43]. Decision fatigue is thought to increase later in the day or near the end of a work shift, when cognitive processes respond to mental strain by taking shortcuts, leading to poor judgment and diagnostic errors [37]. Those working prolonged shifts, during off hours, or on high-volume or high-complexity tasks are at greatest risk [43]. One of the most vulnerable populations is radiology residents, who provide preliminary interpretations independently during off hours [43].
Conclusion
Diagnostic errors are underrecognized and underappreciated in radiology practice because of the inability to obtain reliable national estimates of their impact, the difficulty of evaluating the effectiveness of potential interventions, and the poor response to systemwide solutions. Most clinical work is executed through type 1 processes to minimize cost, anxiety, and delay; however, type 1 processes are also vulnerable to error. Instead of trying to completely eliminate cognitive shortcuts that serve us well most of the time, becoming aware of common biases and using metacognitive strategies to mitigate their effects has the potential to produce sustainable reductions in diagnostic error.
For diagnostic errors to receive the resources and attention they deserve in the field of patient safety, multiple approaches are required. First, we need methods to accurately measure diagnostic errors and thereby evaluate the effectiveness of potential interventions. Second, we need to encourage research into the basic science of diagnostic error to better understand why we make mistakes and how we can prevent them. Third, we need to maximize and refine available information technology tools, such as computer-aided detection. Finally, training programs should include instruction in diagnostic reasoning, and board certification organizations need to emphasize key elements of diagnostic accuracy in the certification process. In summary, health information technology, improved education, and increasing acknowledgment of diagnostic errors hold promise for error reduction.