February 2019, Volume 212, Number 2

Health Care Policy and Quality

Original Research

Effect of Clinical Decision Support–Generated Report Cards Versus Real-Time Alerts on Primary Care Provider Guideline Adherence for Low Back Pain Outpatient Lumbar Spine MRI Orders

Affiliations:
1Department of Radiology, Hospital of the University of Pennsylvania, 3400 Spruce St, Silverstein 1, Philadelphia, PA 19104.

2Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, MA.

3Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA.

4Department of Emergency Medicine, Columbia University Vagelos College of Physicians and Surgeons, New York, NY.

5Department of Emergency Medicine, Massachusetts General Hospital, Boston, MA.

6Department of Radiology, Stanford Hospital and Clinic, Stanford University Medical Center, Stanford, CA.

Citation: American Journal of Roentgenology. 2019;212:386–394. DOI: 10.2214/AJR.18.19780

ABSTRACT

OBJECTIVE. The purpose of this study is to determine whether the type of feedback on evidence-based guideline adherence influences adult primary care provider (PCP) lumbar spine (LS) MRI orders for low back pain (LBP).

MATERIALS AND METHODS. Four types of guideline adherence feedback were tested in eight outpatient PCP practices of a tertiary health care system: no feedback during the baseline period (March 1, 2012–October 4, 2012); randomization by practice to either clinical decision support (CDS)–generated report cards comparing providers with their peers or real-time CDS alerts at order entry during intervention 1 (February 6, 2013–December 31, 2013); and both feedback types for all practices during intervention 2 (January 14, 2014–June 20, 2014, and September 4, 2014–January 21, 2015). International Classification of Diseases codes identified LBP visits (excluding Medicare fee-for-service visits). The primary outcome, the likelihood of an LS MRI order being placed on the day of or 1–30 days after the outpatient LBP visit, was adjusted by feedback type (none, report cards only, real-time alerts only, or both); patient age, sex, race, and insurance status; and provider sex and experience.

RESULTS. Half of the PCPs (54/108) remained for all three periods, conducting 107,938 outpatient visits, of which 9394 (8.7%) were for LBP. The proportion of LBP visits increased over the course of the study (p = 0.0001). In multilevel hierarchic regression, report cards resulted in a lower likelihood of LS MRI orders placed the day of and 1–30 days after the visit versus baseline: 38% (p = 0.009) and 37% (p = 0.006) lower for report cards alone, and 27% (p = 0.020) and 27% (p = 0.016) lower with alerts added, respectively. Real-time alerts alone did not affect MRI orders placed the day of (p = 0.585) or 1–30 days after (p = 0.650) the visit. No patient or provider variables were associated with LS MRI orders generated on the day of or 1–30 days after the LBP visit.

CONCLUSION. CDS-generated evidence-based report cards can substantially reduce outpatient PCP LS MRI orders on the day of and 1–30 days after the LBP visit. Real-time CDS alerts do not.

Keywords: clinical decision support, evidence-based guideline adherence feedback, low back pain, lumbar spine MRI

Low back pain (LBP) is a common cause of outpatient physician visits in the United States; more than half of such visits are with primary care providers (PCPs) [1]. Total household medical expenditures among patients with self-reported spine problems increased by 65% between 1997 and 2005, with no corresponding improvement in health status [2]. Advanced imaging with lumbar spine (LS) MRI for patients with nonspecific LBP (i.e., no signs and symptoms of serious underlying disease, radiculopathy, saddle anesthesia, or fever) is inconsistent with high-value care [3]. Randomized controlled studies have found no improvement in outcomes between patients with nonspecific LBP who undergo early imaging versus those who receive conservative treatment without imaging [3]. However, substantial variation persists in diagnostic testing for LBP by provider, specialty, and geography [4, 5].

In 2007, the American College of Physicians (ACP) and American Pain Society (APS) issued an evidence-based guideline to optimize LS MRI use for LBP [3]. This guideline is supported by the American Academy of Family Physicians and by the American Board of Internal Medicine's Choosing Wisely initiative to improve quality of care by reducing unnecessary testing, treatment, and procedures [6]. It also meets seven of eight Institute of Medicine standards for trustworthiness [7]. Imaging recommendations in the ACP-APS guideline are concordant with the American College of Radiology appropriateness criteria for LBP [8], and the guideline also includes a range of pharmacologic and nonpharmacologic therapies for LBP.

Imaging clinical decision support (CDS) combines guideline evidence with pertinent clinical history to provide brief, automated, and actionable feedback that can optimize provider decisions and management [9]. Imaging CDS can be embedded into a computerized physician order entry (CPOE) system to provide multiple forms of feedback, including real-time feedback at order entry and data from which report cards comparing individual provider guideline adherence with that of all providers can be generated. Combined, these two forms of feedback, based on the ACP-APS guideline, reduced potentially inappropriate LS MRI orders among outpatient PCP visits for LBP at a single health care system [10]. Our objective was to determine whether the specific type of guideline adherence feedback influences the likelihood of LS MRI orders placed the day of and up to 30 days after an adult outpatient PCP visit for LBP at a different health system.

Materials and Methods
Study Design Overview

The requirement for informed consent was waived for this HIPAA-compliant cohort study, which was approved by the institutional review board of the University of Pennsylvania. We tested four types of PCP feedback on adherence to the ACP-APS guideline [3] for LS MRI orders placed on the day of an adult outpatient LBP visit, using clinical data entered into the CDS at the time of order entry: no feedback, periodic provider report cards, real-time CDS alerts at the time of CPOE, and both report cards and real-time CDS alerts. Eight outpatient PCP practices were randomized into two groups (group 1 and group 2), and the study was conducted over three periods (Table 1).

TABLE 1: Trial Design, by Study Period, Dates, Provider Group, and Type of Clinical Decision Support (CDS)–Generated Evidence-Based Feedback

In all three periods, at the time of LS MRI order entry, PCPs were directed to a CDS embedded in the CPOE system and were prompted to enter the relevant patient history. On the basis of these data, orders were categorized as guideline adherent, nonadherent, uncertain, or not covered by guidelines. During the baseline period (March 1, 2012–October 4, 2012), the CDS categorized the orders silently, with no feedback. During intervention period 1 (February 6, 2013–December 31, 2013), group 1 PCPs received only periodic report cards on guideline adherence, and group 2 received only real-time CDS alerts on adherence at CPOE. During intervention period 2 (January 14, 2014–June 20, 2014, and September 4, 2014–January 21, 2015), PCPs in both groups received both periodic report cards and real-time CDS alerts.
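For illustration only, the categorization step can be thought of as a deterministic function of the history fields captured at order entry. The following minimal sketch uses hypothetical field names and simplified rules; the study's actual CDS logic, derived from the ACP-APS guideline, is more detailed than shown here.

```python
# Minimal sketch of order categorization; field names and rules below are
# hypothetical simplifications, not the study's actual ACP-APS-based logic.
from dataclasses import dataclass

@dataclass
class LbpHistory:
    weeks_of_pain: float        # symptom duration entered at order entry
    red_flags: bool             # e.g., suspected cancer, infection, fracture
    radiculopathy: bool         # radicular signs or symptoms
    in_guideline_scope: bool    # whether the guideline addresses this scenario

def categorize_order(h: LbpHistory) -> str:
    """Return 'adherent', 'nonadherent', 'uncertain', or 'not covered'."""
    if not h.in_guideline_scope:
        return "not covered"
    if h.red_flags:
        return "adherent"       # imaging supported for serious underlying disease
    if not h.radiculopathy and h.weeks_of_pain < 6:
        return "nonadherent"    # early imaging for nonspecific LBP
    return "uncertain"          # scenarios without a clear-cut recommendation
```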

Setting and Participants

The study was performed in a tertiary academic health system with eight PCP practices conducting approximately 142,000 annual adult outpatient visits (Table S1 can be viewed in the AJR electronic supplement to this article, available at www.ajronline.org). Three PCP practices are nonteaching (without residents or fellows), four are affiliated with the main academic hospital, three are affiliated with a community hospital, and one is part of a multispecialty stand-alone office.

We included all providers authorized to finalize LS MRI orders, including physicians and nurse practitioners. Medicare fee-for-service visits were excluded because these patients participated in an overlapping study [11]. We identified outpatient visits for LBP from among all outpatient visits using International Classification of Diseases, Ninth Revision (ICD-9) codes [10]. Because our electronic health record (EHR) does not rank-order diagnosis codes, we included visits with any ICD-9 code corresponding to LBP. Only two PCPs practiced in more than one location; these providers were assigned to the group in which they spent most of their clinical time. To ensure that the data reflected the full effect of feedback, only providers who remained in the study during all three periods were included in the final analysis.
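Because diagnosis codes were not rank ordered, a visit qualified as an LBP visit if any of its codes matched the LBP code set. A minimal sketch of that matching step follows; the codes shown are a hypothetical subset, not the study's actual list from reference [10].

```python
# Hypothetical subset of ICD-9 codes for LBP; the study's full code list
# (reference [10]) is not reproduced in the article.
LBP_ICD9_CODES = {"724.2", "724.5", "722.10", "846.0", "847.2"}

def is_lbp_visit(visit_codes: list[str]) -> bool:
    # Any matching code qualifies, because the EHR does not rank-order diagnoses.
    return any(code in LBP_ICD9_CODES for code in visit_codes)

# Usage example with assumed visit records:
visits = [
    {"visit_id": 1, "codes": ["724.2", "401.9"]},  # counts as an LBP visit
    {"visit_id": 2, "codes": ["250.00"]},          # does not
]
lbp_visits = [v for v in visits if is_lbp_visit(v["codes"])]
```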

Evidence Base for Feedback

The ACP-APS guideline was the evidence base for both real-time CDS alerts and periodic report cards. We selected this guideline because of its endorsement by the professional societies of the PCPs in the study [6] and its compliance with almost all Institute of Medicine standards of guideline trustworthiness.

Interventions

The CDS algorithm determined whether a study was adherent, nonadherent, uncertain, or not covered by the guideline according to the data entered at the time of LS MRI CPOE (Fig. 1). We included only finalized orders and, in cases of multiple orders on the same day, only the earliest order. Because of systemwide EHR updates, the CDS was nonoperational from October 6, 2012, through February 5, 2013, and from June 21, 2014, through September 3, 2014. Report cards were sent every 4–6 months to group 1 providers during intervention period 1 and to all providers during intervention period 2 (Fig. 2).
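As a sketch of the order-inclusion rule, assuming a table of LS MRI orders with patient_id, order_time, and finalized columns (illustrative names, not the study's actual schema):

```python
import pandas as pd

def earliest_finalized_per_day(orders: pd.DataFrame) -> pd.DataFrame:
    """Keep only finalized orders, and the earliest one per patient per day.
    Assumes order_time is a datetime column."""
    finalized = orders[orders["finalized"]].copy()
    finalized["order_date"] = finalized["order_time"].dt.date
    return (finalized.sort_values("order_time")
                     .groupby(["patient_id", "order_date"], as_index=False)
                     .first())
```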

Fig. 1 — Screenshot of patient history entry within the clinical decision support system.

Fig. 2 — Sample provider report card: letter (A) and graphic (B). L = lumbar, LS = lumbar spine, IRB = institutional review board, NIBIB = National Institute of Biomedical Imaging and Bioengineering, MID = Medicare Imaging Demonstration, NPI = National Provider Identifier.

Report cards compared the number and proportion of LS MRI orders that were adherent, nonadherent, uncertain, or not covered for each individual provider with aggregate all-provider data. To provide clinical context, each provider's report card included the proportion of adherent and nonadherent LS MRI orders relative to all of their LS MRI orders and the proportion of LBP visits with finalized LS MRI orders among all of their visits. Real-time CDS alerts, including a hyperlink to the guideline, were given with every finalized LS MRI order to group 2 providers during intervention period 1 and to all providers during intervention period 2 (Fig. 3). Feedback was not a hard stop; providers could finalize LS MRI orders categorized as nonadherent.
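A hedged sketch of how the report card tables could be assembled from the categorized orders, assuming provider_id and category columns (illustrative names, not the study's actual pipeline):

```python
import pandas as pd

def report_card_tables(orders: pd.DataFrame):
    """Per-provider counts and proportions by CDS category, plus the
    all-provider aggregate used as the normative comparator."""
    counts = (orders.groupby(["provider_id", "category"])
                    .size().unstack(fill_value=0))
    proportions = counts.div(counts.sum(axis=1), axis=0)
    aggregate = counts.sum()                      # all-provider totals
    aggregate_proportion = aggregate / aggregate.sum()
    return counts, proportions, aggregate_proportion
```

The clinical-context denominators described above (finalized LS MRI orders among all LBP visits) would be computed analogously against the visit table.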

Fig. 3 — Screenshot of real-time clinical decision support system alert determination based on American College of Physicians and American Pain Society guidelines. DSS = decision support system.

Outcomes

The primary outcome was the likelihood of a finalized LS MRI order placed by the PCP on the day of an outpatient LBP visit, adjusted by feedback type (none, report card only, real-time CDS alerts only, or both report card and CDS alert); patient age, sex, race, and insurance status; and provider sex and experience. To account for potential increases in delayed LS MRI orders due to the interventions, we also evaluated the likelihood of a finalized LS MRI order placed by any provider within our health system 1–30 days after the PCP visit, adjusted for the same factors. Although we captured data on guideline adherence, adherence itself was not the objective of our study.
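In data terms, each LBP visit received two binary outcomes: a same-day finalized-order flag and a 1–30-day flag. A minimal sketch under assumed column names (visit_id, patient_id, visit_date, order_date):

```python
import pandas as pd

def flag_outcomes(visits: pd.DataFrame, orders: pd.DataFrame) -> pd.DataFrame:
    """Return visit-level same_day and days_1_30 indicator columns."""
    merged = visits.merge(orders[["patient_id", "order_date"]],
                          on="patient_id", how="left")
    delta = (merged["order_date"] - merged["visit_date"]).dt.days
    merged["same_day"] = delta.eq(0)
    merged["days_1_30"] = delta.between(1, 30)
    # A visit is positive for an outcome if any order falls in the window.
    flags = (merged.groupby("visit_id")[["same_day", "days_1_30"]]
                   .any().reset_index())
    return visits.merge(flags, on="visit_id", how="left")
```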

Covariates

Patient sex, age (> 60 or ≤ 60 years), race (white, black, or other), and health insurance information were obtained from the EHR. Medicare patients without primary fee-for-service coverage and Medicaid patients were collapsed into one category: government. The commercial category included private insurance, and uninsured patients were included in the other category. Provider sex and years of experience were obtained from our health system Department of Medical Affairs.
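A compact sketch of the covariate recoding described above; the raw insurance labels are hypothetical EHR values chosen for illustration:

```python
def recode_insurance(raw: str) -> str:
    # Medicare without primary fee-for-service coverage and Medicaid were
    # collapsed into "government"; private insurance became "commercial";
    # uninsured patients fell into "other". Raw labels here are assumed.
    mapping = {"medicare_non_ffs": "government",
               "medicaid": "government",
               "private": "commercial",
               "uninsured": "other"}
    return mapping.get(raw, "other")

def recode_age(age_years: int) -> str:
    # Age was dichotomized at 60 years.
    return "> 60" if age_years > 60 else "<= 60"
```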

Statistical Analysis

Assuming that 4% of the 4000 monthly outpatient visits across all practices would be for LBP (i.e., 160 LBP visits/month), a minimum of 8 months was required for each study interval to provide 80% power to detect a difference in LS MRI orders between 4% and 6.5% [10]. Differences in experience between included and excluded providers were analyzed using a t test assuming unequal sample variance, and differences in the proportion of included and excluded male providers were analyzed using Pearson chi-square analysis. Univariate analysis with Pearson chi-square analysis was also used to test for differences in the proportion of outpatient LBP visits with finalized LS MRI orders made the same day and those made 1–30 days after the visit, by feedback type and by patient age, sex, race, and payor. A three-level hierarchic regression was performed to model the likelihood that an outpatient LBP visit was associated with a finalized LS MRI order made the day of or 1–30 days after the visit, adjusted for feedback type; patient age, sex, race, and insurance; and provider sex and experience. A random intercept was used for providers and for hospital affiliation, with providers nested within practices. Specialty was highly correlated with hospital affiliation (i.e., family practice providers were more likely to practice in the community than were internal medicine providers), as was teaching status (i.e., practices affiliated with the main hospital were more likely to be teaching practices); therefore, specialty and teaching status were not included in the model. Practice type characteristics (e.g., teaching or nonteaching) were likewise not included in the regression because of the sample size. All tests were two-sided with a significance level of α = 0.05. All analyses were performed using JMP Pro (version 12.0, SAS Institute) and R (version 12.0, R Foundation for Statistical Computing).
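The authors performed the analysis in JMP Pro and R; as a rough, non-authoritative analogue only, the power calculation and a random-intercept logistic model could be sketched in Python with statsmodels as follows. Variable names are assumptions, and the Bayesian mixed GLM shown is an approximation of, not a match for, the authors' three-level model.

```python
# Sketch of the reported power calculation (4% vs 6.5% MRI-order proportions,
# two-sided alpha = 0.05, power = 0.80).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.065, 0.04)
n_per_group = NormalIndPower().solve_power(effect_size=effect,
                                           power=0.80, alpha=0.05)
# At ~160 LBP visits/month, n_per_group translates into the minimum number
# of months required for each study interval.

# Hedged sketch of a logistic model with random intercepts for providers
# nested within practices; column names are assumptions.
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# model = BinomialBayesMixedGLM.from_formula(
#     "mri_same_day ~ feedback + age_gt60 + sex + race + insurance"
#     " + provider_sex + provider_experience",
#     vc_formulas={"practice": "0 + C(practice)",
#                  "provider": "0 + C(provider)"},
#     data=visit_level_data)
# result = model.fit_vb()
```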

Results
Providers and Visits

Fifty-four of 108 PCPs (50.0%; 28 providers in group 1 and 26 providers in group 2) remained in the study for all three periods. Provider details by study period are in Table S1. The mean (± SD) number of years in practice was higher for included than excluded providers (18.7 ± 9.2 vs 13.1 ± 11.6 years; p = 0.007). There was no significant difference in the proportion of included and excluded male providers (27/54 [50.0%] vs 18/54 [33.3%]; p = 0.08).

These 54 providers conducted 107,938 outpatient visits for 39,284 unique patients; 9394 visits (8.7%) by 5398 patients were for LBP. The proportion of LBP visits among all visits varied by period but increased during the study, from 8.2% (2513/30,580) during baseline to 9.1% (3760/41,523) during intervention 1 and 8.7% (3121/35,835) during intervention 2 (p = 0.0001 overall; p = 0.024 baseline vs intervention 2). Demographic data for patients with LBP visits were similar to those for patients with non-LBP visits (Table S2 can be viewed in the AJR electronic supplement to this article, available at www.ajronline.org). A higher proportion of LBP visits was observed among black patients (5147/9394; 54.7%) than among white patients (3490/9394; 37.1%) or patients of other races (757/9394; 8.1%). The proportion of LBP visits among all PCP patient visits varied by provider from 3.1% to 16.8% throughout the study (Fig. 4).

Fig. 4 — Distribution of low back pain (LBP) visits among all primary care provider visits throughout the trial.

Lumbar Spine MRI Ordering

Univariate analysis revealed significant differences by type of evidence-based guideline adherence feedback in the proportion of LS MRI orders placed on the day of the LBP visit (p = 0.006) (Table 2). A lower proportion of orders was observed after receipt of CDS-generated periodic report cards only (2.9%) or both report cards and real-time CDS alerts at order entry (3.6%) than at baseline (4.7%), when no feedback was received. No difference in LS MRI orders was found between baseline and real-time CDS alerts only (4.7%). A similar pattern was found in the proportion of LS MRI orders placed 1–30 days after the LBP visit by feedback type (p = 0.003): a lower proportion of LS MRI orders was placed by providers who received CDS-generated periodic report cards only (3.4%) or both report cards and real-time CDS alerts (4.1%) than at baseline (5.4%), with no difference between real-time CDS alerts only (5.3%) and baseline. The proportion of LS MRI orders was also significantly lower among black patients than among white patients or patients of other races, both on the day of the LBP visit (p < 0.0001) and 1–30 days after the visit (p < 0.0001). LS MRI orders were less frequent among women 1–30 days after the visit (p = 0.033) and among patients with government insurance relative to those with commercial or other insurance, both on the day of (p = 0.009) and 1–30 days after (p = 0.008) the LBP visit.

TABLE 2: Proportion of Lumbar Spine MRI Orders Made the Day of and 1–30 Days After Low Back Pain (LBP) Visit Among All LBP Visits, by Feedback Type and Patient Demographics

On multilevel hierarchic regression analysis, holding all factors equal, there was a 38.0% lower likelihood that an LS MRI would be ordered on the day of the LBP visit by providers who received CDS-generated periodic report cards only relative to no feedback during baseline (p = 0.009), and a 27% lower likelihood after receiving both CDS-generated periodic report cards and real-time CDS alerts (p = 0.020) (Table 3). Real-time CDS alerts alone were not associated with any change in day-of LS MRI orders relative to baseline (p = 0.585). No patient or provider variables were associated with LS MRI orders on the day of the LBP visit.

TABLE 3: Multivariate Analysis of Likelihood of Lumbar Spine MRI Order Made on Day of Low Back Pain Visit, by Feedback Type and Patient and Provider Demographics

The same findings persisted up to 30 days after the LBP visit (Table 4). Holding all factors equal, there was a 37.0% lower likelihood that an LS MRI order would be placed 1–30 days after an LBP visit by providers who received CDS-generated periodic report cards only relative to no feedback during baseline (p = 0.006) and a 27.0% lower likelihood after receiving both CDS-generated periodic report cards and real-time CDS alerts (p = 0.016). There was no associated change in orders made 1–30 days after the visit with real-time CDS alerts only, relative to baseline (p = 0.650). No patient or provider variables were associated with LS MRI orders made 1–30 days after the LBP visit.

TABLE 4: Multivariate Analysis of Likelihood of MRI Order 1–30 Days After Low Back Pain Visit, by Feedback Type and Patient and Provider Demographics
Discussion

High-value, cost-conscious care must decrease or eliminate care that provides no benefit and may even be harmful [12]; this includes advanced imaging with LS MRI for patients with nonspecific LBP [13]. Research on physician education suggests that learning is promoted by three factors: specific knowledge transmission (e.g., just-in-time scientific evidence), reflective practice (e.g., provider feedback), and a supportive environment [14]. We found that, despite an increase in the proportion of adult outpatient PCP LBP visits over nearly 3 years, periodic provider report cards on evidence-based guideline adherence, generated from data entered into a CDS system, decreased the likelihood of LS MRI orders placed on the day of and 1–30 days after the LBP visit compared with no feedback; this was true whether report cards were used in isolation or combined with real-time CDS alerts at the time of CPOE. In contrast, isolated real-time CDS alerts on guideline adherence did not affect LS MRI orders.

The proportion of LS MRI orders in our study placed on the day of the LBP visit before and after guideline adherence feedback is concordant with that reported by Ip et al. [10], who tested a multifaceted intervention using the same ACP-APS guideline; this supports the generalizability of our results. In contrast to Ip et al., our study was designed to elucidate the effect of four discrete forms of guideline adherence feedback to inform future initiatives to reduce unnecessary LS MRI use. Our finding that the type of feedback affected adult PCP LS MRI orders for LBP may reflect differences in the content and frequency of CDS-generated report cards as opposed to real-time CDS alerts. Report card effectiveness is enhanced when normative data highlighting desirable practice, such as adherence to guidelines, are provided, as in our study [15–17]. Our report cards also included tailored feedback for each provider on the proportion of LS MRI orders among all their LBP visits and the proportion of adherent and nonadherent orders relative to all LS MRI orders. Thus, the report cards contained both normative and clinical data on guideline adherence within the context of all LBP visits, providing a more sustained educational effect than real-time CDS alerts focused on the adherence of isolated LS MRI orders. Report cards were delivered every 4–6 months. In contrast, alert fatigue from pop-up boxes at the time of each order entry may have led PCPs to ignore imaging order feedback, as has been shown for nonimaging CDS feedback [18].

CDS workflow also likely contributed to our findings. Frustration with switching between the CDS software, where feedback was given, and the EHR, where orders were placed, may have hampered PCPs wishing to change orders on the basis of feedback. A lack of perceptible real-time consequences for ignoring recommendations presented in the CDS alerts, such as peer-to-peer consultation [10] or a hard stop [19], could also have contributed to the observed lack of effect [20]. Our results suggest that real-time CDS alerts do not promote high-value, cost-conscious health care and that turning off real-time alerts does not influence PCP guideline adherence and may help reduce alert fatigue. Conversely, detailed clinical data captured during CDS interactions can be used to create high-quality evidence-based report cards, including data on all relevant clinic visits, to improve PCP guideline adherence and reduce unnecessary high-cost imaging.

Our results are discordant with those of the Medicare Imaging Demonstration study [11], which found no significant utilization reduction for 12 advanced imaging studies, including LS MRI for LBP, after evidence-based provider feedback. In contrast, we observed a lower likelihood of LS MRI orders among providers who received CDS-generated periodic report cards with or without real-time CDS alerts. The difference in results likely reflects our choice of evidence and the content of our report cards. Medicare Imaging Demonstration feedback consolidated appropriateness scores for each provider across nearly 80 practice guidelines, which mainly originated from imaging societies. Our feedback was derived from a single guideline endorsed by the professional societies of the providers in the intervention and meeting seven of eight standards for trustworthiness, a key attribute of the clinical validity of a practice guideline [7]. As already described, our report cards also included tailored feedback for each provider on the proportion of LS MRI orders relative to all LBP visits and the proportion of adherent orders relative to all LS MRI orders; these data were not included in the Medicare Imaging Demonstration report cards. Thus, the report cards in our study provided discrete, easily understood, and individualized feedback on adherence to trustworthy evidence-based guidelines, helping providers identify potential opportunities for practice improvement.

Our study had several limitations. Our health care system participated in the Medicare Imaging Demonstration study [11], which overlapped with this study's baseline period. It is unlikely that this confounded our results, given the absence of significant utilization reductions in advanced imaging in that study. Lack of a concurrent control during intervention 2, when national initiatives such as Choosing Wisely were underway, limits our ability to assess for secular trends; however, there are no reports of a national reduction in LS MRI utilization during this period. Clinical data entered into the CPOE system by PCPs may not always have been correct, although such inaccuracy has not been shown for other clinical indications, such as pulmonary embolus [21]. Despite differences in mean years in practice between included and excluded providers, our results are similar to those reported at a different institution, supporting the generalizability of our findings. Finally, there were interruptions in the study when the CDS was unavailable because of EHR upgrades; this reflects real-life challenges to implementing CDS and is unlikely to affect our conclusions.

In conclusion, our results suggest that health information technology tools can promote high-value, cost-conscious health care through high-quality provider report cards that present evidence-based diagnostic imaging guideline adherence to PCPs, based on data entered into a CDS system, within the context of relevant clinic visits. Our data further suggest that turning off real-time CDS alerts does not influence PCP guideline adherence. The effect of report cards was likely optimized by their content, their frequency, and the trustworthiness of the evidence source. The effect of real-time CDS alerts may have been diminished by their singular focus on order appropriateness, the separation of CDS feedback from order entry, the lack of real-time perceptible consequences for ignoring the recommendations presented, and alert fatigue.

Supported by grant UC4-EB012952-01 from the National Institute of Biomedical Imaging and Bioengineering.

Acknowledgments

We thank Laura E. Peterson for her help with editing and formatting this manuscript and Gina Redfern, Marie Hegarty, and Keith Maston for their assistance in retrieving and formatting the data.

References
1. Hart LG, Deyo RA, Cherkin DC. Physician office visits for low back pain: frequency, clinical evaluation, and treatment patterns from a U.S. national survey. Spine 1995; 20:11–19
2. Martin BI, Deyo RA, Mirza SK, et al. Expenditures and health status among adults with back and neck problems. JAMA 2008; 299:656–664
3. Chou R, Qaseem A, Snow V, et al. Diagnosis and treatment of low back pain: a joint clinical practice guideline from the American College of Physicians and the American Pain Society. Ann Intern Med 2007; 147:478–491
4. Cherkin DC, Deyo RA, Wheeler K, Ciol MA. Physician variation in diagnostic testing for low back pain: who you see is what you get. Arthritis Rheum 1994; 37:15–22
5. Ip IK, Raja AS, Seltzer SE, Gawande AA, Joynt KE, Khorasani R. Use of public data to target variation in providers' use of CT and MR imaging among Medicare beneficiaries. Radiology 2015; 275:718–724
6. American College of Physicians. Five things physicians and patients should question. American College of Physicians website. www.choosingwisely.org/societies/american-college-of-physicians/. Published 2012. Accessed January 22, 2017
7. Ransohoff DF, Pignone M, Sox HC. How to decide whether a clinical practice guideline is trustworthy. JAMA 2013; 309:139–140
8. Patel ND, Broderick DF, Burns J, et al. American College of Radiology appropriateness criteria: low back pain. American College of Radiology website. acsearch.acr.org/docs/69483/Narrative/. Published 1996. Updated 2015. Accessed April 4, 2018
9. Zafar HM, Mills AM, Khorasani R, Langlotz CP. Clinical decision support for imaging in the era of the Patient Protection and Affordable Care Act. J Am Coll Radiol 2012; 9:907.e5–918.e5
10. Ip IK, Gershanik EF, Schneider LI, et al. Impact of IT-enabled intervention on MRI use for back pain. Am J Med 2014; 127:512.e1–518.e1
11. Timbie JW, Hussey PS, Burgett LF, et al. Medicare Imaging Demonstration final evaluation: report to Congress. RAND Corporation website. www.rand.org/pubs/research_reports/RR706.html. Published 2014. Accessed February 21, 2017
12. Owens DK, Qaseem A, Chou R, Shekelle P; Clinical Guidelines Committee of the American College of Physicians. High-value, cost-conscious health care: concepts for clinicians to evaluate the benefits, harms, and costs of medical interventions. Ann Intern Med 2011; 154:174–180
13. Chou R, Qaseem A, Owens DK, Shekelle P; Clinical Guidelines Committee of the American College of Physicians. Diagnostic imaging for low back pain: advice for high-value health care from the American College of Physicians. Ann Intern Med 2011; 154:181–189
14. Stammen LA, Stalmeijer RE, Paternotte E, et al. Training physicians to provide high-value, cost-conscious care: a systematic review. JAMA 2015; 314:2384–2400
15. Liao JM, Fleisher LA, Navathe AS. Increasing the value of social comparisons of physician performance using norms. JAMA 2016; 316:1151–1152
16. Raja AS, Ip IK, Dunne RM, Schuur JD, Mills AM, Khorasani R. Effects of performance feedback reports on adherence to evidence-based guidelines in use of CT for evaluation of pulmonary embolism in the emergency department: a randomized trial. AJR 2015; 205:936–940
17. Navathe AS, Emanuel EJ. Physician peer comparisons as a nonfinancial strategy to improve the value of care. JAMA 2016; 316:1759–1760
18. Ancker JS, Edwards A, Nosal S, et al. Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system. BMC Med Inform Decis Mak 2017; 17:36
19. Blackmore CC, Mecklenburg RS, Kaplan GS. Effectiveness of clinical decision support in controlling inappropriate imaging. J Am Coll Radiol 2011; 8:19–25
20. Khorasani R, Hentel K, Darer J, et al. Ten commandments for effective clinical decision support for imaging: enabling evidence-based practice to improve quality and reduce waste. AJR 2014; 203:945–951
21. Gupta A, Raja AS, Khorasani R. Examining clinical decision support integrity: is clinician self-reported data entry accurate? J Am Med Inform Assoc 2014; 21:23–26
FOR YOUR INFORMATION

A data supplement for this article can be viewed in the online version of the article at: www.ajronline.org.

Address correspondence to H. M. Zafar.

R. Khorasani was a consultant for the Medicalis Corporation when the study was performed. Medicalis did not fund this study and did not view the results or the manuscript. C. P. Langlotz was a founder and shareholder of Montage Healthcare when the study was performed; he is currently a shareholder of and advisor to whiterabbit.ai, Nines.ai, and Galileo Medical, serves without compensation on a Nuance Communications physician advisory board, and serves on the board of directors of the Radiological Society of North America.
