Applying an Organizational Framework for Health Information Technology to Alerts

Colene M. Byrne, PhD, Eric C. Pan, MD, MSc, Cynthia Russell, RN, MSN, Scott Finley, MD, MPH, and Helga E. Rippen, MD, PhD, MPH, FACPM


Abstract

We are far from understanding how best to design, implement, and use health information technology (IT). A comprehensive framework, developed by Rippen et al.1 to capture and organize knowledge on the implementation, use, and optimization of health IT, may guide and inform more effective health IT deployment. This study applied Rippen's framework to a focused type of health IT – alerts delivered through clinical decision support (CDS) – an area with a substantial evidence base around many facets of implementation, including the technology, its use, and outcomes. We report results from applying this framework to capturing, organizing, and standardizing knowledge and related measures around alerts. It is clear that there are gaps in the information shared and that measures vary significantly across studies. Insights identified using the framework highlight areas for further study and development, directed toward a shared conceptualization and representation of knowledge and, ultimately, a more comprehensive and deeper understanding of health IT.

Introduction

Knowledge capture in any field is improved by measurement based on standard concepts, terminologies, and taxonomies. This applies to health IT, which is complex and poorly understood.2–4 Kaushal conducted a systematic review of health IT and concluded that uniform standards for the reporting of research on health IT implementation are a high priority.5 The authors of this study strongly concur and, in this spirit, have leveraged the Organizational Framework developed by Rippen et al.1 to continue moving toward the goal of uniform standards for reporting health IT research.

A framework can serve as a basis for organizing and reporting research on the implementation of health IT, and can reduce heterogeneity in reporting by ensuring that all important 'facets' (a term the framework developers used to capture important aspects of health IT) and their key components (e.g., characteristics and dimensions) are identified using common terminology, concepts, and measures. The authors applied Rippen's organizational framework to studies relating to a specific type of CDS, alerts, to assess what was being described within the literature relating to facets, components, and measures. The purpose of this analysis is to pilot the use of the organizational framework by: (1) applying the framework to a specific health IT application, alerts; (2) identifying gaps in the literature around a specific component; and (3) exploring the measures used to describe the facets relevant to the application.

Background

The organizational framework developed by Rippen et al. is based on existing health IT and related theories and models, including technology diffusion, change management, and sociotechnical theories.1 The framework recognizes that health IT actions such as alerts are "sociotechnical" interactions between the IT and the user, embedded in the organization's existing social and technical systems, such as its workflows, culture, and social interactions.6–8 By cross-walking elements of current theories and models, Rippen identified five major, interrelated facets of an organizational framework that provide a structure to organize and capture information on the implementation and use of health IT. These are: (1) Technology: elements relevant to the specific health IT itself; (2) Use: elements related to the actual use of the technology; (3) Environment: elements related to the context in which the technology is used; (4) Outcomes: elements capturing the end results of the technology in use in its environment; and (5) Temporality: time and the developmental trajectory of other elements, such as implementation and clinical disease processes.

Alerts are CDS tools that provide health professionals and patients with general and person-specific information, intelligently filtered and organized, at appropriate times in decision-making, to enhance health and health care. Drug-allergy checking and alerting is one of the simplest yet most important CDS tools used in electronic order entry systems.9 Alerts supporting the delivery of preventive services are another CDS application that enables action.10 Because alerts are one of the most recognizable and common types of CDS, it is important to understand this health IT application more fully and to assess the state of knowledge for all of its facets.

Methods

The authors selected a convenience sample of 17 published studies on alerts from peer-reviewed journals in order to apply the Rippen organizational framework to the alert study descriptions and findings.2,3,5,9–22 These 17 papers include original research and review papers (systematic reviews and meta-analyses) for the most common types of alerts: (1) basic medication alerts such as drug-drug interaction (DDI), drug-allergy interaction, and drug duplication alerts; (2) basic medication order guidance such as advanced drug alerts, drug-laboratory, drug-condition, drug-disease, drug-age, and appropriate prescribing alerts; (3) drug formulary and dosing alerts; (4) condition-specific dosing guidelines (e.g., renal function); and (5) preventive services alerts or reminders, in both inpatient and outpatient settings. All of these studies were based on health IT applications, such as order entry systems, that used active alerts.

Each paper was reviewed by one member of a team of four health IT researchers, and important study descriptions and findings were mapped to the appropriate framework facets and associated characteristics from the original Rippen paper (for example, type of outcome or feature of technology). Themes for measures were identified, along with examples of the measures used in each study. One paper was initially selected by the authors for internal discussion to ensure agreement on definitions and application of the framework. Facets, characteristics, themes, and examples were identified, captured, and conceptualized, and standardized terminology was assigned as needed.

Results

The results of the review, summarized by facet in Tables 1–5, list the facet categories, related characteristics, themes, and the measures used to describe them in the literature. The technology facet, highlighted in Table 1, has six categories associated with it: cost, data and interoperability, functionality, non-functional requirements, product, and user-based design. For alerts, data is a critical component because it has a direct effect on the validity of the alerts presented to the clinician, and hence on patient outcomes. Not surprisingly, there are many types and sources of data, such as knowledge-database-derived data, patient-specific data, system-generated data (e.g., alert, reminder, order set), and clinician-entered data (e.g., the order, an override). Attributes of the data, such as currency and validity, are especially important but not often discussed. Data from other systems are also important but often not available (e.g., laboratory or diagnosis data). In general, details about the data itself were often not included in these papers. Non-functional requirements were not mentioned in any of the studies (shaded in grey).


Table 2 presents the use facet organized by the associated categories of user characteristics and attitudes, knowledge, ownership/buy-in, and usability and workflow. The category 'user attitudes' in the Organizational Framework1 was expanded to 'user characteristics and attitudes' to clearly identify the concept of the user, which was described in each study and is an important concept when addressing use. The organizational framework category of knowledge was not mentioned as a characteristic in any of the studies reviewed (shaded in grey).


Environmental factors have been shown to be important to successful implementation of health IT systems.23 In the papers reviewed, attributes relating to this facet were rarely identified or discussed, with the exception of setting, as highlighted in Table 3. The papers were also scant on measures related to leadership (shaded in grey).


Table 4 summarizes the information around outcomes, organized by the categories of adoption, business/financial, clinical, and methodology. There were a large number of themes and measures around adoption and use, and this variation makes it more difficult to compare studies. Clinical outcomes were usually not associated with adoption and user characteristic measures such as user acceptance and level of alert use. Many adoption studies focused on clinician acceptance of and compliance with alerts as a function of the number and type of alerts, the interface, and the ability to control alerts; the most comprehensive studies provided information on the technology and use facet categories to aid in interpreting outcomes.
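
To make these adoption measures concrete, below is a minimal sketch, in Python, of how an acceptance rate and an override rate might be computed from an alert log. The log structure, field names, and response categories are illustrative assumptions and are not drawn from the reviewed studies.

    # Illustrative sketch only: computing two common adoption measures,
    # acceptance rate and override rate, from a hypothetical alert log.
    # The log structure, field names, and response categories are assumptions,
    # not data or methods from the reviewed studies.
    from collections import Counter

    alert_log = [
        # (alert_type, clinician_response)
        ("drug-drug interaction", "accepted"),
        ("drug-drug interaction", "overridden"),
        ("drug-allergy", "accepted"),
        ("duplicate order", "overridden"),
        ("drug-allergy", "overridden"),
    ]

    def adoption_measures(log):
        """Return overall acceptance/override rates and per-alert-type acceptance."""
        total = len(log)
        responses = Counter(resp for _, resp in log)
        overall = {
            "alerts_fired": total,
            "acceptance_rate": responses["accepted"] / total,
            "override_rate": responses["overridden"] / total,
        }
        by_type = {}
        for alert_type in sorted({t for t, _ in log}):
            subset = [resp for t, resp in log if t == alert_type]
            by_type[alert_type] = sum(r == "accepted" for r in subset) / len(subset)
        return overall, by_type

    overall, by_type = adoption_measures(alert_log)
    print(overall)   # {'alerts_fired': 5, 'acceptance_rate': 0.4, 'override_rate': 0.6}
    print(by_type)   # per-alert-type acceptance rates, e.g. {'drug-allergy': 0.5, ...}

Measures of this general form underlie the comparisons reported in Table 4, such as acceptance by alert level, severity, or tier.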


Table 5 presents the temporality facet and its associated subcategories of time, implementation cycle, and outcome lifecycle, as extracted from the seventeen studies reviewed using Rippen's organizational framework for health information technology.1 The measure for study duration or time was at the month/year or year level; however, this information was often missing in the papers reviewed.

In the temporality facet, measures used include objective time (e.g., dates and durations), subjective time (e.g., an enhancement period), mapping of subjective time to objective time (e.g., the enhancement period started on <date>), and change in measures from other facets over time. Not surprisingly, there is good agreement in the use of standard time units in describing objective time, with the only variation being in the precision or granularity of the time units. Subjective time is not used to any significant degree in the sample studies. This may reflect the relative "youth" of the CDS subject domain, which has yet to develop a commonly accepted terminology for implementation phases. The immaturity of the CDS domain is also reflected in the lack of outcome lifecycle measures, largely due to the scarcity of longitudinal studies that would associate and define outcomes over time.

Discussion

The Rippen framework facilitated identifying and categorizing research findings and context about alerts. Although all of the facets were touched upon, technology, use, and outcomes were most often addressed; the facets not generally addressed in the studies were environment and temporality. When reviewing each facet, it is clear that the components addressed varied across facets, as did the measures used to describe or quantify them.

This analysis accomplished its threefold purpose:

  1. Application of the organizational framework to alerts. Rippen's organizational framework provided a robust structure for capturing the concepts, themes, and measures presented in the papers reviewed. No key characteristics or measures were identified that were not addressed by the framework. For clarity, the framework category 'user attitude' was changed to 'user characteristics and attitude' for this paper.
  2. Identify gaps in the literature around alerts. Health IT implementation is complex, in part because it is difficult to adequately "control" for the context in which an application is implemented. Researchers often focus on the details of the technology they are studying, and not surprisingly, the most detail in the studies was around the technology facet, specifically the application itself. Even in this facet, however, there were gaps. Non-functional requirements were not addressed, despite the fact that they can have a significant effect on the outcome. For example, whether the delay between when an order is entered and when the alert is seen is 300 milliseconds or 15 seconds matters for whether and how the alert is used. Failure to provide a robust description of the context and technology is a significant barrier to understanding the field. Furthermore, sparse reporting of user and environment characteristics, such as user acceptance, training, clinical leadership, and the presence of quality improvement initiatives in the settings, also hindered an in-depth understanding of alert adoption and related outcomes.
  3. Explore the granularities and consistencies of measure definitions for alerts. One of the most compelling findings was the lack of clear, consistent measures across the studies. This includes the absence of measures that account for alert "burden." For example, a physician who prescribes drug A to 90% of his or her patients will have a very different experience of "nuisance alerts" for drug A than a physician who prescribes it less than 1% of the time (a numeric sketch follows this list).
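
To illustrate the alert burden point in item 3, the following minimal Python sketch uses entirely hypothetical numbers to show how the same alert rule can expose two physicians to very different volumes of nuisance alerts, depending on how often each prescribes the triggering drug.

    # Illustrative sketch with entirely hypothetical numbers: the same alert rule
    # produces a very different "nuisance alert" burden for two physicians who
    # prescribe the triggering drug at different rates.
    def expected_nuisance_alerts(orders_per_year, share_drug_a, nuisance_fraction):
        """Expected yearly nuisance alerts seen by one physician.

        orders_per_year   -- total medication orders written per year (assumed)
        share_drug_a      -- fraction of those orders that are for drug A (assumed)
        nuisance_fraction -- fraction of drug A alerts judged clinically irrelevant (assumed)
        """
        return orders_per_year * share_drug_a * nuisance_fraction

    # Physician 1 prescribes drug A for 90% of orders; physician 2 for 1% of orders.
    print(expected_nuisance_alerts(2000, 0.90, 0.8))  # 1440.0 nuisance alerts/year
    print(expected_nuisance_alerts(2000, 0.01, 0.8))  # 16.0 nuisance alerts/year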

Focusing on a specific health IT application – CDS alerts – and including more than one type of alert (medication-related and preventive) provided additional insights. For example, comparing a medication-related alert with a preventive service order recommendation highlights how user acceptance of the alert recommendation, in conjunction with workflow (how easy it is to act on the alert), shapes the likelihood of use.

Without a common framework to provide context, meaning, and measures for describing implementations, our ability to build on different studies will be limited. If the same technology is implemented in similar settings, how will it be known why one implementation fails and another succeeds? Broader adoption of the framework would encourage more standard, comparable, and comprehensive collection and reporting of important implementation information.

Case studies and more focused literature reviews applying the framework to specific health IT implementation projects will help to identify the benefits and limitations of the framework more clearly, so that it can be applied more effectively to real-life health IT projects and to studies of their implementations.

Conclusion

The framework provides a strong starting point for identifying the important facets and characteristics associated with the implementation of alerts. The available studies did not provide detailed information on many important implementation facets and characteristics. As health IT applications become more widespread, future studies, guided by a framework such as Rippen's, can and should examine more facets in greater depth. Beyond the technology facet, which is generally well reported, other facets and characteristics – the environment, setting, culture, intended users, and so on – can be better identified and investigated. Only by examining the individual facets and characteristics in detail will it be possible to fully understand which of them are critical success factors (or critical impediments) and how to optimize those success factors (or avoid those impediments).

Article information

AMIA Annu Symp Proc. 2012; 2012: 67–76.
Published online 2012 Nov 3.
PMCID: PMC3540480
PMID: 23304274
Colene M. Byrne, PhD,1 Eric C. Pan, MD, MSc,1 Cynthia Russell, RN, MSN,1 Scott Finley, MD, MPH,1 and Helga E. Rippen, MD, PhD, MPH, FACPM1
1Westat, Rockville, MD
This is an Open Access article: verbatim copying and redistribution of this article are permitted in all media for any purpose

REFERENCES

1. Rippen HE, Pan EC, Russell C, Byrne CM, Swift EK. Organizational framework for health information technology. Int J Med Inform. 2012.
2. Kaushal R, Shojania KG, Bates DW. Effects of computerized physician order entry and clinical decision support systems on medication safety: a systematic review. Arch Intern Med. 2003;163:1409–1416.
3. Kawamoto K, Lobach DF. Clinical decision support provided within physician order entry systems: a systematic review of features effective for changing clinician behavior. AMIA Annu Symp Proc. 2003:361–365.
4. Berner ES. Clinical decision support systems: state of the art. 2009 Jun. AHRQ Publication No. 09-0069-EF.
5. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330:765.
6. Ash JS, Sittig DF, Wright A, McMullen C, Shapiro M, Bunce A, et al. Clinical decision support in small community practice settings: a case study. J Am Med Inform Assoc. 2011;18:879–882.
7. Goldstein MK, Coleman RW, Tu SW, Shankar RD, O'Connor MJ, Musen MA, et al. Translating research into practice: organizational issues in implementing automated decision support for hypertension in three medical centers. J Am Med Inform Assoc. 2004;11:368–376.
8. Harrison MI, Koppel R, Bar-Lev S. Unintended consequences of information technologies in health care: an interactive sociotechnical analysis. J Am Med Inform Assoc. 2007;14:542–549.
9. Schedlbauer A, Prasad V, Mulvaney C, Phansalkar S, Stanton W, Bates DW, et al. What evidence supports the use of computerized alerts and prompts to improve clinicians' prescribing behavior? J Am Med Inform Assoc. 2009;16:531–538.
10. Balas EA, Weingarten S, Garb CT, Blumenthal D, Boren SA, Brown GD. Improving preventive care by prompting physicians. Arch Intern Med. 2000;160:301–308.
11. Abookire SA, Teich JM, Sandige H, Paterno MD, Martin MT, Kuperman GJ, et al. Improving allergy alerting in a computerized physician order entry system. Proc AMIA Symp. 2000:2–6.
12. Dexter PR, Perkins S, Overhage JM, Maharry K, Kohler RB, McDonald CJ. A computerized reminder system to increase the use of preventive care for hospitalized patients. N Engl J Med. 2001;345:965–970.
13. Glassman PA, Simon B, Belperio P, Lanto A. Improving recognition of drug interactions: benefits and barriers to using automated drug alerts. Med Care. 2002;40:1161–1171.
14. Goldberg HI, Neighbor WE, Cheadle AD, Ramsey SD, Diehr P, Gore E. A controlled time-series trial of clinical reminders: using computerized firm systems to make quality improvement research a routine part of mainstream practice. Health Serv Res. 2000;34:1519–1534.
15. Ko Y, Abarca J, Malone DC, Dare DC, Geraets D, Houranieh A, et al. Practitioners' views on computerized drug-drug interaction alerts in the VA system. J Am Med Inform Assoc. 2007;14:56–64.
16. Kuperman GJ, Bobb A, Payne TH, Avery AJ, Gandhi TK, Burns G, et al. Medication-related clinical decision support in computerized provider order entry systems: a review. J Am Med Inform Assoc. 2007;14:29–40.
17. Paterno MD, Maviglia SM, Gorman PN, Seger DL, Yoshida E, Seger AC, et al. Tiering drug-drug interaction alerts by severity increases compliance rates. J Am Med Inform Assoc. 2009;16:40–46.
18. Persell SD, Kaiser D, Dolan NC, Andrews B, Levi S, Khandekar J, et al. Changes in performance after implementation of a multifaceted electronic-health-record-based quality improvement system. Med Care. 2010.
19. Roberts LL, Ward MM, Brokel JM, Wakefield DS, Crandall DK, Conlon P. Impact of health information technology on detection of potential adverse drug events at the ordering stage. Am J Health Syst Pharm. 2010;67:1838–1846.
20. Shah NR, Seger AC, Seger DL, Fiskio JM, Kuperman GJ, Blumenfeld B, et al. Improving acceptance of computerized prescribing alerts in ambulatory care. J Am Med Inform Assoc. 2006;13:5–11.
21. Shea S, DuMouchel W, Bahamonde L. A meta-analysis of 16 randomized controlled trials to evaluate computer-based clinical reminder systems for preventive care in the ambulatory setting. J Am Med Inform Assoc. 1996;3:399–409.
22. van der Sijs H, Aarts J, van Gelder T, Berg M, Vulto A. Turning off frequently overridden drug alerts: limited opportunities for doing it safely. J Am Med Inform Assoc. 2008;15:439–448.
23. Lorenzi NM, Novak LL, Weiss JB, Gadd CS, Unertl KM. Crossing the implementation chasm: a proposal for bold action. J Am Med Inform Assoc. 2008;15:290–296.

Table 1.

Technology facet categories and measures for alerts.

Category | Characteristics | Example Themes | Measures Used
Cost | Cost relating to and specific to the technology | Development cost (when built in-house); implementation cost; hardware and software costs; operations and maintenance costs | Dollars2; Dollars/year2; Dollars/patient/year2
Data and interoperability | Attributes related to the knowledge databases | Knowledge databases: type, fields, and source name | Type: Medication database, Source: pharmacy database listing stock; national medication database22; Type: Allergy tables, medication ingredients11,16; Type: Contraindications database, Source: internally developed11,16; Type: Drug-disease database11,16
Data and interoperability | Attributes around validation, integrity, quality, currency | Quality; Comprehensiveness; Currency: knowledge database update frequency; Data standard; Magnitude of change | Vague, unhelpful16; Incomplete labs (e.g., pregnancy test results)22; Updates monthly22; ICD-10 code16; Changes in alerts secondary to changes in drug dictionary over time11
Data and interoperability | Attributes related to patient-specific data, including data entered by the user (prescription) | Clinician-entered data: medication orders, allergy data, diagnosis data; Registration data: sex, age; Laboratory data: test/procedure result, drug level; Pharmacy system | Data type; Availability of data22
Functionality | Describes the functionality and design purpose of the technical application | Application type: computerized physician order entry (CPOE) system, clinical decision support system (CDSS), knowledge database | Application type/definition3,11
Functionality | | Alert level | High, Medium, Low22; Seriousness Index: F, E, D, C, B, A22; Level 1, Level 2, Level 320
Functionality | | Alert target/reach | All/None (hospital wide)22; Physician office
Functionality | | Alert recipient | Clinician; Patient10
Functionality | | Alert trigger | Drug-drug interaction; overdose; duplicate orders; drug allergy; drug contraindications (lab, disease, pregnancy); reminders (corollary orders); formulary2,16,22; age-, sex-, and diagnosis-specific preventive services14
Functionality | | Trigger event | Per order11; Per patient14
Functionality | | Rule threshold | True allergy; possible allergy; medication intolerance11
Functionality | | Alert delivery | Printed; electronic; phone; letter; postcard10,14
Functionality | | Alert action requirements | Eliminate contraindication; override reason; read and confirm; none20; Stop current order, override, respond, cancel new order22; Approval only (yes/no); Approval3; Reminder21
Functionality | | Physician order entry | Order set; free-text prescription22
Functionality | | Required data entry fields (CPOE) | Drug name, dosage form, strength, drug dose, frequency, start date, start time11,20,22
Functionality | | Alert trending | Availability; # alerts/month11
Non-functional requirements | Indicates how well the system performs | |
Product | Describes the specific technology product (i.e., hardware, software) | Product name; version number | Name and version2,11,20,22
Product | | Vendor | Name of company
Product | | Location | City, state, country2,11,20,22
User-based design | Includes user interface design but also the workflow that the health IT was designed to support | Alert response | Intrusive vs. non-intrusive11,20,22
User-based design | | Prompt | Unambiguous action; promotes an action3; Yes/No10; Patient-specific or generalized10
User-based design | | Timing of prompt | Yearly; 1 month in advance; night before clinic visit; before visit at reception desk; during routine intake procedures10
User-based design | | Communication content feature: justification | Reasoning; research evidence; citation of authority; institution-specific guidelines3,16
User-based design | | Prioritization | Clinically significant warnings differentiated from other warnings16
User-based design | | User-based design | Local user involvement (yes/no)3

Table 2:

Use facet categories and measures relating to alerts.

Category | Characteristics | Example Themes | Measures Used
User characteristics and attitudes | Cover a wide range of concepts such as user characteristics, satisfaction, perceived usefulness and usability, and user acceptance | End user characteristics | Prescribers, pharmacists, age, specialty, percent faculty10,14,19,21
User characteristics and attitudes | | Studies evaluated questions such as: value of DDI information context is perceived to be helpful | Likert scale of agreement (1–5); percent positive in the Likert scale of statement agreement15
User characteristics and attitudes | | DDI alerts improved their ability to prescribe safely | Likert scale of agreement (1–5); percent positive in the Likert scale of statement agreement13
User characteristics and attitudes | | Satisfaction with CPOE | Likert scale of agreement related to satisfaction15
User characteristics and attitudes | | Provider belief about medication | Likert scale of agreement16
User characteristics and attitudes | | Clinicians' reaction to DDI alerts | Relevance of DDI alerts as helpful (surveyed at the time of warning)16
Usability & workflow | Covers usability and actual workflow of the user | How the alert system is integrated into the prescriber's workflow, and at what point, influences the success of CPOE alerts | Interventions assessed as successful when the CDSS was provided to clinicians automatically5,22
Usability & workflow | | Other reported barriers (e.g., poor visual presentation) | Poor signal-to-noise ratio13
Usability & workflow | | Alert text readability | Not lengthy; clear and concise to be helpful, with links to supporting evidence22
Usability & workflow | | Automatic provision of decision support as part of the clinician workflow, with no need for additional clinician data entry |
Ownership/Buy-in | Captures the amount of user involvement and participation in the implementation process | Advise having clinician-users review the alert text prior to introducing alerts into practice | Degree of involvement; ability to turn alerts on or off22
Knowledge | Includes concepts around adult learning, training, capability | |

Table 3:

Environment facet categories and measures relating to alerts.

Category | Characteristics | Example Themes | Measures Used
Business drivers | This includes governmental policies and regulations that influence the organization, and business factors (e.g., competition) | Patients | Number of patients
Cultural/organizational | Captures teamwork climate, values, culture | Cultural views regarding data retention; physician acceptance of guidance | Hesitance to remove allergy data from an EMR even if dubious16; Physician acceptance10
Leadership | Senior leaders and champions fit into this category | |
Resources | This includes not only the resources available to support the implementation of the health IT, but also the IT infrastructure that can enable it | Availability | Number of terminals in a given location14
Setting | The environment in which the health IT is being used | Facility type; affiliation | Inpatient (# of beds); outpatient (# of physicians, staff)11,14,20,22; University-affiliated, public clinic10
Setting | | Name and location of facility | Name; city, state, country11,14,20,22
Setting | | Staffing mix | Physicians (specialties), residents, physician assistants, nurses, medical assistants, receptionists14,20
Support | Many aspects of implementation management fit under this category, including training | Customization | Ability to support local customization16

Table 4:

Outcome facet categories and measures relating to alerts.

Category | Characteristics | Example Themes | Measures Used*
Adoption | The number of alert users, the rate of use, and the depth of their use | Compliance with alerts (acceptance of alert recommendation); override of alerts (non-acceptance of alert recommendation); number of unique alerts and total alerts fired; alert fatigue; turning alerts on/off; documentation of exceptions to alerts, which affects the number of patients eligible for a performance measure | Number of alerts accepted over total alerts fired22; Number of patients eligible for performance measures with alert functionality allowing documentation of exceptions to the alert rule18; Acceptance of alerts and overrides20,22; Clinician decision to turn alerts off/on based on information available with alerts22; Number of unique alerts, and overridden22
Adoption | Drug prescribing according to alert rules | Drug prescribing according to alert rules; ordering corollary medications | Proportion of all physicians with perfect or near-perfect (90%–99%) adherence for 7 drug prescribing guidelines in alerts18; 25% improvement in ordering of corollary medications by faculty and residents2
Adoption | Attributes related to alert type, severity, and tiering | Basic or advanced alerts: classification system adapted to derive three main categories of alerts and prompts: (1) basic alerts consisting of drug allergy, drug-drug interaction and duplication warnings, and basic medication order guidance; (2) advanced alerts including drug-lab, drug-condition, and formulary alerts; and (3) complex alert systems with basic and advanced alerts | Alert acceptance by level and severity of alert17; Differences in alert acceptance by tiering17; Number of DDI alerts fired, accepted, rejected by type of alert10,21,22
Adoption | Assessment of alert severity by clinicians | Clinician-assessed drug severity versus that from drug databases | Agreement between clinicians and drug database on alert severity22
Adoption | Ability to tailor alerts | Tailorability according to specific user | Number of frequently overridden alerts turned off; suppressed alerts22
Business/Financial | Healthcare utilization and costs | Reduced utilization and associated costs; reduced costs of care; formulary compliance | Length of stay3; Laboratory test orders3; Cost per hospitalization2; Medication costs9
Clinical | Prescribing behavior | Adverse drug events; prescribing errors; duplicate orders; other adverse outcomes related to prescribing | Prescribing errors22; -related renal impairment9; Number and type of serious and life-threatening adverse drug events9; Number of duplicate orders16,20,22; Falls in elderly9; Hospital length of stay9
Clinical | Quality of care according to guidelines | Acceptance of CDS alerts for chronic and preventive care according to guidelines | Quality measure performance related to CDS alerts, for selected chronic care and preventive quality measures (e.g., lipid-lowering drugs; ACE inhibitor or ARB; ACE inhibitor or ARB in LVSD; beta blocker in LVSD; anticoagulant for atrial fibrillation; LDL-C control in persons with diabetes; aspirin for primary prevention in diabetes; neuropathy management in diabetes)3,12,14,18; No change in mammography screening14
Methodology | Study design | Study type (e.g., pre-post, observational field study, case-control, prospective, retrospective, time series): quantitative; qualitative (e.g., focus groups, interviews); sample size, power analyses | Number of patients, alerts fired, alerts accepted, alert overrides17,22
Methodology | Study participants | Health IT with alert users; type of user | Number of active users of the CPOE system; specialty types: internal medicine, cardiology (specialists), registered hospital pharmacists (internal medicine & cardiology: 4 specialists, 2 residents; 4 registered pharmacists and 2 residents in pharmacy; surgery: 2 specialists, 4 final-year residents); 576 assessments22; Clinicians, nurses, pharmacists
Methodology | Limitations | Ability to control for the alert intervention variable; study population | An intervention was multifaceted, and the study was unable to determine which components were most responsible for the observed results18; Results only generalizable to experienced EHR users18
* Most clinical outcome measures compared outcomes before and after the implementation of alerts.

Table 5:

Temporality facet categories and measures relating to alerts

Category | Characteristics | Example Themes | Measures Used
Time | Independent variable; external anchor; construct for understanding duration | Start of study; end of study; start of data collection; end of data collection; electronic availability of data; duration of study | Date as month, day, year14,20; Date as month, year22; Date as year14; Duration as months22
Implementation cycle | Characterizes a step in a lifecycle of health IT implementation; not dependent on time; supports comparison across different implementations independent of time | Enhancements | Period of enhancement to CDS11
Outcome lifecycle | Characterizes the lifecycle of when an intervention can be expected to generate a given outcome; provides the ability to account for findings when a study has not provided for sufficient time for evaluation | Frequency of alerts; clinician reaction to alerts | Change in frequency of alerts over time11; Change in clinician compliance to alerts over time11