J Am Dent Assoc. Author manuscript; available in PMC 2009 Dec 1.
Published in final edited form as:
J Am Dent Assoc. 2008 Dec; 139(12): 1632–1642.
PMCID: PMC2614265

A usability evaluation of four commercial dental computer-based patient record systems

Thankam P. Thyvalikakath, BDS, MS, assistant professor
Center for Dental Informatics, School of Dental Medicine, University of Pittsburgh, 334 Salk Hall, 3501 Terrace St., Pittsburgh, Pa. 15261, e-mail “tpt1@dental.pitt.edu”.
Valerie Monaco, PhD, MHCI, assistant professor
Department of Biomedical Informatics, University of Pittsburgh.
Hima Bindu Thambuganipalle, BDS, dental student
College of Dentistry, New York University, New York City.
Titus Schleyer, DMD, PhD, associate professor and director, Center for Dental Informatics, School of Dental Medicine, University of Pittsburgh.



Background

The usability of dental computer-based patient record (CPR) systems has not been studied, despite early evidence that poor usability is a problem for dental CPR system users at multiple levels.


Methods

The authors conducted formal usability tests of four dental CPR systems by using a purposive sample of four groups of five novice users. The authors measured task outcomes (correctly completed, incorrectly completed and incomplete) in each CPR system while the participants performed nine clinical documentation tasks, as well as the number of usability problems identified in each CPR system and their potential relationship to task outcomes. The authors reviewed the software application design aspects responsible for these usability problems.


Results

The percentage of correctly completed tasks ranged from 16 to 64 percent, that of incorrectly completed tasks from 18 to 38 percent and that of incomplete tasks from 9 to 47 percent. The authors identified 286 usability problems. The most frequent types were “three unsuccessful attempts,” “negative affect” and “task incorrectly completed.” The authors also identified six problematic interface and interaction designs that led to usability problems.


Conclusions

The four dental CPR systems studied have significant usability problems for novice users, resulting in a steep learning curve and potentially reduced system adoption.

Clinical Implications

The significant number of data entry errors raises concerns about the quality of documentation in clinical practice.

Keywords: Usability, computer-based patient records, clinical information systems, evaluation, user interface

The adoption of computer-based patient records (CPRs) by practitioners has become a key issue in re-engineering the health care system.1 In both medicine and dentistry, practitioners in solo and small-group practices have been slow to adopt CPRs. Approximately 17 to 25 percent of physicians use a CPR system.2 In dentistry, 25 percent of all general practitioners in the United States used a computer in at least one of their operatories in 2005.3 As in medicine,4 there is some debate in dentistry as to what exactly constitutes a CPR.3 However, as of 2005, only 1.8 percent of general dental offices maintained patient records almost completely on the computer, suggesting a low adoption rate of CPRs.

Researchers have suggested that usability problems are one factor in retarding system adoption.5,6 Usability problems in CPR systems can result in negative consequences in the way practitioners make clinical decisions,7 how efficiently practitioners interact with information about patients8 and how care providers collaborate.9 Usability problems also can cause new types of errors that are less prevalent or absent when traditional approaches are used.5,10-13 A steep learning curve3 and loss of time and productivity14,15 are other problems caused by usability issues.

In a usability test, the typical variables of interest are ease of use, efficiency and user satisfaction with the system. In the last decade, several studies reported the importance of incorporating usability methods during the development and evaluation of clinical information systems.6,12,16-18 The results of these studies highlighted improvements in efficiency, effectiveness, user acceptance and reduction in errors when using these systems. While usability (defined by the International Organization for Standardization as “the extent to which a product can be used by specified users to achieve goals with effectiveness, efficiency and satisfaction in a specified context of use”19) has been studied extensively in medical systems, we found no reports of empirical usability studies of dental CPR systems when we conducted an exhaustive search of the MEDLINE-indexed literature. Three studies recently conducted by the Center for Dental Informatics at the University of Pittsburgh School of Dental Medicine (SDM),3,20,21 however, have yielded evidence that usability problems may be an important factor in retarding the adoption of CPR systems by dental practitioners.


In a telephone survey of 102 general dental practitioners in the United States, investigators found that usability and functionality were the two main aspects that participants disliked most about the CPR system they were using.3 Charting (that is, entering clinical information on a graphical chart representing the patient’s teeth and gingivae) was third. Participants also listed insufficient usability and a steep learning curve as drawbacks to using CPR systems and considered “better input methods” and “better user interface design” as main areas of improvement.

The results of a second study, a heuristic evaluation of the four market-leading dental software packages,20 revealed 229 heuristic violations in the four CPR systems, primarily in the categories of “consistency and standards” (that is, ensuring that identical data and functions can be perceived as such), “match between system and the real world” (that is, representing data and functions on the computer in words, concepts and representations familiar to users) and “error prevention” (that is, reducing opportunities for users to make errors). Consistency and standards and match between system and the real world are two heuristics that have a significant influence on a novice user’s ability to understand a system, lending credibility to the survey participants’ assertion that dental CPR systems are hard to learn to use. The 41 violations related to the error prevention heuristic suggested that use of the systems may result in frequent errors.

The results of the third study21 showed that some CPR systems organize clinical information in ways that do not optimally support clinical decision making, which could result in an unnecessary degree of cognitive friction. Taken together, these findings strongly suggest that usability problems negatively affect the adoption and use of CPRs in dentistry.

Increasingly, researchers have been advocating the use of qualitative studies to determine the frequency and nature of usability problems.22-24 Therefore, we conducted a study to determine the intuitiveness of clinical charting functions in a representative sample of dental CPRs for novice users, to study task outcomes and usability problems and their potential relationships with one another and to find out which interface design aspects were the most problematic in light of the observed usability problems.


METHODS

We conducted usability assessments25,26 of the charting interfaces of working demonstration versions of four CPR systems (Dentrix [DX], Version [Dentrix, American Fork, Utah]; EagleSoft [ES], Version 10.0 [Patterson Dental, St. Paul, Minn.]; SoftDent [SD], Version 10.0.2 [Kodak Dental Systems, Atlanta]; and PracticeWorks [PW], Version 5.0.2 [Kodak Dental Systems]) with four groups of participants, each consisting of five novice users.

Four full-time dental faculty members, eight practicing dentists and eight senior dental students from the University of Pittsburgh SDM and the Pittsburgh area participated in the study. We defined dentists who were teaching more than three days a week at SDM as full-time dental faculty members and dentists who were teaching three days a week or less at SDM and practicing at least 25 hours a week as practicing dentists. Eleven of the 20 participants were men, and all but one of the practicing dentists were affiliated with the University of Pittsburgh. The mean age of the participants was 37.8 years. Eighteen participants reported having used a computer for four or more years. Three-fourths of the participants indicated that they had learned how to use computers on their own, while the remainder said they had completed courses offered either in college or dental school.

We required participants to have experience using the Microsoft Windows operating system (Microsoft, Redmond, Wash.) but not a dental CPR system. We chose five participants per group because the usability engineering literature considers this number adequate for usability testing.27 The purposive sample of novice users for each CPR system was made up of one full-time dental faculty member, two practicing dentists and two senior dental students.

Figures 1 and 2 show a sample hard-tissue chart and a soft-tissue chart in ES, respectively. Each participant used only one software package and worked through nine clinical documentation tasks (Table 1) using a think-aloud protocol.22,24,27,28 Participants recorded general pathological findings in Tasks 1 and 2, existing restorations in Tasks 3 and 4, planned procedures in Tasks 5 and 6 and periodontal findings in Tasks 7 and 8. In Task 9, participants were required to delete the restorative finding they had recorded in Task 3.

Figure 1
Sample hard-tissue chart in the EagleSoft software application (Patterson Dental, St. Paul, Minn.), which is used to document missing teeth, caries and other pathology, as well as completed and planned procedures. Image of EagleSoft reproduced with permission ...
Figure 2
Sample soft-tissue and periodontal chart in the EagleSoft software application (Patterson Dental, St. Paul, Minn.), which is used to document pocket depth, bleeding, bone loss and other periodontal findings. Image of EagleSoft reproduced with permission ...
Table 1. Clinical documentation tasks.

A facilitator (T.P.T.) guided the experiment and recorded the participants’ think-aloud statements and the computer screen throughout using Camtasia Studio, Version 2.1.0 (TechSmith, Okemos, Mich.), in addition to taking written notes. At the beginning of each experiment, the facilitator asked participants to complete a short background questionnaire designed to assess their computer knowledge, which had been validated in a previous study.29 For each task, the facilitator handed a 3 × 5-inch card to each participant and asked him or her to read the text aloud. The facilitator then asked the participant to complete the task on the computer while thinking aloud.

After each session, two experts (T.P.T. [expert 1] and a usability expert [expert 2]) coded usability problems using a coding manual (Table 2). We based the coding scheme of usability problems on the approach used in other usability studies.28,30 For each task, the experts coded the task outcome and the types of usability problems that occurred. We calculated the Cronbach α intraclass correlation coefficient and the κ statistic across all participants, periods with one-minute intervals and software applications using the statistical software Stata, Version 10.0 (StataCorp, College Station, Texas).

Table 2. Coding manual for task outcome and type of usability problem.
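As a rough illustration of the interrater agreement statistic used here, Cohen’s κ compares the two raters’ observed agreement with the agreement expected by chance from each rater’s marginal category frequencies. The sketch below is ours, not the authors’ analysis (which was performed in Stata), and the coded intervals and category labels are hypothetical:

```python
# Illustrative sketch: Cohen's kappa for two raters' usability codes.
# The data and category labels below are hypothetical, not study data.
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters' categorical codes."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement: sum over categories of the product of
    # each rater's marginal proportion for that category.
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (observed - expected) / (1 - expected)

# Ten hypothetical one-minute intervals coded by both experts:
# "3UA" = three unsuccessful attempts, "NA" = negative affect,
# "OK" = no usability problem observed.
expert1 = ["3UA", "3UA", "NA", "OK", "OK", "3UA", "NA", "OK", "3UA", "OK"]
expert2 = ["3UA", "3UA", "NA", "OK", "3UA", "3UA", "NA", "OK", "3UA", "OK"]
print(round(cohens_kappa(expert1, expert2), 2))  # → 0.84
```

By the conventional reading, values above 0.8 indicate excellent agreement, the range into which most of the κ values reported in the Results fall.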

After independently coding each session, the experts reviewed disagreements and reached consensus. We performed Wilcoxon rank-sum tests, Spearman correlation coefficient tests and Kruskal-Wallis tests to determine the correlation between usability problems and task outcomes for each task, as well as any differences in computer knowledge scores among the four groups of participants.
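The rank-based correlation between per-task usability-problem counts and task-failure counts can be sketched in a few lines. This is our illustration under hypothetical counts, not the authors’ Stata analysis: Spearman’s ρ is a Pearson correlation computed on ranks, with tied values assigned their average rank.

```python
# Illustrative sketch of Spearman rank correlation; the per-task
# counts below are hypothetical, not the study's data.

def rank(values):
    """Average 1-based ranks, with tied values sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank-transformed data."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical counts for nine tasks: usability problems observed
# and task failures (incorrect plus incomplete outcomes).
problems = [38, 20, 16, 22, 25, 30, 61, 18, 39]
failures = [8, 4, 3, 5, 2, 7, 12, 6, 9]
print(round(spearman(problems, failures), 2))  # → 0.78
```

A positive ρ of this kind is what the Results report for most tasks: tasks that generated more usability problems also tended to generate more failures.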

After the experiments concluded, we calculated the distribution of task outcomes and the total number of usability problems for each software application, and analyzed the frequency of usability problems by task, software application and type. To identify design aspects that might cause usability problems across software applications, we recorded all usability problems for each task and software application by using screen-shots of the user interface (Figure 3, page 1637). We then compared findings across software applications and abstracted interface and interaction designs that presented usability problems for most users.

Figure 3
Screenshot showing a user’s actions while performing Task 1 (record tooth no. 28 as missing) while using the Dentrix software application (Dentrix, American Fork, Utah). The correct sequence consists of the following left mouse clicks: “tooth ...

We provided the software vendors with the manuscript of this article before we submitted it to The Journal of the American Dental Association for review and comment. The University of Pittsburgh Institutional Review Board classified this study as expedited under Title 45 CFR Part 46.


RESULTS

Participants and interrater reliability

As a group, participants scored an average of 1.8 (3.0 being the highest) on the computer knowledge scale (standard deviation = 0.4). κ values for the classifications “user gave up” and “task incorrectly completed” were excellent (0.96 and 0.86, respectively), while they were good for the classifications “three unsuccessful attempts” (0.76), “negative affect” (0.71) and “design suggestion” (0.71). The Cronbach α intraclass correlation coefficient between the two raters was 0.98 for the classification “user gave up,” 0.92 for “task incorrectly completed,” 0.94 for “three unsuccessful attempts,” 0.85 for “negative affect” and 0.75 for “design suggestion.”

Task outcomes and usability problems

Table 3 shows the cumulative outcomes for the nine usability tasks in each of the four software applications after the two experts reached consensus. The percentage of correctly completed tasks ranged from 16 to 64 percent. The percentage of incorrectly completed tasks followed an inverse distribution, ranging from 18 to 38 percent. Incomplete tasks made up the remaining 9 to 47 percent. There were no statistically significant differences in task outcomes among the software applications except for Tasks 6 and 7 (P < .05). The frequency of observed usability problems correlated positively with the frequency of task failures for all tasks except Tasks 1 and 7 (P < .05).

Table 3. Cumulative outcomes of the nine usability tasks in each of the four software applications tested.*

We classified the largest number of usability problems for all packages as three unsuccessful attempts (Table 4, page 1638). This type of problem occurred more than twice as often (146 times [51 percent]) as the next most frequent problem, negative affect (65 times [23 percent]). The task with the largest number of usability problems (61 [21 percent]) was Task 7. In most software applications, this task generated the largest number of problems compared with all other tasks. Task 9 resulted in the second highest number of usability problems (39 [14 percent]), and Task 1 closely followed (38 [13 percent]). The number of usability problems in the remaining tasks ranged from 16 (6 percent) to 30 (10 percent) of all observed problems.

Table 4. Distribution of the five types of usability problems, by software application tested and frequency.

Table 4 also shows the number of usability problems in each problem category by software application. For instance, it shows that Task 1 (record tooth no. 28 as missing) in a DX CPR system had the highest occurrence of three unsuccessful attempts (13). Similarly, the highest number of expressions of negative affect (10) occurred in an ES CPR system when trying to accomplish Task 7 (record pocket depths of 2 mm for teeth nos. 1 through 8).

Common design problems affecting usability

In the following paragraphs, we discuss the findings shown in Table 4 in the context of the interaction design in each software application. We address the role that identical or similar interaction designs across software applications played in causing usability problems. Where appropriate, we contrast problematic designs with those that promoted usability.

Counterintuitive sequence of steps in recording findings

To record findings or procedures, most participants tried to identify the anatomical region (for example, the tooth) first and then recorded the actual finding. Button and colleagues31 made a similar observation in 1999. For instance, in our study for Task 1, 11 users left clicked on tooth no. 28 first and then tried to find a way to mark it as missing. This was the correct approach in all programs except SD, which worked the opposite way (that is, the user had to select the missing-tooth icon and then left click on the tooth). This design caused several users to fail the task in SD. In DX, the CPR system required the user to perform an additional step to indicate whether the selected item was an existing condition or a planned procedure. The system, however, provided no guidance to help users take this action. This led to four of five users’ failing to record tooth no. 28 as missing (Figure 3).

Inadequate user guidance due to semantically related labels, objects and functions

Many users were confused or led astray by labels, objects and functions with similar design, meaning or intent. For instance, despite identical clinical presentation, there is a difference between a tooth that is missing and one that has been extracted. Seven users who were not able to enter the condition “missing tooth” looked for a way to record the tooth as having been extracted (which was incorrect). In another instance, while using PW, three users who could not find a way to delete a restoration tried left clicking on the “clear selected teeth” and “undo last entry” buttons (Figure 4). Confusing graphical design was evident in SD, which displays both permanent and primary teeth in the chart by default, a highly unlikely clinical scenario. As a consequence, two users recorded findings on a primary tooth (incorrect) instead of the permanent tooth (correct). Reason32 classified this type of slip as a description error.

Figure 4
Screenshot showing a user’s actions while performing Task 9 (delete the existing entry for a mesio-occlusal amalgam on tooth no. 14) while using the PracticeWorks software application (Kodak Dental Systems, Atlanta). The correct sequence consists ...

Poorly organized controls for entering findings and treatment

Dental charting interfaces must accommodate the entry of a large universe of data points, such as findings and treatment procedures. Poor approaches to organizing the corresponding data entry controls led to a significant number of task failures in our study. The ES and DX software applications intermixed controls for entering conditions and procedures on the same toolbar or palette, which confused users. SD, on the other hand, separated the controls for entering conditions between a toolbar and a palette. As a result, several users could not locate the correct button or menu item for completing a task.

Insufficient match between the user’s and the software application’s task model

The results of users’ performance of Task 6 showed that there was a poor match between the user’s and the software application’s task models in three of the four programs (Figure 5). Recording a bridge consists of a number of subtasks, such as specifying abutments and pontics; abutment types, such as crowns or onlays; and the materials to be used, such as metal or ceramics. More than one-half of the users began this task by selecting the teeth and spaces across which the planned bridge was to extend, and then they looked for a button or function for charting a bridge. Only PW supported that approach. In the other software applications, users had to select each tooth or space separately and specify the necessary parameters, which resulted in a number of usability problems.

Figure 5
Screenshot showing a user’s actions while performing Task 6 (record a proposed porcelain-fused-to-high-noble bridge for the missing tooth no. 21, with pontic on tooth no. 21 and abutments on teeth nos. 20 and 22) while using the EagleSoft software ...

Separation of clinically related information

In a previous study,21 we found evidence that clinical information content that typically appears on one paper form often is spread out over two or more screens in CPR systems. We suspected that these design choices could result in usability problems. In our current study, we found evidence of the relationship between separation of information and a usability problem in Task 7. In all software applications, users had to enter periodontal pocket depths on the soft-tissue screen (Figure 6), which required users to move away from the hard-tissue screen they were using. Users had significant problems navigating from one screen to the other in all software applications except PW, which used clearly labeled tabs adjacent to each other. Other investigators have found that unnecessary separation of information can reduce the quality of clinical decisions.5,8

Figure 6
Screenshot showing a user’s actions during the performance of Task 7 (record pocket depths of 2 millimeters for teeth nos. 1 through 8) while using SoftDent software application (Kodak Dental Systems, Atlanta). The correct sequence consists of ...

Failure to leverage existing user knowledge and customary design affordances

While all software applications were Windows applications, several did not take advantage of users’ familiarity with common Windows functions. For instance, one participant attempted to complete Task 1 by selecting tooth no. 28 and pressing the delete key. A closely related alternative strategy was to right click on the tooth and scan the resulting context-sensitive menu (if one appeared) for a delete command. None of the software applications allowed the use of the delete key or of a context-sensitive menu to mark a tooth as missing. Also while performing Task 1, one user attempted to type “28 missing” into a text box that typically lists planned and completed procedures in DX. This attempt failed because the text box was read-only but did not appear as such.

The six issues discussed above represent only the most serious problems for which we had evidence in the information and interaction design across the four CPR systems. If we had performed a more detailed analysis, we might have identified additional issues but with less support from the experimental data. Given that this study is the first systematic evaluation of usability problems in dental software, we felt that a more granular analysis was not meaningful at this time.


DISCUSSION

The results of our study yielded strong evidence for the significant usability problems in dental CPR systems that our earlier studies suggested.3,20,21 These results support our earlier finding that insufficient usability and a steep learning curve are a major problem for general dentists.3 In studies of physicians, investigators found that usability problems and the resultant loss of time and productivity are significant barriers to adopting a CPR system.14,15,33 One conclusion from our study is that the usability of dental CPRs must be improved to increase the adoption of CPR systems in dental practices.

A second significant finding was the high frequency of task failures. Study participants failed to complete 28 percent of the tasks, and they made errors in completing 30 percent of the tasks. While it is difficult to infer error rates in daily practice from a laboratory study, these findings strongly suggest that there is a need to examine the incidence of documentation errors in practices that use CPR systems. Similar to the results of other studies,12,34-36 the strong correlation between the frequency of usability problems and the frequency of task failures in our study suggests that usability problems can lead to errors that affect task outcomes.

Few of the tasks in our study can be considered hard to perform. Most of them consisted of entering a single finding, diagnosis or planned procedure for a single anatomical location. What seemed to make performing the tasks a challenge for most users was that the user interface provided the capability to enter hundreds, if not thousands, of data items with a few mouse clicks or keystrokes. The visual and functional complexity of the software applications seemed to overwhelm most users and appeared to be responsible for a large number of task failures.

That the usability problem of three unsuccessful attempts made up 51 percent of all problems shows that study participants tried to complete the tasks. This finding also is supported by the fact that users rarely gave up (giving up accounted for only 5 percent of usability problems). At the same time, the usability problem of negative affect made up 23 percent of all problems. Thus, it seems that a motivated group of study participants floundered owing to the complexity and challenges of the interaction design.

Several lessons for the future design and redesign of dental CPR systems emerged from our analysis of design features responsible for usability problems. Task flow and models in dental CPR systems should be aligned more closely with common practice; for instance, users should be able to identify their general approach to treatment (implant-supported bridge or removable prosthesis) and the system should help fill in the details for that approach. Data entry and retrieval controls available on screen should correspond with the tasks to be completed, and unrelated or extraneous controls should not be shown. Information that belongs to a specific task context should be shown together or be easily accessible; for instance, hard-tissue and periodontal findings often must be reviewed together to make a clinical decision and should not be separated unnecessarily. Data entry and functional controls should be organized and labeled clearly, given the multitude of findings, procedures, notes and other information that a user can enter at any one time. Dental CPR systems should leverage common platform functionality, such as Windows conventions, more than they do.

Our study had several limitations. First, users’ performance in some tasks may have been influenced by learning effects to some degree. Several tasks, such as recording conditions, findings and planned treatment, partially reused the same or similar user interface controls. Reusing controls in subsequent tasks (whether or not earlier attempts with them had been successful) may have influenced later task outcomes. However, the limited number of participants in our study did not allow us to conduct appropriate statistical tests to measure the presence and strength of learning effects.

A second limitation of our study was that none of the four systems we tested was designed to be used without customers’ first undergoing in-depth training. In our view, a complex application domain is not an excuse for poor system design. Simple tasks should be easy to execute in any system.

The third limitation was that vendors told us that they had upgraded their software applications since we performed our study, and, as a result, some of the problems we found had been remedied. While the vendors of DX, SD and PW did not elaborate on the changes made, the vendor of ES told us that the white space at the bottom of the list of conditions, which led to many usability problems among the participants, now indicates that “Impacted Mesial” is the last item and the white space no longer is a problem in Version 14.0 (J. Garrett, software development and quality assurance manager, Patterson Dental, written communication, Aug. 27, 2007).


CONCLUSIONS

While this study measured only usability without training in a laboratory setting, its results suggest that there is significant room for improving CPR systems in clinical dentistry. In our view, the comparative assessment of the usability of four commercial CPR systems, which derived generalizable findings about usability problems and problematic designs, is a significant contribution of our study. The results are significant because they provide, for the first time, quantitative evidence of usability problems in dental CPR systems and of their effect on users’ ability to complete clinical documentation tasks without error. The results are relevant for dental information technology companies, systems developers, dental informatics researchers, the dental practitioner community at large, and other stakeholders in the adoption and improvement of dental CPRs. ■


The research in this manuscript was supported, in part, by National Library of Medicine award 5T15LM07059-17 and by grant 1 KL2 RR024154-02 from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH), and NIH Roadmap for Medical Research. Its contents are solely the responsibility of the authors and do not necessarily represent the official view of NCRR or NIH.

The authors thank the test participants for donating their time. They also thank Mrs. Leslie Middleton, who agreed to be the second usability expert; Dr. James Bost, who assisted with statistical analysis; Dr. Heiko Spallek, who helped with screenshots; Ms. Jeannie Y. Irwin, Dr. Miguel Humberto Torres-Urquidy and Dr. Pedro Hernandez, who assisted them in conducting the study; Mr. Michael Dziabiak for his assistance in data analysis and final formatting of the manuscript; and the computer-based patient record system vendors who reviewed this manuscript and provided feedback, specifically Danette Johnson (Dentrix, American Fork, Utah), Bonnie Pugh (Kodak Dental Systems, Atlanta) and Jim Garrett (Patterson Dental, St. Paul, Minn.).


ABBREVIATION KEY

CPR: Computer-based patient record
NCRR: National Center for Research Resources
NIH: National Institutes of Health
SDM: School of Dental Medicine


Address reprint requests to Dr. Thyvalikakath.

Disclosures. Drs. Thyvalikakath and Schleyer have received consulting income from Patterson Dental (St. Paul, Minn.) for a separate usability evaluation of EagleSoft, Version 14.0, in 2008.


1. Thompson TG, Brailer DJ. The Decade of Health Information Technology: Delivering Consumer-Centric and Information-Rich Health Care. U.S. Department of Health and Human Services, Office of the National Coordinator for Health Information Technology; Washington: 2004.
2. Jha AK, Ferris TG, Donelan K, et al. How common are electronic health records in the United States? A summary of the evidence. Health Aff (Millwood) 2006;25(6):w496–507. [PubMed]
3. Schleyer TK, Thyvalikakath TP, Spallek H, Torres-Urquidy MH, Hernandez P, Yuhaniak J. Clinical computing in general dentistry. J Am Med Inform Assoc. 2006;13(3):344–352. [PMC free article] [PubMed]
4. Brailer DJ, Terasawa EL. Use and Adoption of Computer-Based Patient Records. Oakland, Calif.: California HealthCare Foundation; 2003.
5. Ash J, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc. 2004;11(2):104–112. [PMC free article] [PubMed]
6. Rose AF, Schnipper JL, Park ER, Poon EG, Li Q, Middleton B. Using qualitative studies to improve the usability of an EMR. J Biomed Inform. 2005;38(1):51–60. [PubMed]
7. Elting LS, Martin CG, Cantor SB, Rubenstein EB. Influence of data display formats on physician investigators’ decisions to stop clinical trials: prospective trial with repeated measures. BMJ. 1999;318(7197):1527–1531. [PMC free article] [PubMed]
8. Nygren E, Wyatt JC, Wright P. Helping clinicians to find data and avoid delays. Lancet. 1998;352(9138):1462–1466. [PubMed]
9. Pratt W, Reddy MC, McDonald DW, Tarczy-Hornoch P, Gennari JH. Incorporating ideas from computer-supported cooperative work. J Biomed Inform. 2004;37(2):128–137. [PubMed]
10. Coiera E, Westbrook JC, Wyatt JC. The safety and quality of decision support systems. IMIA Yearbook of Medical Informatics. 2006;45(suppl 1):S20–S25.
11. Graham MJ, Kubose TK, Jordan D, Zhang J, Johnson TR, Patel VL. Heuristic evaluation of infusion pumps: implications for patient safety in Intensive Care Units. Int J Med Inform. 2004;73(1112):771–779. [PubMed]
12. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: the relationship between usability problems and prescription errors when using a handheld application. Int J Med Inform. 2005;74(78):519–526. [PubMed]
13. Obradovich JH, Woods DD. Users as designers: how people cope with poor HCI design in computer-based medical devices. Hum Factors. 1996;38(4):574–592. [PubMed]
14. Miller RH, Sim I. Physicians’ use of electronic medical records: barriers and solutions. Health Aff (Millwood) 2004;23(2):116–126. [PubMed]
15. Simon SR, Kaushal R, Cleary PD, et al. Correlates of electronic health record adoption in office practices: a statewide survey. J Am Med Inform Assoc. 2007;14(1):110–117. [PMC free article] [PubMed]
16. Johnson CM, Johnson T, Zhang J. Increasing productivity and reducing errors through usability analysis: a case study and recommendations. Proc AMIA Symp. 2000:394–398. [PMC free article] [PubMed]
17. Rodriguez NJ, Murillo V, Borges JA, Ortiz J, Sands DZ. A usability study of physicians interaction with a paper-based patient record system and a graphical-based electronic patient record system. Proc AMIA Symp. 2002:667–671. [PMC free article] [PubMed]
18. Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003;10(6):523–530. [PMC free article] [PubMed]
19. International Organization for Standardization Ergonomic Requirements for Office Work With Visual Display Terminals (VDTs), Part 11: Guidance on Usability 1998. International Organization for Standardization; Geneva: Technical Committee Report ISO 9241-11.
20. Thyvalikakath TP, Schleyer TK, Monaco V. Heuristic evaluation of clinical functions in four practice management systems: a pilot study. JADA. 2007;138(2):209–218. [PubMed]
21. Schleyer T, Spallek H, Hernandez P. A qualitative investigation of the content of dental paper-based and computer-based patient record formats. J Am Med Inform Assoc. 2007;14(4):515–526. [PMC free article] [PubMed]
22. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform. 2004;37(1):56–76. [PubMed]
23. Kushniruk AW, Patel VL. Cognitive approaches to the evaluation of healthcare information systems. In: Anderson JG, Aydin CE, editors. Evaluating the Organizational Impact of Healthcare Information Systems. 2nd Springer; New York: 2005. pp. 144–173.
24. Johnson CM, Johnson TR, Zhang J. A user-centered framework for redesigning health care interfaces. J Biomed Inform. 2005;38(1):75–87. [PubMed]
25. Schleyer TK. Assessing software usability through heuristic evaluation. JADA. 2007;138(2):211–212. [PubMed]
26. Nielsen J. Usability Engineering. Academic Press; Boston: 1993. pp. 195–200.
27. Nielsen J. Adelson B, Dumais S, Olson J, Landauer T, Mackay W, editors. Enhancing the explanatory power of usability heuristics Human Factors in Computing Systems: “Celebrating Interdependence”—Boston, Massachusetts, USA, April 24-28, 1994. 1994. 152–158.158Association for Computing Machinery; New York City
28. John BE, Mashyna MM. Evaluating a multimedia authoring tool. J Am Soc Inform Sci. 1997;48(11):1004–1022.
29. Schleyer TK, Torres-Urquidy H, Straja S. Validation of an instrument to measure dental students’ use of, knowledge about, and attitudes towards computers. J Dent Educ. 2001;65(9):883–891. [PubMed]
30. Jacobsen NE, Hertzum M, John BE. The Evaluator Effect in Usability Studies: Problem Detection and Severity Judgments; Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting; Chicago. Oct. 5-9, 1998; Santa Monica, Calif.: HFES; 1998. pp. 1336–1340.
31. Button PS, Doyle K, Karitis JW, Selhorst C. Automating clinical documentation in dentistry: case study of a clinical integration model. J Healthc Inf Manag. 1999;13(3):31–40. [PubMed]
32. Reason JT. Human Error. Cambridge University Press; New York City: 1990.
33. Fitzpatrick J, Koh JS. If you build it (right), they will come: the physician-friendly CPOE—not everything works as planned right out of the box, a Mississippi hospital customizes its electronic order entry system for maximum use by physicians. Health Manag Technol. 2005;26(1):52–53. [PubMed]
34. Horsky J, Kaufman DR, Patel VL. The cognitive complexity of a provider order entry interface. AMIA Annu Symp Proc. 2003:294–298. [PMC free article] [PubMed]
35. Bates DW, Cohen M, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc. 2001;8(4):299–308. [PMC free article] [PubMed]
36. Zhang J, Patel VL, Johnson TR, Shortliffe EH. A cognitive taxonomy of medical errors. J Biomed Inform. 2004;37(3):193–204. [PubMed]