Appl Clin Inform. 2012;3(2):164-174.

Assessing Electronic Note Quality Using the Physician Documentation Quality Instrument (PDQI-9).

Author information

Department of Biomedical Informatics, Columbia University.



OBJECTIVE: To refine the Physician Documentation Quality Instrument (PDQI) and to test the validity and reliability of the 9-item version (PDQI-9).


METHODS: Three sets each of admission notes, progress notes, and discharge summaries were evaluated by two groups of physicians using the PDQI-9 and an overall general assessment: a gold standard group consisting of program or assistant program directors (n = 7), and a second group of attending physicians or chief residents (n = 24). The main measures were criterion-related validity (correlation coefficients between Total PDQI-9 scores and 1-item General Impression scores for each note), discriminant validity (comparison of PDQI-9 scores on notes rated best and worst by the 1-item General Impression score), internal consistency reliability (Cronbach's alpha), and inter-rater reliability (intraclass correlation coefficient, ICC).
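For readers unfamiliar with the internal consistency measure named above, Cronbach's alpha for a k-item instrument is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch in Python, using hypothetical ratings (the matrix below is illustrative, not study data):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_notes x n_items) matrix of item scores."""
    k = scores.shape[1]                          # number of items (9 for the PDQI-9)
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 5 notes, each rated on 9 items (1-5 Likert scale)
ratings = np.array([
    [4, 4, 5, 4, 4, 5, 4, 4, 4],
    [2, 3, 2, 2, 3, 2, 2, 3, 2],
    [5, 5, 4, 5, 5, 5, 4, 5, 5],
    [3, 3, 3, 2, 3, 3, 3, 2, 3],
    [4, 5, 4, 4, 4, 4, 5, 4, 4],
])
print(round(cronbach_alpha(ratings), 2))
```

Because the hypothetical items move together across notes, the resulting alpha is high, in line with the .87-.94 range reported below.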


RESULTS: The results supported criterion-related validity (r = -.678 to .856), discriminant validity (best versus worst note: t = 9.3, p = .003), internal consistency reliability (Cronbach's alphas = .87-.94), and inter-rater reliability (ICC = .83, CI = .72-.91).


CONCLUSIONS: The results support the criterion-related and discriminant validity, internal consistency reliability, and inter-rater reliability of the PDQI-9 for rating the quality of electronic physician notes. Tools for assessing note redundancy are needed to complement the PDQI-9. Trials of the PDQI-9 at other institutions of different sizes, using different EHRs, and incorporating additional physician specialties and the notes of other healthcare providers are needed to confirm its generalizability.

