AMIA Annu Symp Proc. 2005; 2005: 918.
PMCID: PMC1560686

Comparison of Accuracy Captured by Different Controlled Languages in Oral Pathology Diagnoses

Jung-Wei Chen, DDS, MS,a,b Catherine Flaitz, DDS, MS,a and Todd Johnson, PhDb


Abstract

Fifty-five oral pathology diagnoses were selected for comparison among five coding systems. Accuracy in capturing the oral diagnoses was highest for AFIP (96.4%), followed by Read 99 (85.5%), SNOMED 98 (74.5%), ICD-9 (43.6%), and CDT-3 (14.5%). These results show that the currently used coding systems, ICD-9 and CDT-3, are inadequate, whereas the AFIP coding system captured the majority of oral diagnoses. In conclusion, the most commonly used medical and dental coding systems lack terms for the diagnosis of oral and dental conditions.


Introduction

Although there are many different controlled medical languages, very few are applicable to dentistry.1 The most widely used dental controlled language, CDT-3, covers dental procedures only; it includes terms neither for disease diagnoses nor for anatomy. Dental diagnoses are usually coded with ICD-9 for reimbursement purposes, but ICD-9 fails to capture many important aspects of dentistry. The Armed Forces Institute of Pathology (AFIP) code is a more comprehensive coding system designed specifically for dental diagnoses, but it is used in only a few institutions. The purpose of this research is to compare the accuracy of major controlled medical and dental languages in capturing oral pathology diagnoses, and to call attention to the shortage of controlled languages for dental diagnosis.

Materials and Methods

Fifty-five oral pathology biopsy reports (dated 1999 to 2000) were randomly selected. Five coding systems (SNOMED 98, ICD-9, CDT-3, Read 99, and AFIP) were used to code the diagnoses found in the reports. Each oral pathology diagnosis was entered into the Unified Medical Language System (UMLS) Metathesaurus, and the resulting codes for three of the systems (SNOMED 98, ICD-9, and Read 99) were recorded. Because no computerized search system was available for CDT-3 and AFIP, diagnoses were matched to those codes manually. If a diagnosis could be coded only with an NOS (Not Otherwise Specified) code, it was recorded as a coding failure. The results were analyzed to compare the accuracy of the five coding systems: Cochran's Q and McNemar tests were used to look for significant differences among the coding systems in their ability to capture the diagnoses. Data analysis was performed with SPSS 10.0.
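As a rough illustration of the statistical analysis, the sketch below implements Cochran's Q and a continuity-corrected pairwise McNemar test directly from their textbook formulas, applied to a small hypothetical capture table (not the study's data; the actual analysis was performed in SPSS 10.0):

```python
# Sketch of the tests reported in the paper, on hypothetical data.
# Rows = biopsy diagnoses, columns = coding systems; 1 = diagnosis captured.
capture = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
]

def cochrans_q(table):
    """Cochran's Q statistic for k related binary samples (df = k - 1)."""
    k = len(table[0])
    col_totals = [sum(row[j] for row in table) for j in range(k)]
    row_totals = [sum(row) for row in table]
    grand = sum(row_totals)
    numerator = k * (k - 1) * sum((g - grand / k) ** 2 for g in col_totals)
    denominator = k * grand - sum(r * r for r in row_totals)
    return numerator / denominator

def mcnemar_chi2(table, i, j):
    """Continuity-corrected McNemar chi-square for systems i and j (df = 1)."""
    b = sum(1 for row in table if row[i] == 1 and row[j] == 0)  # i caught, j missed
    c = sum(1 for row in table if row[i] == 0 and row[j] == 1)  # j caught, i missed
    return (abs(b - c) - 1) ** 2 / (b + c)

print(round(cochrans_q(capture), 3))    # 6.333
print(mcnemar_chi2(capture, 0, 2))      # 3.2
```

Cochran's Q is referred to a chi-square distribution with k − 1 degrees of freedom, and each pairwise McNemar statistic to a chi-square distribution with 1 degree of freedom; McNemar's test uses only the discordant pairs (one system captured the diagnosis, the other did not).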


Results

The accuracy results for capturing oral pathology diagnoses (Table 1) showed that AFIP had the highest accuracy for oral and dental diagnoses (96.4%), followed by Read 99 (85.5%), SNOMED 98 (74.5%), ICD-9 (43.6%), and CDT-3 (14.5%).

Table 1
Match accuracy of the five coding systems
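The reported percentages correspond to matched counts out of the 55 biopsy reports; the counts below are inferred from the rounded percentages (they are not stated explicitly in the paper), as a minimal arithmetic check:

```python
# Matched counts per coding system, inferred from the rounded percentages
# reported in Table 1 (55 biopsy reports in total).
TOTAL = 55
matched = {"AFIP": 53, "Read 99": 47, "SNOMED 98": 41, "ICD-9": 24, "CDT-3": 8}
for system, n in matched.items():
    print(f"{system}: {n}/{TOTAL} = {100 * n / TOTAL:.1f}%")
```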

Match accuracy differed significantly among the five coding systems (Cochran's Q test, p < 0.001). Each pair of coding systems was then compared for significant differences in match accuracy using McNemar's test (Table 2).

Table 2
Results of McNemar's test (*p < 0.05)

The results show that the accuracy of AFIP differs significantly from that of SNOMED 98, ICD-9, and CDT-3. In addition, CDT-3 differs significantly from all the other coding systems.


Conclusion

The most commonly used medical and dental coding systems, ICD-9 and CDT-3, do not include the terminology necessary for the diagnosis of many oral and dental conditions. The AFIP coding system, however, captured the majority of conditions relevant to oral and maxillofacial pathology and could provide the basis for developing a controlled vocabulary in this specialty. There is an important need to include oral pathology diagnosis codes in current controlled terminologies for diagnostic and research purposes.


References

1. Cimino JJ. Review paper: coding systems in health care. Methods Inf Med. 1996;35(4–5):273–84. [PubMed]

Articles from AMIA Annual Symposium Proceedings are provided here courtesy of American Medical Informatics Association