Ann Emerg Med. 1999 Aug;34(2):155-9.

Reliability of the Canadian emergency department triage and acuity scale: interrater agreement.

Author information

Division of Emergency Medicine, Dalhousie University, Halifax, Nova Scotia.



STUDY OBJECTIVE:

To determine the interobserver reliability of the Canadian Emergency Department Triage and Acuity Scale (CTAS).


METHODS:

Ten physicians and 10 nurses were randomly selected to review and assign a triage level to 50 ED case summaries containing the presenting complaint, mode of arrival, vital signs, and a verbatim triage note. The rate of agreement within and between groups of raters was determined using kappa statistics. One-way ANOVA, two-way ANOVA, and a combined ANOVA were used to quantify reliability coefficients for intraclass and interclass correlations.
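As an illustration of the chance-corrected agreement statistic described above, Fleiss' kappa extends kappa to more than two raters. The function below is a hypothetical sketch, not the study's actual analysis code; it takes a subjects-by-categories matrix of rating counts (here, cases by triage levels) and assumes every case was rated by the same number of raters.

```python
import numpy as np

def fleiss_kappa(ratings):
    """Chance-corrected agreement for multiple raters (Fleiss' kappa).

    ratings: (n_subjects, n_categories) array; entry [i, j] is the number
    of raters who assigned subject i to category j. Assumes an equal
    number of raters per subject.
    """
    ratings = np.asarray(ratings, dtype=float)
    n_sub, _ = ratings.shape
    n_raters = ratings[0].sum()
    # Overall proportion of assignments falling in each category
    p_j = ratings.sum(axis=0) / (n_sub * n_raters)
    # Observed pairwise agreement within each subject
    p_i = (np.square(ratings).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()           # mean observed agreement
    p_e = np.square(p_j).sum()   # agreement expected by chance
    return (p_bar - p_e) / (1 - p_e)
```

With perfect agreement (all raters choosing the same level on every case) the function returns 1.0; values near 0 indicate agreement no better than chance.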


RESULTS:

The overall chance-corrected agreement (kappa) for all observers was .80 (95% confidence interval [CI] .79 to .81), and the probability of agreement between 2 random observers on a random case was .539. For nurses alone, kappa = .84 (95% CI .83 to .85, P = .598), and for physicians alone, kappa = .83 (95% CI .81 to .85, P = .566). The 1-way, 2-way, and combined ANOVA showed that the reliability coefficients (84%) for both nurses and physicians were similar to the kappa values. The combined ANOVA also showed a 0.2-point difference, with physicians assigning a higher triage level.
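The ANOVA-based reliability coefficients reported above are intraclass correlations. A minimal sketch, assuming a subjects-by-raters matrix of triage scores and using the common two-way random-effects form ICC(2,1) rather than the paper's exact models, is:

```python
import numpy as np

def icc_2_1(data):
    """Two-way random-effects intraclass correlation, ICC(2,1).

    data: (n_subjects, k_raters) array of scores, one column per rater.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)   # per-subject means
    col_means = data.mean(axis=0)   # per-rater means
    ss_total = ((data - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between-subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between-raters
    ss_err = ss_total - ss_rows - ss_cols            # residual
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

A systematic offset between raters (e.g. one group consistently scoring 0.2 points higher, as in the physician/nurse comparison above) lowers ICC(2,1) even when the rank ordering of cases agrees perfectly.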


CONCLUSION:

The high rate of interobserver agreement has important implications for case mix comparisons and suggests that this scale is understood and interpreted in a similar fashion by nurses and physicians.

