JAMA. 2013 May 8;309(18):1903-11. doi: 10.1001/jama.2013.4598.

Effect of long-detection interval vs standard-detection interval for implantable cardioverter-defibrillators on antitachycardia pacing and shock delivery: the ADVANCE III randomized clinical trial.

Author information

1. Electrophysiology and Pacing Unit, Humanitas Clinical and Research Center, Via Manzoni 56, 20089 Rozzano (MI), Italy. maurizio.gasparini@humanitas.it

Erratum in

  • JAMA. 2013 Jun 26;309(24):2552.

Abstract

IMPORTANCE:

Requiring more intervals to detect ventricular tachyarrhythmias has been associated with a reduction in unnecessary implantable cardioverter-defibrillator (ICD) therapies.

OBJECTIVE:

To determine whether requiring 30 of 40 intervals to detect ventricular tachyarrhythmias (VT) (long detection) during spontaneous fast VT episodes reduces antitachycardia pacing (ATP) and shock delivery more than requiring 18 of 24 intervals (standard detection).
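
The "X of Y intervals" criterion is, in essence, a sliding-window counter: the device declares detection once at least X of the last Y sensed R-R intervals fall inside the tachycardia zone. The sketch below is purely illustrative (Python; the 18/24 and 30/40 counts come from the trial, but the function name, the 320-ms zone cutoff, and the episode data are hypothetical, not device firmware):

    from collections import deque

    def beats_to_detect(intervals_ms, x, y, vt_zone_ms=320):
        """Number of sensed intervals until the 'x of y' criterion is
        met, or None if it never is. Illustrative sketch only."""
        window = deque(maxlen=y)                 # last y intervals
        for i, rr in enumerate(intervals_ms, start=1):
            window.append(rr < vt_zone_ms)       # True if in VT zone
            if sum(window) >= x:
                return i                         # criterion satisfied
        return None

    episode = [300] * 60                         # sustained fast VT (~200/min)
    print(beats_to_detect(episode, 18, 24))      # 18 -> standard detection
    print(beats_to_detect(episode, 30, 40))      # 30 -> long detection waits longer

Waiting for more intervals delays therapy, giving self-terminating episodes time to end on their own; this is the mechanism by which long detection is expected to reduce ATP and shock delivery.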

DESIGN, SETTING, AND PARTICIPANTS:

Randomized, single-blind, parallel-group trial that enrolled 1902 primary and secondary prevention patients (mean [SD] age, 65 [11] years; 84% men; 75% primary prevention ICD) with ischemic and nonischemic etiology undergoing first ICD implant at 1 of 94 international centers (March 2008-December 2010).

INTERVENTIONS:

Patients were randomized 1:1 to programming with long- (n = 948) or standard-detection (n = 954) intervals.

MAIN OUTCOMES AND MEASURES:

Total number of ATPs and shocks delivered for all episodes (primary outcomes) and inappropriate shocks, mortality, and syncope rate (secondary outcomes).

RESULTS:

During a median follow-up of 12 months (interquartile range, 11-13), the long-detection group had 346 delivered therapies (42 per 100 person-years [95% CI, 38-47]) vs 557 in the standard-detection group (67 per 100 person-years [95% CI, 62-73]; incidence rate ratio [IRR], 0.63 [95% CI, 0.51-0.78]; P < .001). The long- vs the standard-detection group experienced 23 ATPs per 100 person-years (95% CI, 20-27) vs 37 (95% CI, 33-41; IRR, 0.58 [95% CI, 0.47-0.72]; P < .001); 19 shocks per 100 person-years (95% CI, 16-22) vs 30 (95% CI, 26-34; IRR, 0.77 [95% CI, 0.59-1.01]; P = .06), with a significant difference in the probability of therapy occurrence (P < .001); and a reduction in the first occurrence of inappropriate shock (5.1 per 100 person-years [95% CI, 3.7-6.9] vs 11.6 [95% CI, 9.4-14.1]; IRR, 0.55 [95% CI, 0.36-0.85]; P = .008). Mortality (5.5 [95% CI, 4.0-7.2] vs 6.3 [95% CI, 4.8-8.2] per 100 person-years; HR, 0.87; P = .50) and arrhythmic syncope rates (3.1 [95% CI, 2.6-4.6] vs 1.9 [95% CI, 1.1-3.1] per 100 person-years; IRR, 1.60 [95% CI, 0.76-3.41]; P = .22) did not differ significantly between groups.
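
As a sanity check, the headline IRR is simply the ratio of the two event rates. The short recomputation below (Python) uses only the figures reported above, assumes roughly equal person-time per arm, and deliberately ignores the within-patient clustering of episodes that the trial's analysis would have accounted for (hence its naive confidence interval is narrower than the published 0.51-0.78):

    import math

    events_long, events_std = 346, 557        # total delivered therapies
    rate_long, rate_std = 42.0, 67.0          # therapies per 100 person-years

    irr = rate_long / rate_std
    print(f"IRR = {irr:.2f}")                 # ~0.63, matching the report

    # Naive Wald 95% CI on the log scale from raw counts alone
    se = math.sqrt(1 / events_long + 1 / events_std)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    print(f"naive 95% CI: {lo:.2f}-{hi:.2f}") # ~0.55-0.72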

CONCLUSIONS AND RELEVANCE:

Among patients receiving an ICD, the use of a long- vs standard-detection interval resulted in lower rates of ATP, shocks, and inappropriate shocks. This programming strategy may be an appropriate alternative to standard-detection settings.

TRIAL REGISTRATION:

ClinicalTrials.gov Identifier: NCT00617175.

PMID: 23652522
DOI: 10.1001/jama.2013.4598
[Indexed for MEDLINE]