Once-Weekly Antibiotic Might Be Effective for Treatment of Acute Bacterial Skin Infections
Clinical question: Is once-weekly intravenous dalbavancin as effective as conventional therapy for the treatment of acute bacterial skin infections?
Background: Acute bacterial skin infections are common and often require hospitalization for intravenous antibiotic administration. Treatment covering gram-positive bacteria is usually indicated. Dalbavancin is effective against gram-positive organisms, including methicillin-resistant Staphylococcus aureus (MRSA). Its long half-life makes it an attractive alternative to other commonly used antibiotics, which require more frequent dosing.
Study design: Phase 3, double-blinded RCT.
Setting: Multiple international centers.
Synopsis: Researchers randomized 1,312 patients with acute bacterial skin and skin-structure infections with signs of systemic infection requiring intravenous antibiotics to receive dalbavancin on days one and eight, with placebo on other days, or several doses of vancomycin with an option to switch to oral linezolid. The primary endpoint was cessation of spread of erythema and temperature of ≤37.6°C at 48-72 hours. Secondary endpoints included a decrease in lesion area of ≥20% at 48-72 hours and clinical success at end of therapy (determined by clinical and historical features).
Results for the primary endpoint were similar in the dalbavancin and vancomycin-linezolid groups (79.7% and 79.8%, respectively), meeting the prespecified noninferiority margin of 10 percentage points. The secondary endpoints were similar between the groups.
Limitations of the study included the early primary endpoint and the lack of both a noninferiority analysis of the secondary endpoints and a cost-effectiveness analysis.
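For readers who want to see what that margin means in practice, here is a minimal sketch of a noninferiority check. The even split of the 1,312 patients and the Wald interval are assumptions for illustration; the trial's actual arm sizes and statistical method may differ.

```python
import math

def risk_diff_ci(p1, n1, p2, n2, z=1.96):
    """Wald 95% CI for the difference in proportions p1 - p2."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff - z * se, diff + z * se

# Success rates from the synopsis; arm sizes assumed to be an even split
# of the 1,312 randomized patients (an assumption, not a reported figure).
lo, hi = risk_diff_ci(0.797, 656, 0.798, 656)
print(f"95% CI for difference: ({lo:+.3f}, {hi:+.3f})")
# Noninferior if the lower bound stays above the -10-percentage-point margin.
print("Noninferior at a -10-point margin:", lo > -0.10)
```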
Bottom line: Once-weekly dalbavancin appears to be similarly efficacious to intravenous vancomycin in the treatment of acute bacterial skin infections in terms of outcomes within 48-72 hours of therapy and might provide an alternative to continued inpatient hospitalization for intravenous antibiotics in stable patients.
Citation: Boucher HW, Wilcox M, Talbot GH, Puttagunta S, Das AF, Dunne MW. Once-weekly dalbavancin versus daily conventional therapy for skin infection. N Engl J Med. 2014;370(23):2169-2179.
Continuous Positive Airway Pressure Outperforms Nocturnal Oxygen for Blood Pressure Reduction
Clinical question: What is the effect of continuous positive airway pressure (CPAP) or supplemental oxygen on ambulatory blood pressures and markers of cardiovascular risk when combined with sleep hygiene education in patients with obstructive sleep apnea (OSA) and coronary artery disease or cardiac risk factors?
Background: OSA is considered a risk factor for the development of hypertension. One meta-analysis showed reduction of mean arterial pressure (MAP) with CPAP therapy, but randomized controlled data on blood pressure reduction with treatment of OSA is lacking.
Study design: Randomized, parallel-group trial.
Setting: Four outpatient cardiology practices.
Synopsis: Patients ages 45-75 with OSA were randomized to receive nocturnal CPAP plus healthy lifestyle and sleep education (HLSE), nocturnal oxygen therapy plus HLSE, or HLSE alone. The primary outcome was 24-hour MAP. Secondary outcomes included fasting blood glucose, lipid panel, insulin level, erythrocyte sedimentation rate, C-reactive protein (CRP), and N-terminal pro-brain natriuretic peptide.
Participants had high rates of diabetes, hypertension, and coronary artery disease. At 12 weeks, the CPAP arm experienced greater reductions in 24-hour MAP than both the nocturnal oxygen and HLSE arms (-2.8 mmHg [P=0.02] and -2.4 mmHg [P=0.04], respectively). No significant decrease in MAP was identified in the nocturnal oxygen arm compared with the HLSE arm. The only significant difference in secondary outcomes was a decrease in CRP in the CPAP arm compared with the HLSE arm, the clinical significance of which is unclear.
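As a refresher on the primary outcome, MAP is conventionally estimated as diastolic pressure plus one-third of the pulse pressure. The sketch below averages that estimate over hypothetical readings; the trial itself derived 24-hour MAP from ambulatory monitoring, not spot checks like these.

```python
def mean_arterial_pressure(sbp: float, dbp: float) -> float:
    """Standard estimate: diastolic pressure plus one-third of pulse pressure."""
    return dbp + (sbp - dbp) / 3.0

# Hypothetical ambulatory (systolic, diastolic) readings in mmHg.
readings = [(128, 78), (135, 82), (118, 72), (122, 76)]
map_avg = sum(mean_arterial_pressure(s, d) for s, d in readings) / len(readings)
print(f"Average MAP: {map_avg:.1f} mmHg")
```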
Bottom line: CPAP therapy with sleep hygiene education appears superior to both nocturnal oxygen therapy with sleep hygiene education and sleep hygiene education alone in decreasing 24-hour MAP in patients with OSA and coronary artery disease or cardiac risk factors.
Citation: Gottlieb DJ, Punjabi NM, Mehra R, et al. CPAP versus oxygen in obstructive sleep apnea. N Engl J Med. 2014;370(24):2276-2285.
Lactate Clearance Portends Better Outcomes after Cardiac Arrest
Clinical question: Is greater lactate clearance following resuscitation from cardiac arrest associated with lower mortality and better neurologic outcomes?
Background: Recommendations from the International Liaison Committee on Resuscitation for monitoring serial lactate levels in post-resuscitation patients are based primarily on extrapolation from other conditions, such as sepsis. Two single-center retrospective analyses found that effective lactate clearance was associated with decreased mortality. This association had not previously been validated in a multicenter, prospective study.
Study design: Multicenter, prospective, observational study.
Setting: Four urban, tertiary-care teaching hospitals.
Synopsis: Absolute lactate levels and the percent change in lactate over 24 hours were compared in 100 patients who suffered out-of-hospital cardiac arrest. Ninety-seven percent received therapeutic hypothermia, and overall survival was 46%. Survivors and patients with a good neurologic outcome had lower lactate levels at zero hours (4.1 vs. 7.3), 12 hours (2.2 vs. 6.0), and 24 hours (1.6 vs. 4.4) than nonsurvivors and patients with poor neurologic outcomes.
The percent decrease in lactate was also greater in survivors and in those with good neurologic outcomes (odds ratio, 2.2; 95% confidence interval, 1.1–4.4).
Nonsurvivors or those with poor neurologic outcomes were less likely to have received bystander CPR, to have suffered a witnessed arrest, or to have had a shockable rhythm at presentation. Superior lactate clearance in survivors and those with good neurologic outcomes suggests a potential role in developing markers of effective resuscitation.
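Percent lactate clearance is simple arithmetic over the serial levels reported above. A minimal sketch, with units assumed to be mmol/L (the summary does not state them):

```python
def percent_clearance(initial: float, later: float) -> float:
    """Percent decrease in lactate from the initial value."""
    return 100.0 * (initial - later) / initial

# Values from the synopsis: survivors fell from 4.1 at zero hours to 1.6
# at 24 hours; nonsurvivors fell from 7.3 to 4.4 over the same period.
print(f"Survivors: {percent_clearance(4.1, 1.6):.0f}% cleared")    # ~61%
print(f"Nonsurvivors: {percent_clearance(7.3, 4.4):.0f}% cleared")  # ~40%
```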
Bottom line: Lower lactate levels and more effective clearance of lactate in patients following cardiac arrest are associated with improved survival and good neurologic outcome.
Citation: Donnino MW, Andersen LW, Giberson T, et al. Initial lactate and lactate change in post-cardiac arrest: a multicenter validation study. Crit Care Med. 2014;42(8):1804-1811.
Time to Meds Matters for Patients with Cardiac Arrest Due to Nonshockable Rhythms
Clinical question: Is earlier administration of epinephrine in patients with cardiac arrest due to nonshockable rhythms associated with increased return of spontaneous circulation, survival, and neurologically intact survival?
Background: About 200,000 hospitalized patients in the U.S. have a cardiac arrest each year, commonly due to nonshockable rhythms. Cardiopulmonary resuscitation has been the only efficacious intervention. There are no well-controlled trials of the effect of epinephrine on survival and neurologic outcomes.
Study design: Prospective cohort from a large multicenter registry of in-hospital cardiac arrests.
Setting: Data from 570 hospitals from 2000 to 2009.
Synopsis: The authors included 25,095 adults who had in-hospital cardiac arrests with asystole or pulseless electrical activity as the initial rhythm. Time to first administration of epinephrine was recorded and divided into quartiles, and odds ratios were evaluated using one to three minutes as the reference group. The outcomes assessed were survival to hospital discharge (10%), return of spontaneous circulation (47%), and survival to hospital discharge with favorable neurologic status (7%).
Survival to discharge decreased as the time to administration of the first dose of epinephrine increased. Of those patients receiving epinephrine in one minute, 12% survived. This dropped to 7% for those first receiving epinephrine after seven minutes. Return of spontaneous circulation and survival to discharge with favorable neurologic status showed a similar stepwise decrease with longer times to first administration of epinephrine.
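The quartile analysis described above amounts to binning arrests by time to first epinephrine and comparing survival across the bins. A minimal sketch with hypothetical records (the registry analysis used 25,095 real cases):

```python
import statistics

# Hypothetical (minutes_to_first_epinephrine, survived_to_discharge) records.
records = [(1, True), (2, True), (3, False), (4, False),
           (5, True), (7, False), (9, False), (12, False)]

times = [t for t, _ in records]
cuts = statistics.quantiles(times, n=4)  # three quartile cut points

def quartile(t: float) -> int:
    """0 = fastest quartile, 3 = slowest."""
    return sum(t > c for c in cuts)

groups = {}
for t, survived in records:
    groups.setdefault(quartile(t), []).append(survived)

for q in sorted(groups):
    g = groups[q]
    print(f"Quartile {q + 1}: survival {100 * sum(g) / len(g):.0f}%")
```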
Bottom line: Earlier administration of epinephrine to patients with cardiac arrest due to nonshockable rhythms is associated with improved survival to discharge, return of spontaneous circulation, and neurologically intact survival.
Citation: Donnino MW, Salciccioli JD, Howell MD, et al. Time to administration of epinephrine and outcome after in-hospital cardiac arrest with non-shockable rhythms: retrospective analysis of large in-hospital data registry. BMJ. 2014;348:g3028.
Frailty Indices Tool Predicts Post-Operative Complications, Mortality after Elective Surgery in Geriatric Patients
Clinical question: Is there a more accurate way to predict adverse post-operative outcomes in geriatric patients undergoing elective surgery?
Background: More than half of all operations in the U.S. involve geriatric patients. Most tools hospitalists use to predict post-operative outcomes are focused on cardiovascular events and do not account for frailty. Common in geriatric patients, frailty is thought to influence post-operative outcomes.
Study design: Prospective cohort study.
Setting: A 1,000-bed academic hospital in Seoul, South Korea.
Synopsis: A cohort of 275 elderly patients (>64 years old) who were scheduled for elective intermediate or high-risk surgery underwent a pre-operative comprehensive geriatric assessment (CGA) that included measures of frailty. This cohort was then followed for mortality, major post-operative complications (pneumonia, urinary infection, pulmonary embolism, and unplanned transfer to intensive care), length of stay, and transfer to a nursing home. Post-operative complications, transfer to a nursing facility, and one-year mortality were associated with a derived scoring tool that included the Charlson Comorbidity Index, activities of daily living (ADL), instrumental activities of daily living (IADL), dementia, risk for delirium, mid-arm circumference, and a mini-nutritional assessment.
This tool was more accurate at predicting one-year mortality than the American Society of Anesthesiologists (ASA) classification.
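The published score weights the CGA domains listed above; those weights are not reproduced in this summary, so the sketch below only illustrates the shape of such a composite, with hypothetical weights and domain names:

```python
# Illustrative only: a composite frailty score over the CGA domains named
# in the synopsis. The weights below are hypothetical, not the published ones.
WEIGHTS = {
    "charlson_comorbidity_index": 1.0,
    "adl_dependence": 1.5,
    "iadl_dependence": 1.5,
    "dementia": 2.0,
    "delirium_risk": 1.5,
    "low_mid_arm_circumference": 1.0,
    "malnutrition": 2.0,
}

def frailty_score(findings: dict) -> float:
    """Sum the weights for each CGA domain flagged as abnormal."""
    return sum(w for key, w in WEIGHTS.items() if findings.get(key))

patient = {"dementia": True, "malnutrition": True, "adl_dependence": False}
print(frailty_score(patient))  # 4.0 under these made-up weights
```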
Bottom line: This study establishes that measures of frailty predict post-operative outcomes in geriatric patients undergoing elective surgery; however, the authors’ tool relies on the CGA, which is time-consuming, cumbersome, and based on indices unfamiliar to many hospitalists.
Citation: Kim SW, Han HS, Jung HW, et al. Multidimensional frailty scores for the prediction of postoperative mortality risk. JAMA Surg. 2014;149(7):633-640.
Pre-Operative Use of Angiotensin-Converting Enzyme Inhibitors, Angiotensin Receptor Blockers Examined in Elective Joint Replacement Surgery
Clinical question: Should angiotensin-converting enzyme inhibitors or angiotensin receptor blockers (ACEI/ARB) be held the morning of elective joint replacement?
Background: In patients taking an ACEI/ARB, the decision regarding whether or not to give these medications on the day of surgery is controversial. UpToDate recommends holding ACEI/ARB the day of surgery; American College of Physicians guidelines and SHM Consult Medicine recommend giving these drugs on the day of surgery.
Study design: Retrospective cohort study.
Setting: A large academic hospital in Pennsylvania.
Synopsis: Researchers studied adults undergoing elective spinal fusion, total knee replacement, or total hip replacement, and compared outcomes in 323 patients who were taking an ACEI/ARB (study group) to outcomes in the 579 patients who were not taking an ACEI/ARB (control group) before surgery. It was assumed—but not studied—that the ACEI/ARB was continued the morning of surgery in all patients in the study group, because that was the standard practice at this hospital.
Compared to the control group, the study group had more post-induction hypotension (12.2% vs. 6.7%) and more post-operative acute kidney injury (5.76% vs. 3.28%). Patients who developed acute kidney injury had longer length of stay (5.76 vs. 3.28 days) but no difference in two-year mortality.
Patients in the study group had higher baseline creatinine, were older, were more likely to be taking a diuretic, and were more likely to have diabetes, heart failure, and coronary artery disease. The authors used multiple logistic regression to adjust for these differences. Anesthesia and intra-operative fluid management were not standardized or compared.
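The hypotension figures above translate into an absolute risk increase of about 5.5 percentage points, or roughly one extra event per 18 treated patients. A minimal number-needed-to-harm sketch (an unadjusted back-of-the-envelope figure, not an analysis reported by the authors):

```python
def number_needed_to_harm(rate_exposed: float, rate_control: float) -> float:
    """NNH = 1 / absolute risk increase."""
    return 1.0 / (rate_exposed - rate_control)

# Post-induction hypotension rates from the synopsis: 12.2% vs. 6.7%.
print(round(number_needed_to_harm(0.122, 0.067)))  # ~18 treated patients
                                                   # per extra event
```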
Bottom line: ACEI/ARB administration on the morning of elective major orthopedic surgery is likely associated with a higher risk of intra-operative hypotension and acute kidney injury.
Citation: Nielson E, Hennrikus E, Lehman E, Mets B. Angiotensin axis blockade, hypotension, and acute kidney injury in elective major orthopedic surgery. J Hosp Med. 2014;9(5):283-288.
No Mortality Difference Associated with Pre-Operative Beta Blocker Use for Coronary Artery Bypass Grafting Without Recent Myocardial Infarction
Clinical question: Is the use of beta blockers within 24 hours of coronary artery bypass grafting (CABG) surgery without recent myocardial infarction (MI) associated with decreased peri-operative mortality?
Background: Several retrospective observational studies suggest a reduction in peri-operative mortality with CABG surgery if beta blockers are administered prior to surgery. Although pre-operative beta blocker use for CABG is now a quality measure, it remains controversial in light of more recent studies, with the earlier observed benefit thought to be driven mainly by patients with recent MI.
Study design: Retrospective cohort analysis.
Setting: More than 1,100 U.S. hospitals.
Synopsis: The Society of Thoracic Surgeons’ National Adult Cardiac Surgery database identified 506,110 adult patients (without MI within 21 days) nonemergently undergoing CABG surgery. Beta blocker use was defined as receiving a beta blocker within 24 hours before surgery. Although most patients (86%) received beta blockers prior to surgery, there was no significant difference in operative mortality, permanent stroke, prolonged ventilation, or renal failure between patients who received beta blockers and those who did not; atrial fibrillation (Afib), however, was more common with pre-operative beta blocker use.
Bottom line: For patients undergoing nonemergent CABG surgery without recent MI, pre-operative beta blocker use is not associated with improved outcomes and is associated with slightly higher rates of Afib.
Citation: Brinkman W, Herbert MA, O’Brien S, et al. Preoperative beta-blocker use in coronary artery bypass grafting surgery: national database analysis. JAMA Intern Med. 2014;174(8):1320-1327.
Delirium Severity Scoring System CAM-S Correlates with Length of Stay, Mortality
Clinical question: Does the CAM-S, a modified version of the Confusion Assessment Method (CAM), which measures delirium severity, correlate with clinical outcomes?
Background: In 1990, Dr. Sharon Inouye developed the CAM, which is a common, standard measure to identify the presence of delirium. Although other scoring systems exist to quantify delirium severity, Dr. Inouye proposes an extension of the CAM (CAM-S) to measure delirium severity.
Study design: Validation analysis.
Setting: Three academic medical centers in the U.S.
Synopsis: Two validation cohorts of patients 70 years or older without dementia who were at moderate to high risk of developing delirium during hospitalization were studied. The first cohort comprised 300 patients scheduled for elective, noncardiac surgery; the second cohort was made up of 250 patients admitted to an inpatient medical service. The CAM-S uses the same items as the original CAM and rates each symptom 0 for absent, 1 for mild, or 2 for marked; acute onset or fluctuation receives 0 (absent) or 1 (present). Higher CAM-S scores appear to correlate with various outcome measures, including increased length of stay, new nursing home placement, and 90-day mortality.
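Under that rubric, scoring is a simple sum over the CAM items. A minimal sketch of the short-form calculation; the item names follow the original CAM (an assumption), and the paper's exact item list and handling of missing ratings are not reproduced here:

```python
# Severity items rated 0 (absent), 1 (mild), or 2 (marked), per the synopsis.
SEVERITY_ITEMS = ("inattention", "disorganized_thinking",
                  "altered_level_of_consciousness")

def cam_s_score(ratings: dict) -> int:
    """Sum the item ratings; acute onset or fluctuation counts 0 or 1 only."""
    score = min(ratings.get("acute_onset_or_fluctuation", 0), 1)
    for item in SEVERITY_ITEMS:
        score += min(ratings.get(item, 0), 2)
    return score  # short-form range: 0-7

print(cam_s_score({"acute_onset_or_fluctuation": 1,
                   "inattention": 2,
                   "disorganized_thinking": 1}))  # 4
```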
Bottom line: Higher scores on the CAM-S, a scoring system based on the CAM and designed to measure delirium severity, are associated with worse in-hospital and post-discharge outcomes.
Citation: Inouye SK, Kosar CM, Tommet D, et al. The CAM-S: development and validation of a new scoring system for delirium severity in 2 cohorts. Ann Intern Med. 2014;160(8):526-533.
Thrombolytics in Pulmonary Embolism Associated with Lower Mortality, Increased Bleeding
Clinical question: What are the mortality benefits and bleeding risks associated with thrombolytic therapy, compared with other anticoagulants, in pulmonary embolism (PE)?
Background: Thrombolytics are not routinely administered for PE but can be considered in patients with hemodynamic instability with massive PE and those not responding to anticoagulation.
Study design: Meta-analysis.
Setting: Sixteen randomized clinical trials (RCTs) occurring in a variety of settings.
Synopsis: Trials involving 2,115 patients with PE (1,061 in the thrombolytic therapy cohort; 1,054 in the anticoagulation cohort) were studied, with special attention given to patients with intermediate-risk PE, defined by subclinical cardiovascular compromise. Thrombolytics were compared with low-molecular-weight heparin, unfractionated heparin, vitamin K antagonists, and fondaparinux. The primary outcomes were all-cause mortality and major bleeding. Secondary outcomes included risk of recurrence of the PE and intracranial hemorrhage.
Thrombolytic therapy was associated with lower all-cause mortality and a higher risk of bleeding. There was a 9.24% rate of major bleeding in the thrombolytic therapy cohort and a 3.42% rate in the anticoagulation cohort. Intracranial hemorrhage was more frequent in the thrombolytic therapy cohort (1.46% vs. 0.19%). Patients with intermediate-risk PE had a greater major bleeding rate (7.74% vs. 2.25%) and lower mortality (1.39% vs. 2.92%) with thrombolytics compared with anticoagulation. A net clinical benefit calculation (mortality benefit weighed against intracranial hemorrhage risk) demonstrated a net clinical benefit of 0.81% (95% CI, 0.65%-1.01%) for patients who received thrombolytics versus other anticoagulation.
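One simplified reading of such a net-clinical-benefit calculation is the absolute mortality reduction minus the absolute increase in intracranial hemorrhage. Applied to the figures above it yields about 0.26%, not the reported 0.81%, so the authors' actual method clearly differs; the sketch below is only an illustration of the concept:

```python
def net_clinical_benefit(mort_tx, mort_ctl, ich_tx, ich_ctl):
    """Absolute mortality reduction minus absolute ICH increase (all in %).
    A simplified formula, NOT the authors' published method."""
    return (mort_ctl - mort_tx) - (ich_tx - ich_ctl)

# Intermediate-risk mortality (1.39% vs. 2.92%) and overall intracranial
# hemorrhage rates (1.46% vs. 0.19%) from the synopsis.
print(f"{net_clinical_benefit(1.39, 2.92, 1.46, 0.19):.2f}%")  # 0.26%
```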
Bottom line: This study suggests an overall mortality benefit of thrombolytics, including in patients with intermediate-risk PE, at the cost of increased major bleeding and intracranial hemorrhage.
Citation: Chatterjee S, Chakraborty A, Weinberg I, et al. Thrombolysis for pulmonary embolism and risk of all-cause mortality, major bleeding, and intracranial hemorrhage: a meta-analysis. JAMA. 2014;311(23):2414-2421.