Low-Fat Vegan Diet May Improve Cardiometabolic Health in T1D

TOPLINE:

A low-fat vegan diet — high in fiber and carbohydrates and moderate in protein — reduces insulin requirement, increases insulin sensitivity, and improves glycemic control in individuals with type 1 diabetes (T1D) compared with a conventional portion-controlled diet.

METHODOLOGY:

  • The effects of a low-fat vegan diet (without carbohydrate or portion restriction) were compared with those of a conventional portion-controlled, carbohydrate-controlled diet in 58 patients with T1D (age ≥ 18 years) who had been receiving stable insulin treatment for the previous 3 months.
  • Participants were randomly assigned to receive either the vegan diet (n = 29), comprising vegetables, grains, legumes, and fruits, or the portion-controlled diet (n = 29), which reduced daily energy intake by 500-1000 kcal/d in participants with overweight while maintaining a stable carbohydrate intake.
  • The primary clinical outcomes were insulin requirement (total daily dose of insulin), insulin sensitivity, and glycemic control (A1c).
  • Other assessments included the blood lipid profile, blood urea nitrogen, the blood urea nitrogen-to-creatinine ratio, and body weight.

TAKEAWAY:

  • The study was completed by 18 participants in the vegan-diet group and 17 in the portion-controlled group.
  • In the vegan group, the total daily dose of insulin decreased by 12.1 units/d (P = .007) and insulin sensitivity increased by 6.6 g of carbohydrate per unit of insulin on average (P = .002), with no significant changes in the portion-controlled diet group.
  • Participants on the vegan diet had lower levels of total and low-density lipoprotein cholesterol and blood urea nitrogen and a lower blood urea nitrogen-to-creatinine ratio (P for all < .001), whereas both vegan and portion-controlled groups had lower A1c levels.
  • Body weight decreased by 5.2 kg (P < .001) in the vegan group; there were no significant changes in the portion-controlled group.
  • For every 1-kg weight loss, there was a 2.16-unit decrease in the total daily insulin dose and a 0.9-unit increase in insulin sensitivity (see the arithmetic check below).
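
As a rough consistency check (our arithmetic, not a figure reported by the authors): multiplying the mean weight loss by the per-kilogram estimate gives 5.2 kg × 2.16 units/kg ≈ 11.2 units/d, close to the 12.1-units/d reduction observed in the vegan group, suggesting that most of the insulin-dose reduction tracked with weight loss.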

IN PRACTICE:

“This study provides substantial support for a low-fat vegan diet that is high in fiber and carbohydrates, low in fat, and moderate in protein” and suggests the potential therapeutic use of this diet in type 1 diabetes management, the authors wrote.

SOURCE:

The study, led by Hana Kahleova, MD, PhD, Physicians Committee for Responsible Medicine, Washington, was published in Clinical Diabetes.

LIMITATIONS:

Dietary intake was recorded on the basis of self-reported data. A higher attrition rate was observed, attributed to the demands of meal and blood glucose monitoring. The findings may have limited generalizability because the participants were individuals seeking help for T1D.

DISCLOSURES:

The study was supported by the Physicians Committee for Responsible Medicine and a grant from the Institute for Technology in Healthcare. Some authors reported receiving compensation, being cofounders of a coaching program, writing books, providing nutrition coaching, giving lectures, or receiving royalties and honoraria from various sources.

A version of this article appeared on Medscape.com.

Mandibular Device Comparable to CPAP to Reduce BP in Hypertension, OSA

Use of a mandibular advancement device (MAD) proved non-inferior to guideline-recommended continuous positive airway pressure (CPAP) for reducing blood pressure in patients with hypertension and obstructive sleep apnea (OSA) in a randomized trial.

The investigator-initiated CRESCENT trial showed that at 6 months, the MAD group had a reduction of 2.5 mm Hg in 24-hour mean arterial blood pressure vs no change in the CPAP group, for a nonsignificant between-group difference of 1.6 mm Hg. 

“These findings suggest that MAD could be considered an alternative to CPAP for optimizing blood pressure control in OSA patients with hypertension and high cardiovascular risk,” the researchers conclude. 

“Looking at the totality of evidence available in the literature, it is still reasonable to say that CPAP is the first-line treatment until we have more data on the MAD,” said Ronald Lee Chi-Hang, MD, professor of medicine at Yong Loo Lin School of Medicine, National University of Singapore, who presented the results.

“However, for patients who truly cannot tolerate or accept using a CPAP, we should be more open-minded in looking for an alternative therapy such as a MAD, which based on our study, numerically had a better blood pressure reduction in patients compared with a CPAP,” said Dr. Chi-Hang, who is also a senior consultant in the Department of Cardiology at Singapore’s National University Heart Centre. 

The results were presented April 6 at the American College of Cardiology Scientific Sessions 2024 and published simultaneously online in the Journal of the American College of Cardiology.

Oral Appliance

OSA is increasingly recognized as “an underdiagnosed and modifiable cause of hypertension,” the researchers note in their report. “Patients with OSA develop recurrent collapse of the upper airway during sleep, resulting in hypoxemia, sympathetic hyperactivity, and BP surges.” 

Current guidelines recommend screening and treatment of OSA in patients with hypertension, and CPAP is considered first-line therapy, they note. 

“Despite being effective, unfortunately, many patients decline to use a CPAP or find it challenging to stick to the therapy,” Dr. Chi-Hang said, particularly those without daytime sleepiness. 

MADs are oral appliances that work by advancing the mandible about 5 to 10 mm during sleep, he said. They provide an alternative for patients with OSA and have been shown to improve daytime sleepiness and quality of life, “and in general, [the MAD] is better accepted and tolerated than CPAP.” 

However, earlier studies were small, had short follow-up, included patients with and without hypertension, and did not specify BP reduction as the primary outcome. 

The CRESCENT trial was an investigator-initiated, randomized, non-inferiority trial that aimed to compare the relative effectiveness of MAD vs CPAP in reducing 24-hour ambulatory blood pressure in patients with moderate-to-severe OSA, hypertension, and high cardiovascular risk. The prespecified margin for non-inferiority was 1.5 mm Hg. 

A total of 321 participants were recruited at three public hospitals for polysomnography. All were older than age 40 years, had hypertension, and were at increased cardiovascular risk. Of these, 220 with moderate-to-severe OSA, defined as an apnea–hypopnea index (AHI) of ≥ 15 events/hour, were randomly assigned to either MAD or CPAP treatment. 

The primary outcome was the difference between the 24-hour mean arterial BP at baseline and 6 months. The median age was 61 years, most patients (85.5%) were male, and all were Chinese. All had essential hypertension and were on one or more antihypertensive medications. Hypertension was relatively well controlled at baseline.

At 6 months, 24-hour mean arterial BP decreased by 2.5 mm Hg in the MAD group (P = .003) compared with no change from baseline in the CPAP group (P = .374). 

The between-group difference was -1.6 mm Hg (95% CI, -3.51 to 0.24, non-inferiority P < .001). 
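
To unpack the non-inferiority claim (our reading of the reported statistics, not additional trial data): MAD is declared non-inferior as long as the upper bound of the confidence interval for the between-group difference stays below the prespecified 1.5-mm Hg margin. Here that upper bound, 0.24 mm Hg, sits comfortably below the margin, which is why the non-inferiority P value is < .001 even though the between-group difference itself is nonsignificant.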

There was a larger between-group reduction in all secondary ambulatory BP parameters in the MAD versus the CPAP group, with the most pronounced effects seen in the asleep BP parameters. 

Both the MAD and CPAP significantly improved daytime sleepiness, with no between-group differences (P = .384). There were no between-group differences in cardiovascular biomarkers. 

During the presentation, panel discussant Julie B. Damp, MD, associate professor of medicine at Vanderbilt Health in Nashville, Tennessee, called CRESCENT “a really interesting study, and I think it has a lot of information to add [regarding] what we know about this comparison in the literature, because this is a big study and it also followed these patients for longer than we’ve seen in some of the previous studies.”

Dr. Damp asked, however, about how these results might be extrapolated to other populations, since the vast majority of participants were male. 

Dr. Chi-Hang pointed out that most OSA studies include mostly male patients, but noted that particularly in Asian culture, female patients may be more conservative in seeking treatment for problems with snoring, poor quality of sleep, or extensive daytime sleepiness. “Therefore, lots of times, even in clinical practice, we see that over 80 or 90% of patients are male patients,” he said. 

Dr. Damp followed up by asking about the differential effectiveness of CPAP vs MAD. “Just in thinking about these two therapies, there is some evidence that the mandibular devices are potentially less effective on some of the sleep apnea-specific measures, so how much of this do you think is an issue of a better vs a not better treatment as opposed to an issue truly of compliance and what patients are able to tolerate?”

Dr. Chi-Hang agreed that in terms of reducing the AHI, CPAP is more effective than MAD. “In fact, in our data, the residual AHI was 10 for the MAD group and 2 for the CPAP group. Clearly, CPAP is more effective,” he said. “But the problem we are facing in this area is the value of AHI as an index is being questioned.” 

AHI considers only the number of events, without taking into account the duration or the depth of the apnea, he said. “AHI is simply not an ideal index to document the disease severity,” or the impact on cardiovascular outcomes. 

A Tailored Approach

In an editorial accompanying the JACC publication, Michele Emdin, MD, PhD, Francesco Gentile, MD, and Alberto Giannoni, MD, PhD, all from the Health Science Interdisciplinary Center, Scuola Superiore Sant’ Anna, and Fondazione Toscana Gabriele Monasterio, in Pisa, Italy, commend the researchers for designing and conducting “such a pragmatic and informative trial, which confirms and extends previous findings.” 

They also discuss the compliance vs effectiveness issue, pointing out that although CPAP appeared to be more effective in reducing apnea burden, there was higher adherence to MAD — with 57% using the device 6 or more hours per night, vs 23% for CPAP — which might have offset the greater reduction in apnea burden and resulted in the reduction in blood pressure seen in the trial. 

“Addressing poor adherence to OSA treatments seems therefore necessary, particularly in the case of less symptomatic patients, who often have a lower perception of the related risks,” they write. 

“Currently, a tailored approach seems reasonable, based on updated evidence, considering: a) the differential effects of CPAP or MAD on OSA, blood pressure; b) the treatment feasibility; c) the individual baseline demographic and clinical characteristics, including the presence of resistant hypertension; and d) compliance with the therapeutic tool and patient’s preferences,” the editorialists conclude. 

The study was funded by the Singapore Ministry of Health. The authors and editorialists report no relevant financial relationships.

A version of this article appeared on Medscape.com.

Salt Substitutes May Cut All-Cause And Cardiovascular Mortality

Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.

The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.

Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval, –3 to –7) and 3 in 1000 in CVD mortality (CI, –1 to –5).
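
Put in number-needed-to-treat terms (a back-of-envelope calculation from the reported absolute reductions, not a figure given by the authors): a reduction of 5 in 1000 corresponds to roughly 1000/5 = 200 people using salt substitutes to avert one death from any cause, and 3 in 1000 to about 333 to avert one CVD death, over the follow-up periods studied.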

Led by Hannah Greenwood, BPsychSc, a cardiovascular researcher at the Institute for Evidence-Based Healthcare at Bond University in Gold Coast, Queensland, the investigators also found very low certainty evidence of an absolute reduction of 8 in 1000 in major adverse cardiovascular events (CI, 0 to –15), with a 1 in 1000 decrease in more serious adverse events (CI, 4 to –2) in the same population.

Seven of the 16 studies were conducted in China and Taiwan, and seven were conducted in populations of older age (mean age, 62 years) and/or at higher cardiovascular risk.

With most of the data deriving from populations of older age at higher-than-average CV risk and/or eating an Asian diet, the findings’ generalizability to populations following a Western diet and/or at average CVD risk is limited, the researchers acknowledged.

“We are less certain about the effects in Western, younger, and healthy population groups,” corresponding author Loai Albarqouni, MD, MSc, PhD, assistant professor at the Institute for Evidence-Based Healthcare, said in an interview. “While we saw small, clinically meaningful reductions in cardiovascular deaths and events, effectiveness should be better established before salt substitutes are recommended more broadly, though they are promising.”

In addition, he said, since the longest follow-up of substitute use was 10 years, “we can’t speak to benefits or harms beyond this time frame.”

Still, recommending salt substitutes may be an effective way for physicians to help patients reduce CVD risk, especially those hesitant to start medication, he said. “But physicians should take into account individual circumstances and other factors like kidney disease before recommending salt substitutes. Other non-drug methods of reducing cardiovascular risk, such as diet or exercise, may also be considered.”

Dr. Albarqouni stressed that sodium intake is not the only driver of CVD and reducing intake is just one piece of the puzzle. He cautioned that substitutes themselves can contain high levels of sodium, “so if people are using them in large volumes, they may still present similar risks to the sodium in regular salt.”

While the substitutes appear safe, as evidenced by a low incidence of hyperkalemia or renal dysfunction, the evidence is scarce, heterogeneous, and weak, the authors stressed.

“They can pose a health risk among people who have kidney disease, diabetes, and heart failure or who take certain medications, including ACE inhibitors and potassium-sparing diuretics,” said Emma Laing, PhD, RDN, director of dietetics at the University of Georgia in Athens. And while their salty flavor makes these a reasonable alternative to sodium chloride, “the downsides include a higher cost and bitter or metallic taste in high amounts. These salt substitutes tend to be better accepted by patients if they contain less than 30% potassium chloride.”

She noted that flavorful salt-free spices, herbs, lemon and lime juices, and vinegars can be effective in lowering dietary sodium when used in lieu of cooking salt.

In similar findings, a recent Chinese study of elderly normotensive people in residential care facilities observed a decrease in the incidence of hypertension with salt substitution.

Approximately one-third of otherwise healthy individuals are salt-sensitive, rising to more than 50% of those with hypertension, and excessive salt intake is estimated to be responsible for nearly 5 million deaths per year globally.

How much impact could household food preparation with salt substitutes really have in North America where sodium consumption is largely driven by processed and takeout food? “While someone may make the switch to a salt substitute for home cooking, their sodium intake might still be very high if a lot of processed or takeaway foods are eaten,” Dr. Albarqouni said. “To see large population impacts, we will likely need policy and institutional-level change as to how sodium is used in food processing, alongside individuals’ switching from regular salt to salt substitutes.”

In agreement, an accompanying editorial by researchers from the universities of Sydney, New South Wales, and California, San Diego, noted the failure of governments and industry to address the World Health Organization’s call for a 30% reduction in global sodium consumption by 2025. With hypertension a major global health burden, the editorialists, led by J. Jaime Miranda, MD, MSc, PhD, of the Sydney School of Public Health at the University of Sydney, believe salt substitutes could offer food production companies an accessible path toward that goal.

“Although the benefits of reducing salt intake have been known for decades, little progress has been made in the quest to lower salt intake on the industry and commercial fronts with existing regulatory tools,” they wrote. “Consequently, we must turn our attention to effective evidence-based alternatives, such as the use of potassium-enriched salts.”

Given the high rates of nonadherence to antihypertensive medication, nonpharmacologic measures to improve blood pressure control are required, they added. “Expanding the routine use of potassium-enriched salts across households and the food industry would benefit not only persons with existing hypertension but all members of the household and communities. An entire shift of the population’s blood pressure curve is possible.”

The study authors called for research to determine the cost-effectiveness of salt substitution in older Asian populations and its efficacy in groups at average cardiovascular risk or following a Western diet.

This research was supported by the National Health and Medical Research Council of Australia and an Australian Government Research Training Program Scholarship. Coauthor Dr. Lauren Ball disclosed support from the National Health and Medical Research Council of Australia. Ms. Hannah Greenwood received support from the Australian government and Bond University. Dr. Miranda disclosed numerous consulting, advisory, and research-funding relationships with government, academic, philanthropic, and nonprofit organizations. Editorial commentator Dr. Kathy Trieu reported research support from multiple government and nonprofit research-funding organizations. Dr. Cheryl Anderson disclosed ties to Weight Watchers and the McCormick Science Institute, as well as support from numerous government, academic, and nonprofit research-funding agencies.

Publications
Topics
Sections

Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.

The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.

Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval, –3 to –7) and 3 in 1000 in CVD mortality (CI, –1 to –5).

Led by Hannah Greenwood, BPsychSc, a cardiovascular researcher at the Institute for Evidence-Based Healthcare at Bond University in Gold Coast, Queensland, the investigators also found very low certainty evidence of an absolute reduction of 8 in 1000 in major adverse cardiovascular events (CI, 0 to –15), with a 1 in 1000 decrease in more serious adverse events (CI, 4 to –2) in the same population.

Seven of the 16 studies were conducted in China and Taiwan and seven were conducted in populations of older age (mean age 62 years) and/or at higher cardiovascular risk.

With most of the data deriving from populations of older age at higher-than-average CV risk and/or eating an Asian diet, the findings’ generalizability to populations following a Western diet and/or at average CVD risk is limited, the researchers acknowledged.

“We are less certain about the effects in Western, younger, and healthy population groups,” corresponding author Loai Albarqouni, MD, MSc, PhD, assistant professor at the Institute for Evidence-Based Healthcare, said in an interview. “While we saw small, clinically meaningful reductions in cardiovascular deaths and events, effectiveness should be better established before salt substitutes are recommended more broadly, though they are promising.”

In addition, he said, since the longest follow-up of substitute use was 10 years, “we can’t speak to benefits or harms beyond this time frame.”

Dr. Albarqouni an assistant professor at the Institute for Evidence-Based Healthcare, Bond University.
Bond University
Dr Loai Albarqouni


Still, recommending salt substitutes may be an effective way for physicians to help patients reduce CVD risk, especially those hesitant to start medication, he said. “But physicians should take into account individual circumstances and other factors like kidney disease before recommending salt substitutes. Other non-drug methods of reducing cardiovascular risk, such as diet or exercise, may also be considered.”

Dr. Albarqouni stressed that sodium intake is not the only driver of CVD and reducing intake is just one piece of the puzzle. He cautioned that substitutes themselves can contain high levels of sodium, “so if people are using them in large volumes, they may still present similar risks to the sodium in regular salt.”

While the substitutes appear safe as evidenced by low incidence of hyperkalemia or renal dysfunction, the evidence is scarce, heterogeneous, and weak, the authors stressed.

“They can pose a health risk among people who have kidney disease, diabetes, and heart failure or who take certain medications, including ACE inhibitors and potassium-sparing diuretics,” said Emma Laing, PhD, RDN, director of dietetics at the University of Georgia in Athens. And while their salty flavor makes these a reasonable alternate to sodium chloride, “the downsides include a higher cost and bitter or metallic taste in high amounts. These salt substitutes tend to be better accepted by patients if they contain less than 30% potassium chloride.”

Dr. Laing is director of dietetics at the University of Georgia in Athens
University of Georgia
Dr. Emma Laing


She noted that flavorful salt-free spices, herbs, lemon and lime juices, and vinegars can be effective in lowering dietary sodium when used in lieu of cooking salt.

In similar findings, a recent Chinese study of elderly normotensive people in residential care facilities observed a decrease in the incidence of hypertension with salt substitution.

Approximately one-third of otherwise health individuals are salt-sensitive, rising to more than 50% those with hypertension, and excessive salt intake is estimated to be responsible for nearly 5 million deaths per year globally.

How much impact could household food preparation with salt substitutes really have in North America where sodium consumption is largely driven by processed and takeout food? “While someone may make the switch to a salt substitute for home cooking, their sodium intake might still be very high if a lot of processed or takeaway foods are eaten,” Dr. Albarqouni said. “To see large population impacts, we will likely need policy and institutional-level change as to how sodium is used in food processing, alongside individuals’ switching from regular salt to salt substitutes.”

In agreement, an accompanying editorial  by researchers from the universities of Sydney, New South Wales, and California, San Diego, noted the failure of governments and industry to address the World Health Organization’s call for a 30% reduction in global sodium consumption by 2025. With hypertension a major global health burden, the editorialists, led by J. Jaime Miranda, MD, MSc, PhD, of the Sydney School of Public Health at the University of Sydney, believe salt substitutes could be an accessible path toward that goal for food production companies.

Dr. J. Jaime Miranda is of the Sydney School of Public Health at the University of Sydney, Australia,
University of Sydney
Dr. J. Jaime Miranda


“Although the benefits of reducing salt intake have been known for decades, little progress has been made in the quest to lower salt intake on the industry and commercial fronts with existing regulatory tools,” they wrote. “Consequently, we must turn our attention to effective evidence-based alternatives, such as the use of potassium-enriched salts.”

Given the high rates of nonadherence to antihypertensive medication, nonpharmacologic measures to improve blood pressure control are required, they added. “Expanding the routine use of potassium-enriched salts across households and the food industry would benefit not only persons with existing hypertension but all members of the household and communities. An entire shift of the population’s blood pressure curve is possible.”

The study authors called for research to determine the cost-effectiveness of salt substitution in older Asian populations and its efficacy in groups at average cardiovascular risk or following a Western diet.

This research was supported by the National Health and Medical Research Council of Australia and an Australian Government Research Training Program Scholarship. Coauthor Dr. Lauren Ball disclosed support from the National Health and Medical Research Council of Australia. Ms. Hannah Greenwood received support from the Australian government and Bond University. Dr. Miranda disclosed numerous consulting, advisory, and research-funding relationships with government, academic, philanthropic, and nonprofit organizations. Editorial commentator Dr. Kathy Trieu reported research support from multiple government and non-profit research-funding organizations. Dr. Cheryl Anderson disclosed ties to Weight Watchers and the McCormick Science Institute, as well support from numerous government, academic, and nonprofit research-funding agencies.

Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.

The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.

Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval, –3 to –7) and 3 in 1000 in CVD mortality (CI, –1 to –5).

Led by Hannah Greenwood, BPsychSc, a cardiovascular researcher at the Institute for Evidence-Based Healthcare at Bond University in Gold Coast, Queensland, the investigators also found very low certainty evidence of an absolute reduction of 8 in 1000 in major adverse cardiovascular events (CI, 0 to –15), with a 1 in 1000 decrease in more serious adverse events (CI, 4 to –2) in the same population.

Seven of the 16 studies were conducted in China and Taiwan and seven were conducted in populations of older age (mean age 62 years) and/or at higher cardiovascular risk.

With most of the data deriving from populations of older age at higher-than-average CV risk and/or eating an Asian diet, the findings’ generalizability to populations following a Western diet and/or at average CVD risk is limited, the researchers acknowledged.

“We are less certain about the effects in Western, younger, and healthy population groups,” corresponding author Loai Albarqouni, MD, MSc, PhD, assistant professor at the Institute for Evidence-Based Healthcare, said in an interview. “While we saw small, clinically meaningful reductions in cardiovascular deaths and events, effectiveness should be better established before salt substitutes are recommended more broadly, though they are promising.”

In addition, he said, since the longest follow-up of substitute use was 10 years, “we can’t speak to benefits or harms beyond this time frame.”

Dr. Albarqouni an assistant professor at the Institute for Evidence-Based Healthcare, Bond University.
Bond University
Dr Loai Albarqouni


Still, recommending salt substitutes may be an effective way for physicians to help patients reduce CVD risk, especially those hesitant to start medication, he said. “But physicians should take into account individual circumstances and other factors like kidney disease before recommending salt substitutes. Other non-drug methods of reducing cardiovascular risk, such as diet or exercise, may also be considered.”

Dr. Albarqouni stressed that sodium intake is not the only driver of CVD and reducing intake is just one piece of the puzzle. He cautioned that substitutes themselves can contain high levels of sodium, “so if people are using them in large volumes, they may still present similar risks to the sodium in regular salt.”

While the substitutes appear safe as evidenced by low incidence of hyperkalemia or renal dysfunction, the evidence is scarce, heterogeneous, and weak, the authors stressed.

“They can pose a health risk among people who have kidney disease, diabetes, and heart failure or who take certain medications, including ACE inhibitors and potassium-sparing diuretics,” said Emma Laing, PhD, RDN, director of dietetics at the University of Georgia in Athens. And while their salty flavor makes these a reasonable alternate to sodium chloride, “the downsides include a higher cost and bitter or metallic taste in high amounts. These salt substitutes tend to be better accepted by patients if they contain less than 30% potassium chloride.”

Dr. Laing is director of dietetics at the University of Georgia in Athens
University of Georgia
Dr. Emma Laing


She noted that flavorful salt-free spices, herbs, lemon and lime juices, and vinegars can be effective in lowering dietary sodium when used in lieu of cooking salt.

In similar findings, a recent Chinese study of elderly normotensive people in residential care facilities observed a decrease in the incidence of hypertension with salt substitution.

Approximately one-third of otherwise health individuals are salt-sensitive, rising to more than 50% those with hypertension, and excessive salt intake is estimated to be responsible for nearly 5 million deaths per year globally.

How much impact could household food preparation with salt substitutes really have in North America where sodium consumption is largely driven by processed and takeout food? “While someone may make the switch to a salt substitute for home cooking, their sodium intake might still be very high if a lot of processed or takeaway foods are eaten,” Dr. Albarqouni said. “To see large population impacts, we will likely need policy and institutional-level change as to how sodium is used in food processing, alongside individuals’ switching from regular salt to salt substitutes.”

In agreement, an accompanying editorial  by researchers from the universities of Sydney, New South Wales, and California, San Diego, noted the failure of governments and industry to address the World Health Organization’s call for a 30% reduction in global sodium consumption by 2025. With hypertension a major global health burden, the editorialists, led by J. Jaime Miranda, MD, MSc, PhD, of the Sydney School of Public Health at the University of Sydney, believe salt substitutes could be an accessible path toward that goal for food production companies.

“Although the benefits of reducing salt intake have been known for decades, little progress has been made in the quest to lower salt intake on the industry and commercial fronts with existing regulatory tools,” they wrote. “Consequently, we must turn our attention to effective evidence-based alternatives, such as the use of potassium-enriched salts.”

Given the high rates of nonadherence to antihypertensive medication, nonpharmacologic measures to improve blood pressure control are required, they added. “Expanding the routine use of potassium-enriched salts across households and the food industry would benefit not only persons with existing hypertension but all members of the household and communities. An entire shift of the population’s blood pressure curve is possible.”

The study authors called for research to determine the cost-effectiveness of salt substitution in older Asian populations and its efficacy in groups at average cardiovascular risk or following a Western diet.

This research was supported by the National Health and Medical Research Council of Australia and an Australian Government Research Training Program Scholarship. Coauthor Dr. Lauren Ball disclosed support from the National Health and Medical Research Council of Australia. Ms. Hannah Greenwood received support from the Australian government and Bond University. Dr. Miranda disclosed numerous consulting, advisory, and research-funding relationships with government, academic, philanthropic, and nonprofit organizations. Editorial commentator Dr. Kathy Trieu reported research support from multiple government and nonprofit research-funding organizations. Dr. Cheryl Anderson disclosed ties to Weight Watchers and the McCormick Science Institute, as well as support from numerous government, academic, and nonprofit research-funding agencies.

FROM ANNALS OF INTERNAL MEDICINE


Are E-Cigarettes Bad for the Heart?

E-cigarettes entered the market as consumer products without comprehensive toxicological testing, based on the assessment that they were 95% less harmful than traditional cigarettes. Consumer advertising further suggests that e-cigarettes are a good alternative to conventional combustible cigarettes and can serve as a gateway to quitting smoking.

However, when considering damage to the endothelium and toxicity, e-cigarettes have a negative impact similar to that of conventional cigarettes. Moreover, switching to e-cigarettes often leads to dual use, said Stefan Andreas, MD, director of the Lungenfachklinik in Immenhausen, Germany, at the Congress of the German Respiratory Society and Intensive Care Medicine.
 

Subclinical Atherosclerosis

Because e-cigarettes have emerged relatively recently, long-term studies on their cardiac consequences are not yet available. Dr. Andreas explained that the impact on endothelial function is relevant for risk assessment. Endothelial function is a biomarker for early, subclinical atherosclerosis. “If endothelial function is impaired, the risk for heart attack and stroke is significantly increased 5-10 years later,” said Dr. Andreas.

The results of a crossover study showed reduced vascular elasticity after consuming both tobacco cigarettes and e-cigarettes. The study included 20 smokers, and endothelial function was measured using flow-mediated vasodilation.

Significant effects on the vessels were also found in a study of 31 participants who had never smoked. The study participants inhaled a nicotine-free aerosol from e-cigarettes. Before and after, parameters of endothelial function were examined using a 3.0-T MRI. After aerosol inhalation, the resistance index was 2.3% higher (P < .05), and flow-mediated vascular dilation was reduced by 34% (P < .001).

A recent review involving 372 participants from China showed that e-cigarettes lead to an increase in pulse wave velocity, with a difference of 3.08 (P < .001). “Pulse wave velocity is also a marker of endothelial function: The stiffer the vessels, the higher the pulse wave velocity,” said Dr. Andreas. The authors of the review concluded that “e-cigarettes should not be promoted as a healthier alternative to tobacco smoking.”
 

No Harmless Alternative

A recent review compared the effects of tobacco smoking and e-cigarettes. The results showed that vaping e-cigarettes causes oxidative stress, inflammation, endothelial dysfunction, and related cardiovascular consequences. The authors attributed the findings to overlapping toxic compounds in vapor and tobacco smoke and similar pathomechanical features of vaping and smoking. Although the toxic mixture in smoke is more complex, both e-cigarettes and tobacco cigarettes “impaired endothelial function to a similar extent,” they wrote. The authors attributed this finding to oxidative stress as the central mechanism.

“There is increasing evidence that e-cigarettes are not a harmless alternative to tobacco cigarettes,” wrote Thomas Münzel, MD, professor of cardiology at the University of Mainz and his team in their 2020 review, which examined studies in humans and animals. They provided an overview of the effects of tobacco/hookah smoking and e-cigarette vaping on endothelial function. They also pointed to emerging adverse effects on the proteome, transcriptome, epigenome, microbiome, and circadian clock.

Finally, a toxicological review of e-cigarettes also found alarmingly high levels of carcinogens and toxins that could have long-term effects on other organs, including the development of neurological symptoms, lung cancer, cardiovascular diseases, and cavities.

Dr. Andreas observed that even small amounts, such as those obtained through secondhand smoking, can be harmful. In 2007, Dr. Andreas and his colleagues showed that even low exposure to tobacco smoke can lead to a significant increase in cardiovascular events.

Conflicts of Interest 

Dr. Andreas recommended closely examining the studies that suggest that e-cigarettes are less risky. “It is noticeable that there is a significant difference depending on whether publications were supported by the tobacco industry or not,” he emphasized.

Danish scientists found that a conflict of interest (COI) has a strong influence on study results. “In studies without a COI, e-cigarettes are found to cause damage 95% of the time. In contrast, when there is a strong conflict of interest, the result is often ‘no harm,’” said Dr. Andreas.

This effect is quite relevant for the discussion of e-cigarettes. “If scientists make a critical statement in a position paper, there will always be someone who says, ‘No, it’s different, there are these and those publications.’ The true nature of interest-driven publications on e-cigarettes is not always easy to discern,” said Dr. Andreas.
 

No Gateway to Quitting 

E-cigarettes are used in clinical studies for tobacco cessation. In one randomized study, significantly more smokers who were switched to e-cigarettes stopped smoking conventional cigarettes compared with controls, but there was no significant difference in complete smoking cessation between the groups. Moreover, 45% of smokers who switched to e-cigarettes became dual users, compared with 11% of controls.

“Translated, these results mean that for every person who quits smoking by using e-cigarettes, you gain five people who use both traditional cigarettes and e-cigarettes,” explained Dr. Andreas.

In their recent review, Münzel and colleagues pointed out that the assessment that e-cigarettes could help with quitting might be wrong. Rather, it seems that “e-cigarettes have the opposite effect.” They also note that the age of initiation for e-cigarettes is generally lower than for tobacco cigarettes: Consumption often starts at age 13 or 14 years. And the consumption of e-cigarettes among children and adolescents increased by 7% from 2016 to 2023.

A meta-analysis published at the end of February also shows that e-cigarettes are about as dangerous as tobacco cigarettes. They are more dangerous than not smoking, and dual use is more dangerous than tobacco cigarettes alone. “There is a need to reassess the assumption that e-cigarette use provides substantial harm reduction across all cigarette-caused diseases, particularly accounting for dual use,” wrote the authors.

“One must always consider that e-cigarettes have only been available for a relatively short time. We can only see the cumulative toxicity in 10, 20 years when we have patients who have smoked e-cigarettes only for 20 years,” said Dr. Andreas. Ultimately, however, e-cigarettes promote dual use and, consequently, additive toxicity.
 

Nicotine Replacement Therapies 

Quitting smoking reduces the risk of cardiovascular events and premature death by 40%, even among patients with cardiovascular disease, according to a Cochrane meta-analysis. Smoking cessation reduces the risk for cardiovascular death by 39%, the risk for major adverse cardiovascular events by 43%, the risk for heart attack by 36%, the risk for stroke by 30%, and overall mortality by 40%.

Quitting smoking is the most effective measure for risk reduction, as a meta-analysis of 20 studies in patients with coronary heart disease found. Smoking cessation was associated with a 36% risk reduction, compared with a 29% risk reduction for statin therapy, 23% for beta-blockers and ACE inhibitors, and 15% for aspirin.

Dr. Andreas emphasized that nicotine replacement therapies are well researched and safe even in cardiovascular disease, as shown by a US study that included patients who had sustained a heart attack. One group of participants was treated with nicotine patches for 10 weeks, while the other group received a placebo. After 14 weeks, 21% of the nicotine patch group achieved abstinence vs 9% of the placebo group (P = .001). Transdermal nicotine application does not lead to a significant increase in cardiovascular events in high-risk patients.

The German “Nonsmoker Heroes” app has proven to be an effective means of behavioral therapeutic coaching. A recent study of it included 17 study centers and 661 participants. About 21% of the subjects had chronic obstructive pulmonary disease, and 19% had asthma. Smoking onset occurred at age 16 years, on average. The subjects were highly dependent: More than 72% had at least moderate dependence, more than 58% had high to very high dependence, and the population had an average of 3.6 quit attempts. The odds ratio for self-reported abstinence was 2.2 after 6 months. “The app is not only effective but also can be prescribed on an extrabudgetary basis,” said Dr. Andreas.

This story was translated from the Medscape German edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


Endoscopic Sleeve Gastroplasty More Cost-Effective Long Term Than Semaglutide for Treating Obesity

TOPLINE:

Endoscopic sleeve gastroplasty (ESG) is more cost-effective than semaglutide over a 5-year period in patients with class II obesity, achieving and sustaining greater weight loss.

METHODOLOGY:

  • Researchers used a Markov cohort model to assess the cost-effectiveness of semaglutide vs ESG over 5 years in people with class II obesity (body mass index [BMI], 35-39.9), with the model costs based on the US healthcare system; a generic sketch of this modeling approach follows this list.
  • A 45-year-old patient with a BMI of 37 was included as the base case in this study.
  • The model simulated hypothetical patients with class II obesity who received ESG, semaglutide, or no treatment (reference group with zero treatment costs).
  • The model derived clinical data for the first year from two randomized clinical trials, STEP 1 (semaglutide) and MERIT (ESG); for the following years, data were derived from published studies and publicly available data sources.
  • Study outcomes were total costs, quality-adjusted life years (QALYs), and incremental cost-effectiveness ratio (ICER).
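For readers unfamiliar with the method named in the first bullet, the sketch below shows the mechanics of a generic Markov cohort model: each annual cycle, a cohort distributed across health states is advanced through a transition-probability matrix while per-state costs and QALY weights accumulate. Every state, probability, cost, and utility here is a made-up placeholder for illustration, not a value from the study.

```python
import numpy as np

# Generic Markov cohort model: a cohort moves among health states each cycle
# according to a transition matrix, accumulating costs and QALYs along the way.
# All states, probabilities, costs, and utilities are hypothetical placeholders.
states = ["responder", "non_responder", "dropout"]
P = np.array([            # annual transition probabilities (rows sum to 1)
    [0.85, 0.10, 0.05],   # responder -> responder / non_responder / dropout
    [0.15, 0.75, 0.10],   # non_responder -> ...
    [0.00, 0.00, 1.00],   # dropout is an absorbing state
])
cost = np.array([4000.0, 5000.0, 1000.0])   # hypothetical annual cost per state ($)
utility = np.array([0.85, 0.75, 0.70])      # hypothetical QALY weight per state

cohort = np.array([1.0, 0.0, 0.0])          # whole cohort starts as responders
total_cost = 0.0
total_qalys = 0.0
for year in range(5):                       # 5-year horizon; discounting omitted
    total_cost += cohort @ cost
    total_qalys += cohort @ utility
    cohort = cohort @ P                     # advance the cohort one cycle

print(f"5-year cost ${total_cost:,.0f}, QALYs {total_qalys:.2f}")
```

Running two such models, one parameterized for ESG and one for semaglutide, and differencing their totals is what yields the incremental cost and QALY figures summarized below.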

TAKEAWAY:

  • ESG led to better weight loss outcomes (BMI, 31.7 vs 33.0) and added 0.06 more QALYs relative to semaglutide in the modelled patients over the 5-year time horizon; about 20% of the patients receiving semaglutide dropped out owing to medication intolerance or other reasons.
  • The semaglutide treatment was $33,583 more expensive than the ESG treatment over the 5-year period.
  • ESG became more cost-effective than semaglutide at 2 years and remained so over a 5-year time horizon, with an ICER of -$595,532 per QALY for the base case; a worked version of this calculation follows this list.
  • The annual price of semaglutide would need to be reduced from $13,618 to $3591 to achieve nondominance compared with ESG.
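To make the negative ICER concrete, here is a minimal worked version of the calculation, assuming only the rounded figures reported above (the $33,583 cost difference and the 0.06-QALY gain). The published -$595,532 per QALY comes from the model’s unrounded outputs, so this approximation lands close to, but not exactly on, that number.

```python
# Minimal ICER sketch from the rounded figures above; the published value
# (-$595,532/QALY) was computed from unrounded model outputs, so this
# approximation is close but not exact.
delta_cost = -33_583   # ESG is $33,583 cheaper than semaglutide over 5 years
delta_qalys = 0.06     # ESG adds 0.06 QALYs over the same horizon

icer = delta_cost / delta_qalys
print(f"ICER ~ {icer:,.0f} dollars per QALY")  # ~ -559,717 dollars per QALY
```

A negative ICER paired with a QALY gain means ESG dominates semaglutide (it is both cheaper and more effective), which is why the final bullet above frames a semaglutide price cut to $3591 per year as the point at which it would no longer be dominated.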

IN PRACTICE:

“The strategic choice of cost saving yet effective treatment such as ESG compared with semaglutide for specific patient groups could help alleviate the potential budget strain expected from the use of semaglutide,” the authors wrote.

SOURCE:

Muhammad Haseeb, MD, MSc, Division of Gastroenterology, Hepatology and Endoscopy, Brigham and Women’s Hospital, Boston, led this study, which was published online on April 12, 2024, in JAMA Network Open.

LIMITATIONS:

The study did not look at benefits associated with improvements in comorbidities from either treatment strategy, and the model did not account for microlevel follow-up costs such as routine clinic visits. The authors acknowledged that semaglutide’s price may fall in the future as more anti-obesity drugs are approved.

DISCLOSURES:

This study was supported in part by the National Institutes of Health. Some authors declared receiving personal fees, royalty payments, and/or grants and having other ties with several sources.

A version of this article appeared on Medscape.com.


GLP-1 Receptor Agonists Don’t Raise Thyroid Cancer Risk

TOPLINE:

No significant association was found between the use of glucagon-like peptide 1 receptor agonists (GLP-1 RAs) and thyroid cancer over nearly 4 years.

METHODOLOGY:

  • A cohort study using data from nationwide registers in Denmark, Norway, and Sweden between 2007 and 2021 included 145,410 patients who initiated GLP-1 RAs and 291,667 propensity score-matched patients initiating dipeptidyl peptidase 4 (DPP4) inhibitors as active comparators; a generic sketch of propensity-score matching follows this list.
  • Additional analysis included 111,744 who initiated GLP-1 RAs and 148,179 patients initiating sodium-glucose cotransporter 2 (SGLT2) inhibitors.
  • Overall, mean follow-up time was 3.9 years, with 25% followed for more than 6 years.
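For readers unfamiliar with the matching named in the first bullet, the sketch below shows generic 1:2 nearest-neighbor propensity-score matching: fit a model of treatment assignment on covariates, then greedily pair each treated patient with the two not-yet-used comparators whose scores are closest. The data, covariates, and matching rule are simulated placeholders; the registry study’s actual matching specification is not described in this summary.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Generic 1:2 nearest-neighbor propensity-score matching on simulated data.
# Covariates, sample size, and the greedy matching rule are placeholders, not
# the registry study's actual specification.
rng = np.random.default_rng(0)
n = 6_000
X = rng.normal(size=(n, 3))                   # e.g., age, sex, comorbidity score
p_treat = 1 / (1 + np.exp(-(X[:, 0] - 1.5)))  # treatment depends on covariates
treated = rng.random(n) < p_treat             # ~20% simulated "GLP-1 RA" initiators

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]  # propensity scores

controls = np.flatnonzero(~treated)
available = np.ones(controls.size, dtype=bool)  # comparators not yet matched
pairs = []
for i in np.flatnonzero(treated):               # greedy: 2 nearest comparators each
    order = np.argsort(np.abs(ps[controls] - ps[i]))
    picked = [j for j in order if available[j]][:2]
    available[picked] = False
    pairs.append((i, controls[picked].tolist()))

n_matched = sum(len(c) for _, c in pairs)
print(f"matched {len(pairs)} treated patients to {n_matched} comparators")
```

Hazard ratios are then estimated within the matched cohorts, which is what makes DPP4 inhibitor initiators an “active comparator” rather than an untreated control group.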

TAKEAWAY:

  • The most common individual GLP-1 RAs were liraglutide (57.3%) and semaglutide (32.9%).
  • During follow-up, there were 76 incident thyroid cancer cases among GLP-1 RA users and 184 cases among DPP4 inhibitor users, giving incidence rates of 1.33 and 1.46 per 10,000 person-years, respectively, a nonsignificant difference (hazard ratio [HR], 0.93; 95% CI, 0.66-1.31); a worked rate calculation follows this list.
  • Papillary thyroid cancer was the most common thyroid cancer subtype, followed by follicular and medullary, with no significant increases in risk with GLP-1 RAs by cancer type, although the numbers were small.
  • In the SGLT2 inhibitor comparison, there was also no significantly increased thyroid cancer risk for GLP-1 RAs (HR, 1.16; 95% CI, 0.65-2.05).
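As a companion to the rates in the list above, here is a minimal sketch of how an incidence rate per 10,000 person-years is formed from an event count and total person-time. The summary does not report person-time directly, so the figure below is back-calculated from the 76 cases and the 1.33 rate (roughly consistent with 145,410 initiators followed for a mean of 3.9 years); treat it as an illustrative assumption, not a reported number.

```python
# Incidence-rate sketch: events divided by person-time, scaled to 10,000
# person-years. The person-time is back-derived from the reported 76 cases and
# the 1.33/10,000 rate -- an illustrative assumption, not a reported figure.
cases = 76
person_years = 571_000   # ~145,410 GLP-1 RA initiators x ~3.9-year mean follow-up

rate = cases / person_years * 10_000
print(f"{rate:.2f} cases per 10,000 person-years")  # ~1.33
```

The same arithmetic applied to the comparator group’s 184 cases and 1.46 rate implies roughly 1.26 million person-years for the larger matched DPP4 inhibitor cohort.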

IN PRACTICE:

“Given the upper limit of the confidence interval, the findings are incompatible with more than a 31% increased relative risk of thyroid cancer. In absolute terms, this translates to no more than 0.36 excess cases per 10,000 person-years, a figure that should be interpreted against the background incidence of 1.46 per 10,000 person-years among the comparator group in the study populations.”

SOURCE:

This study was conducted by Björn Pasternak, MD, PhD, of the Karolinska Institutet, Stockholm, and colleagues. It was published online on April 10, 2024, in The BMJ.

LIMITATIONS:

Relatively short follow-up for cancer risk. Risk by individual GLP-1 RA not analyzed. Small event numbers. Observational, with potential for residual confounding and time-release bias.

DISCLOSURES:

The study was supported by grants from the Swedish Cancer Society and the Swedish Research Council. Dr. Pasternak was supported by a consolidator investigator grant from Karolinska Institutet. Some of the coauthors had industry disclosures.

A version of this article appeared on Medscape.com.


Speedy Eating and Late-Night Meals May Take a Toll on Health

You are what you eat, as the adage goes. But a growing body of evidence indicates that it’s not just what and how much you eat that influence your health. How fast and when you eat also play a role.

Research now indicates that these two factors may affect the risk for gastrointestinal problems, obesity, and type 2 diabetes (T2D). Because meal timing and speed of consumption are modifiable, they present new opportunities to change patient behavior to help prevent and perhaps address these conditions.

Not So Fast

Most people are well acquainted with the short-term gastrointestinal effects of eating too quickly, which include indigestion, gas, bloating, and nausea. But regularly eating too fast can cause long-term consequences.

Obtaining a sense of fullness is key to staving off overeating and excess caloric intake. However, it takes approximately 20 minutes for the stomach to alert the brain to feelings of fullness. Eat too quickly and the fullness signaling might not set in until you’ve consumed more calories than intended. Research links this habit to excess body weight.

The practice also can lead to gastrointestinal diseases over the long term because overeating causes food to remain in the stomach longer, thus prolonging the time that the gastric mucosa is exposed to gastric acids.

A study of 10,893 adults in Korea reported that those with the fastest eating speed (< 5 min/meal) had a 1.7 times greater likelihood of endoscopic erosive gastritis than those with the slowest times (≥ 15 min/meal). Faster eating also was linked to increased risk for functional dyspepsia in a study involving 89 young-adult female military cadets in Korea with relatively controlled eating patterns.

On the extreme end of the spectrum, researchers who performed an assessment of a competitive speed eater speculated that the observed physiological accommodation required for the role (expanding the stomach to form a large flaccid sac) makes speed eaters vulnerable to morbid obesity, gastroparesis, intractable nausea and vomiting, and the need for gastrectomy.

The risk for metabolic changes and the eventual development of T2D also appears to be linked to how quickly food is consumed.

Two clinical studies conducted in Japan — a cohort study of 2050 male factory workers and a nationwide study with 197,825 participants — identified a significant association between faster eating and T2D and insulin resistance. A case-control study involving 234 patients with new onset T2D and 468 controls from Lithuania linked faster eating to a greater than twofold risk for T2D. And a Chinese cross-sectional study of 7972 adults indicated that faster eating significantly increased the risk for metabolic syndrome, elevated blood pressure, and central obesity in adults.

Various hypotheses have been proposed to explain why fast eating may upset metabolic processes, including a delayed sense of fullness contributing to spiking postprandial glucose levels, lack of time for mastication causing higher glucose concentrations, and the triggering of specific cytokines (eg, interleukin-1 beta and interleukin-6) that lead to insulin resistance. It is also possible that the association is the result of people who eat quickly having relatively higher body weights, which translates to a higher risk for T2D.

However, there’s an opportunity in the association of rapid meal consumption with gastrointestinal and metabolic diseases, as people can slow the speed at which they eat so they feel full before they overeat.

A 2019 study in which 21 participants were instructed to eat a 600-kcal meal at a “normal” or “slow” pace (6 minutes or 24 minutes) found that the latter group reported feeling fuller while consuming fewer calories.

This approach may not work for all patients, however. There’s evidence to suggest that tactics to slow down eating may not limit the energy intake of those who are already overweight or obese.

Patients with obesity may physiologically differ in their processing of food, according to Michael Camilleri, MD, consultant in the Division of Gastroenterology and Hepatology at Mayo Clinic in Rochester, Minnesota.

“We have demonstrated that about 20%-25% of people with obesity actually have rapid gastric emptying,” he told this news organization. “As a result, they don’t feel full after they eat a meal and that might impact the total volume of food that they eat before they really feel full.”

The Ideal Time to Eat

It’s not only the speed at which individuals eat that may influence outcomes but when they take their meals. Research indicates that eating earlier in the day to align meals with the body’s circadian rhythms in metabolism offers health benefits.

“The focus would be to eat a meal that syncs during those daytime hours,” Collin Popp, PhD, MS, RD, a research scientist at the NYU Grossman School of Medicine in New York, told this news organization. “I typically suggest patients have their largest meal in the morning, whether that’s a large or medium-sized breakfast, or a big lunch.”

A recent cross-sectional study of 2050 participants found that having the largest meal at lunch protected against obesity (odds ratio [OR], 0.71), whereas having it at dinner increased the risk for obesity (OR, 1.67) and led to a higher body mass index.

Consuming the majority of calories in meals earlier in the day may have metabolic health benefits, as well.

A 2015 randomized controlled trial involving 18 adults with obesity and T2D found that eating a high-energy breakfast and a low-energy dinner leads to reduced hyperglycemia throughout the day compared with eating a low-energy breakfast and a high-energy dinner.

Time-restricted eating (TRE), a form of intermittent fasting, also can improve metabolic health depending on the time of day.

A 2023 meta-analysis found that TRE was more effective at reducing fasting glucose levels in participants who were overweight or obese if done earlier rather than later in the day. Similarly, a 2022 study involving 82 healthy patients without diabetes or obesity found that early TRE was more effective than mid-day TRE at improving insulin sensitivity and that it improved fasting glucose and reduced total body mass and adiposity, while mid-day TRE did not.

A study that analyzed the effects of TRE in eight adult men with overweight and prediabetes found “better insulin resistance when the window of food consumption was earlier in the day,” noted endocrinologist Beverly Tchang, MD, an assistant professor of clinical medicine at Weill Cornell Medicine with a focus on obesity medicine.

Patients May Benefit From Behavioral Interventions

Patients potentially negatively affected by eating too quickly or at late hours may benefit from adopting behavioral interventions to address these tendencies. To determine if a patient is a candidate for such interventions, Dr. Popp recommends starting with a simple conversation.

“When I first meet patients, I always ask them to describe to me a typical day for how they eat — when they’re eating, what they’re eating, the food quality, who are they with — to see if there’s social aspects to it. Then try and make the recommendations based on that,” said Dr. Popp, whose work focuses on biobehavioral interventions for the treatment and prevention of obesity, T2D, and other cardiometabolic outcomes.

Dr. Tchang said she encourages her patients to be mindful of hunger and fullness cues.

“Eat if you’re hungry; don’t force yourself to eat if you’re not hungry,” she said. “If you’re not sure whether you’re hungry or not, speak to a doctor because this points to an abnormality in your appetite-regulation system, which can be helped with GLP-1 [glucagon-like peptide 1] receptor agonists.”

Adjusting what patients eat can help them improve their meal timing.

“For example, we know that a high-fiber diet or a diet that has a large amount of fat in it tends to empty from the stomach slower,” Dr. Camilleri said. “That might give a sensation of fullness that lasts longer and that might prevent, for instance, the ingestion of the next meal.”

Those trying to eat more slowly are advised to seek out foods that are hard in texture and minimally processed.

A study involving 50 patients with healthy weights found that hard foods are consumed more slowly than soft foods and that energy intake is lowest with hard, minimally processed foods. Combining hard-textured foods with explicit instructions to reduce eating speed has also been shown to be an effective strategy. For those inclined to seek out a technology-based solution, evidence suggests that a self-monitoring wearable device can slow the eating rate.

Although the evidence is mounting that the timing and duration of meals have an impact on certain chronic diseases, clinicians should remember that these two factors are far from the most important contributors, Dr. Popp said.

“We also have to consider total caloric intake, food quality, sleep, alcohol use, smoking, and physical activity,” he said. “Meal timing should be considered as under the umbrella of health that is important for a lot of folks.”

A version of this article appeared on Medscape.com.


Consider Skin Cancer, Infection Risks in Solid Organ Transplant Recipients

Article Type
Changed
Fri, 04/12/2024 - 12:52

SAN DIEGO — The number of solid organ transplant survivors is on the rise, a dermatologist told colleagues, and they face unique challenges from higher risks for skin cancer and skin infections because of their suppressed immune systems.

“There are over 450,000 people with a solid organ transplant living in the United States. If you do the math, that works out to about 40 organ transplant recipients for every dermatologist, so there’s a lot of them out there for us to take care of,” Sean Christensen, MD, PhD, associate professor of dermatology, Yale University, New Haven, Connecticut, said at the annual meeting of the American Academy of Dermatology (AAD). “If we expand that umbrella to include all types of immunosuppression, that’s over 4 million adults in the US.”

Dr. Christensen encouraged dermatologists to be aware of the varying risks for immunosuppressive drugs and best screening practices for these patients, and to take advantage of a validated skin cancer risk assessment tool for transplant patients.

During his presentation, he highlighted five classes of immunosuppressive drugs and their associated skin cancer risks:

  • Calcineurin inhibitors (tacrolimus or cyclosporine), which cause severe immune suppression and pose a severe skin cancer risk. They may also cause gingival hyperplasia and sebaceous hyperplasia.
  • Antimetabolites (mycophenolate mofetil or azathioprine), which cause moderate to severe immune suppression and pose a severe skin cancer risk.
  • Mammalian target of rapamycin inhibitors (sirolimus or everolimus), which cause severe immune suppression and pose a moderate skin cancer risk. They also impair wound healing.
  • Corticosteroids (prednisone), which cause mild to severe immune suppression and pose a minimal skin cancer risk.
  • A decoy receptor protein (belatacept), which causes severe immune suppression and poses a mild skin cancer risk.

“Most of our solid-organ transplant recipients will be on both a calcineurin inhibitor and an antimetabolite,” Dr. Christensen said. “In addition to the skin cancer risk associated with immunosuppression, there is an additive risk” that is a direct effect of these medications on the skin. “That means our transplant recipients have a severely and disproportionate increase in skin cancer,” he noted.

Up to half of solid-organ transplant recipients will develop skin cancer, Dr. Christensen said. These patients have a sixfold to 10-fold increased risk for basal cell carcinoma (BCC), a 35- to 65-fold increased risk for squamous cell carcinoma (SCC), a twofold to sevenfold increased risk for melanoma, and a 16- to 100-fold increased risk for Merkel cell carcinoma.

Transplant recipients with SCC, he said, have a twofold to threefold higher risk for metastasis (4%-8% nodal metastasis) and twofold to fivefold higher risk for death (2%-7% mortality) from SCC.

As for other kinds of immunosuppression, HIV positivity, treatment with 6-mercaptopurine or azathioprine (for inflammatory bowel disease and rheumatoid arthritis), and antitumor necrosis factor agents (for psoriasis, inflammatory bowel disease, and rheumatoid arthritis) have been linked in studies to a higher risk for nonmelanoma skin cancer.

Dr. Christensen also highlighted graft-versus-host disease (GVHD). “It does look like there is a disproportionate and increased risk of SCC of the oropharynx and of the skin in patients who have chronic GVHD. This is probably due to a combination of both the immunosuppressive medications that are required but also from chronic and ongoing inflammation in the skin.”



Chronic GVHD has been linked to a 5.3-fold increase in the risk for SCC and a twofold increase in the risk for BCC, he added.

Moreover, new medications for treating GVHD have been linked to an increased risk for SCC, including a 3.2-fold increased risk for SCC associated with ruxolitinib, a Janus kinase (JAK) 1 and JAK2 inhibitor, in a study of patients with polycythemia vera and myelofibrosis; and a case report of SCC in a patient treated with belumosudil, a rho-associated coiled-coil-containing protein kinase 2 inhibitor, for chronic GVHD. Risk for SCC also appears to increase with longer duration of use of voriconazole, an antifungal that, he said, is a potent photosensitizer.

Dr. Christensen also noted the higher risk for infections in immunocompromised patients and added that these patients can develop inflammatory disease despite immunosuppression:

Staphylococcus, Streptococcus, and dermatophytes are the most common skin pathogens in these patients. There's a significantly increased risk for reactivation of herpes simplex virus, varicella-zoster virus, and cytomegalovirus. Opportunistic and disseminated infections, including mycobacteria and fungi such as Candida, Histoplasma, Cryptococcus, and Aspergillus, as well as mucormycosis, can also appear.

More than 80% of transplant recipients develop molluscum and verruca vulgaris/human papillomavirus infection. They may also develop noninfectious inflammatory dermatoses.


Risk Calculator

What can dermatologists do to help transplant patients? Dr. Christensen highlighted the Skin and UV Neoplasia Transplant Risk Assessment Calculator, which predicts skin cancer risk based on points given for race, gender, skin cancer history, age at transplant, and site of transplant.

The tool, validated in a 2023 study of transplant recipients in Europe, is available online and as an app. It makes recommendations to users about when patients should have initial skin screening exams. Those with the most risk — 45% at 5 years — should be screened within 6 months. “We can use [the tool] to triage these cases when we first meet them and get them plugged into the appropriate care,” Dr. Christensen said.
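
To make the structure of such a tool concrete, here is a minimal sketch of a point-based risk score of the kind the calculator uses. The factor names, point values, and tier cutoffs below are invented placeholders for illustration only, not the validated SUNTRAC weights; the official online tool or app should be used for actual patients.

    # Hypothetical point-based triage, loosely modeled on the structure of a
    # transplant skin cancer risk calculator. All weights and cutoffs here
    # are placeholders, not the published SUNTRAC values.
    POINTS = {
        "white_race": 6,
        "male": 2,
        "history_of_skin_cancer": 8,
        "age_50_or_older_at_transplant": 4,
        "thoracic_organ_transplant": 3,
    }

    def screening_recommendation(factors):
        total = sum(POINTS.get(f, 0) for f in factors)
        if total >= 15:   # highest-risk tier (assumed cutoff)
            return "initial skin exam within 6 months"
        if total >= 8:    # intermediate tier (assumed cutoff)
            return "initial skin exam within 1 year"
        return "initial skin exam within 2 years"

    # Example: 6 + 2 + 8 = 16 points, so the highest-risk tier applies.
    print(screening_recommendation({"white_race", "male", "history_of_skin_cancer"}))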

He recommended seeing high-risk patients at least annually. Patients with a prior SCC and a heavy burden of actinic keratosis should be followed more frequently, he said.

In regard to SCC, he highlighted a 2024 study of solid organ transplant recipients that found the risk for a second SCC after a first SCC was 74%, the risk for a third SCC after a second SCC was 83%, and the risk for another SCC after five SCCs was 92%.

Dr. Christensen disclosed relationships with Canfield Scientific Inc. (consulting), Inhibitor Therapeutics (advisory board), and Sol-Gel Technologies Ltd. (grants/research funding).

A version of this article first appeared on Medscape.com.

Article Source

FROM AAD 2024


Metabolite in Red Meat Increases Kidney Disease Risk

Article Type
Changed
Fri, 04/19/2024 - 11:21

TOPLINE:

Trimethylamine N-oxide (TMAO) is a gut microbiota-derived metabolite generated by metabolism of dietary L-carnitine, primarily from red meat, and choline, from a variety of animal source foods. TMAO has been shown to cause kidney injury and tubulointerstitial fibrosis in experimental models.

In this study, TMAO was independently associated with higher risks for incident chronic kidney disease (CKD) and faster kidney function decline in humans.

The findings suggest that TMAO may be a novel risk factor and intervention target for CKD prevention and treatment.

METHODOLOGY:

  • Study population was 10,564 participants from two community-based, prospective cohorts without baseline CKD (estimated glomerular filtration rate [eGFR] ≥ 60 mL/min/1.73 m2).
  • Incident CKD was defined as eGFR decline ≥ 30% from baseline, resulting in eGFR < 60 mL/min/1.73 m2.
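
As a concrete reading of that definition, the snippet below encodes the study's two-part incident-CKD rule (a decline of at least 30% from baseline that also brings eGFR under 60 mL/min/1.73 m2). The function and variable names are illustrative, not drawn from the study's analysis code.

    def is_incident_ckd(baseline_egfr, followup_egfr):
        """True when eGFR fell >=30% from baseline AND ended below 60."""
        declined_30_pct = followup_egfr <= 0.7 * baseline_egfr
        below_threshold = followup_egfr < 60   # mL/min/1.73 m2
        return declined_30_pct and below_threshold

    # Example: 85 -> 55 is a ~35% decline ending below 60, so it counts;
    # 85 -> 65 is only a ~24% decline, so it does not.
    assert is_incident_ckd(85, 55)
    assert not is_incident_ckd(85, 65)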

TAKEAWAY:

  • During a median 9.4 years, 979 incident CKD events occurred.
  • Correlation between baseline TMAO and total meat intake was small but statistically significant (P = .08).
  • After adjustments for sociodemographic, lifestyle, diet, and cardiovascular risk factors, higher plasma TMAO was associated with more than doubled CKD incidence (hazard ratio, 2.24 for top vs bottom quintile).
  • Higher TMAO levels were also associated with greater annual eGFR decline (top vs bottom quintile eGFR change = −0.43 mL/min/1.73 m2 per year).
  • Compared with other major CKD risk factors, the association for the top vs bottom TMAO quintile (−0.43 mL/min/1.73 m2 per year) was similar to that seen per 10 years of older age (−0.43) and presence of diabetes (−0.51), and larger than that seen comparing Black vs non-Black race (−0.28) and per 10 mm Hg systolic blood pressure (−0.16).

IN PRACTICE:

“TMAO levels are highly modifiable by both lifestyle-like diet and pharmacologic interventions. Besides using novel drugs to lower TMAO in patients, using dietary interventions to lower TMAO in the general population could be a cost-efficient and low-risk preventive strategy for chronic kidney disease development. ... These findings support future studies to investigate whether lifestyle and pharmacologic interventions to lower TMAO may prevent CKD development and progression.”

SOURCE:

The study was conducted by Meng Wang, PhD, of Tufts University, Boston, and colleagues and published online in the Journal of the American Society of Nephrology.

LIMITATIONS:

Observational design, can’t exclude residual confounding.

Inter-assay variability.

Use of International Classification of Diseases codes for hospitalization-based CKD, subject to reporting errors.

DISCLOSURES:

The study was supported by grants from the National Institutes of Health and an American Heart Association Postdoctoral Fellowship. Dr. Wang had no disclosures but several coauthors have patents on various diagnostics and/or industry disclosures.

A version of this article appeared on Medscape.com.


Study Identifies Several Factors That Influence Long-Term Antibiotic Prescribing for Acne

Article Type
Changed
Fri, 04/12/2024 - 07:25

Dermatologists are well aware of guidelines limiting long-term antibiotic use for acne to 3-4 months, but a perceived lack of supporting data, along with in-office realities unaddressed by guidelines, hinder clinicians’ ability and willingness to follow them, according to the authors of a recently published study.

“This study explored why dermatologists still prescribe a good number of long-term antibiotics for people with acne,” the study’s senior author Howa Yeung, MD, MSc, assistant professor of dermatology at Emory University, Atlanta, said in an interview. “And we found a lot of reasons.” The study was published online in JAMA Dermatology.

Using online surveys and semi-structured video interviews of 30 dermatologists, infectious disease physicians with expertise in antimicrobial stewardship, dermatology residents, and nonphysician clinicians, the investigators assessed respondents’ knowledge and attitudes regarding long-term antibiotics in acne. Salient themes impacting long-term antibiotic prescriptions included the following:

  • A perceived dearth of evidence to justify changes in practice.
  • Difficulties with iPLEDGE, the Risk Evaluation and Mitigation Strategy (REMS) for managing the teratogenic risks associated with isotretinoin, and with discussing oral contraceptives.
  • “Navigating” discussions with patients about tapering off antibiotics.
  • Challenging patient demands.
  • A lack of effective tools for monitoring progress in antibiotic stewardship.

“It’s surprising there are so many barriers that make it difficult for dermatologists to stick with the guidelines even if they want to,” said Dr. Yeung, a coauthor of the recently released updated American Academy of Dermatology (AAD) acne management guidelines.

A dermatologist who wants to stop systemic antibiotics within 3 months may not know how to do so, he explained, or high demand for appointments may prevent timely follow-ups.

A major reason why dermatologists struggle to limit long-term antibiotic use is that there are very few substitutes that are perceived to work as well, said David J. Margolis, MD, PhD, who was not involved with the study and was asked to comment on the results. He is professor of epidemiology and dermatology at the University of Pennsylvania, Philadelphia.

“Part of the reason antibiotics are being used to treat acne is that they’re effective, and effective for severe disease,” he said. The alternatives, which are mostly topicals, said Dr. Margolis, do not work as well for moderate to severe disease or, with isotretinoin, involve time-consuming hurdles. Dr. Margolis said that he often hears such concerns from individual dermatologists. “But it’s helpful to see these in a well-organized, well-reported qualitative study.”

Infectious disease specialists surveyed considered limiting long-term antibiotic use extremely important, while several dermatologists “argued that other specialties ‘underestimate the impact acne has on people’s lives,’ ” the authors wrote. Other respondents prioritized making the right choice for the patient at hand.

Although guidelines were never meant to be black and white, Dr. Yeung said, it is crucial to target the goal of tapering off oral antibiotics after about 3-4 months, a cutoff endorsed by guidelines from groups including the AAD and the Japanese Dermatological Association (published in 2016 and 2017, respectively), among others.

He added, “Some folks believe that if the oral antibiotic is working, why stop? We need to develop evidence to show that reducing oral antibiotic use is important to our patients, not just to a theoretical problem of antibiotic resistance in society.” For example, in a study published in The Lancet in 2004, patients who used strictly topical regimens achieved efficacy similar to that of those who used only oral antibiotics.



In addition, some clinicians worried that limiting antibiotics could reduce patient satisfaction, spurring switches to other providers. However, he and the other authors of the JAMA Dermatology study noted that in a survey of patients with acne published in the Journal of Clinical and Aesthetic Dermatology in 2019, 76.9% said they would be “very or extremely likely” to use effective antibiotic-free treatments if offered.

Because most respondents were highly aware of the importance of antibiotic stewardship, Dr. Yeung said, additional passive education is not necessarily the answer. “It will take a concerted effort by our national societies to come up with resources and solutions for individual dermatologists to overcome some of these larger barriers.” Such solutions could range from training in communication and shared decision-making to implementing systems that provide individualized feedback to support antibiotic stewardship.

Many ongoing studies are examining antibiotic stewardship, Dr. Margolis said in the interview. However, he added, dermatologists’ idea of long-term use is 3 months, versus 1 month or less in other specialties. “Moreover, dermatology patients tend to be much healthier individuals and are rarely hospitalized, so there may be some issues comparing the ongoing studies to individuals with acne.” Future research will need to account for such differences, he said.

The study was funded by an American Acne & Rosacea Society Clinical Research Award. Dr. Yeung is associate editor of JAMA Dermatology. Dr. Margolis has received a National Institutes of Health grant to study doxycycline versus spironolactone in acne.

Publications
Topics
Sections

Dermatologists are well aware of guidelines limiting long-term antibiotic use for acne to 3-4 months, but a perceived lack of supporting data, along with in-office realities unaddressed by guidelines, hinder clinicians’ ability and willingness to follow them, according to the authors of a recently published study.

Dermatologists are well aware of guidelines limiting long-term antibiotic use for acne to 3-4 months, but a perceived lack of supporting data, along with in-office realities unaddressed by guidelines, hinders clinicians’ ability and willingness to follow them, according to the authors of a recently published study.

“This study explored why dermatologists still prescribe a good number of long-term antibiotics for people with acne,” the study’s senior author Howa Yeung, MD, MSc, assistant professor of dermatology at Emory University, Atlanta, said in an interview. “And we found a lot of reasons.” The study was published online in JAMA Dermatology.

Using online surveys and semi-structured video interviews with 30 respondents (dermatologists, infectious disease physicians with expertise in antimicrobial stewardship, dermatology residents, and nonphysician clinicians), the investigators assessed respondents’ knowledge and attitudes regarding long-term antibiotic use in acne. Salient themes affecting long-term antibiotic prescribing included the following:

  • A perceived dearth of evidence to justify changes in practice.
  • Difficulties with iPLEDGE, the Risk Evaluation and Mitigation Strategy (REMS) for managing the teratogenic risks associated with isotretinoin, and with discussing oral contraceptives.
  • “Navigating” discussions with patients about tapering off antibiotics.
  • Challenging patient demands.
  • A lack of effective tools for monitoring progress in antibiotic stewardship.

“It’s surprising there are so many barriers that make it difficult for dermatologists to stick with the guidelines even if they want to,” said Dr. Yeung, a coauthor of the recently released updated American Academy of Dermatology (AAD) acne management guidelines.

A dermatologist who wants to stop systemic antibiotics within 3 months may not know how to do so, he explained, or high demand for appointments may prevent timely follow-ups.

A major reason why dermatologists struggle to limit long-term antibiotic use is that there are very few substitutes that are perceived to work as well, said David J. Margolis, MD, PhD, who was not involved with the study and was asked to comment on the results. He is professor of epidemiology and dermatology at the University of Pennsylvania, Philadelphia.

“Part of the reason antibiotics are being used to treat acne is that they’re effective, and effective for severe disease,” he said. The alternatives are mostly topicals, which do not work as well for moderate to severe disease, while isotretinoin involves time-consuming hurdles, he said. Dr. Margolis added that he often hears such concerns from individual dermatologists. “But it’s helpful to see these in a well-organized, well-reported qualitative study.”

Infectious disease specialists surveyed considered limiting long-term antibiotic use extremely important, while several dermatologists “argued that other specialties ‘underestimate the impact acne has on people’s lives,’ ” the authors wrote. Other respondents prioritized making the right choice for the patient at hand.

Although guidelines were never meant to be black and white, Dr. Yeung said, it is crucial to target the goal of tapering off after about 3-4 months, a cutoff endorsed by guidelines from groups including the AAD (2016) and the Japanese Dermatological Association (2017), among others.

He added, “Some folks believe that if the oral antibiotic is working, why stop? We need to develop evidence to show that reducing oral antibiotic use is important to our patients, not just to a theoretical problem of antibiotic resistance in society.” For example, in a study published in The Lancet in 2004, patients who used strictly topical regimens achieved efficacy similar to that of those who used only oral antibiotics.

In addition, some clinicians worried that limiting antibiotics could reduce patient satisfaction, spurring switches to other providers. However, he and the other authors of the JAMA Dermatology study noted that in a survey of patients with acne published in the Journal of Clinical and Aesthetic Dermatology in 2019, 76.9% said they would be “very or extremely likely” to use effective antibiotic-free treatments if offered.

Because most respondents were highly aware of the importance of antibiotic stewardship, Dr. Yeung said, additional passive education is not necessarily the answer. “It will take a concerted effort by our national societies to come up with resources and solutions for individual dermatologists to overcome some of these larger barriers.” Such solutions could range from training in communication and shared decision-making to implementing systems that provide individualized feedback to support antibiotic stewardship.

Many ongoing studies are examining antibiotic stewardship, Dr. Margolis said in the interview. However, he added, dermatologists’ idea of long-term use is 3 months, versus 1 month or less in other specialties. “Moreover, dermatology patients tend to be much healthier individuals and are rarely hospitalized, so there may be some issues comparing the ongoing studies to individuals with acne.” Future research will need to account for such differences, he said.

The study was funded by an American Acne & Rosacea Society Clinical Research Award. Dr. Yeung is associate editor of JAMA Dermatology. Dr. Margolis has received a National Institutes of Health grant to study doxycycline versus spironolactone in acne.

FROM JAMA DERMATOLOGY
