Nontraditional Risk Factors Play an Outsized Role in Young Adult Stroke Risk

Nontraditional risk factors such as migraine and autoimmune diseases have a significantly greater effect on stroke risk in young adults than traditional risk factors such as hypertension, high cholesterol, and tobacco use, new research showed.

The findings may offer insight into the increased incidence of stroke in adults under age 45, which has more than doubled in the past 20 years in high-income countries, while incidence in those over 45 has decreased.

Investigators believe the findings are important because most conventional prevention efforts focus on traditional risk factors.

“The younger they are at the time of stroke, the more likely their stroke is due to a nontraditional risk factor,” lead author Michelle Leppert, MD, an assistant professor of neurology at the University of Colorado School of Medicine, Aurora, Colorado, said in a news release.

The findings were published online in Circulation: Cardiovascular Quality and Outcomes.
 

Traditional Versus Nontraditional

The researchers retrospectively analyzed 2618 stroke cases (52% female; 73% ischemic stroke) that resulted in an inpatient admission and 7827 controls, all aged 18-55 years. Data came from the Colorado All Payer Claims Database between January 2012 and April 2019. Controls were matched by age, sex, and insurance type.

Traditional risk factors were defined as well-established risk factors for stroke that are routinely noted during stroke prevention screening in older adults, including hypertension, diabetes, hyperlipidemia, sleep apnea, cardiovascular disease, alcohol use, substance use disorder, and obesity.

Nontraditional risk factors were defined as those that are rarely cited as a cause of stroke in older adults, including migraines, malignancy, HIV, hepatitis, thrombophilia, autoimmune disease, vasculitis, sickle cell disease, heart valve disease, renal failure, and hormonal risk factors in women, such as oral contraceptives, pregnancy, or puerperium.

Overall, traditional risk factors were more common in stroke cases, with nontraditional factors playing a smaller role. However, among adults aged 18-34 years, more strokes were associated with nontraditional than traditional risk factors in men (31% vs 25%, respectively) and in women (43% vs 33%, respectively).

Migraine, the most common nontraditional risk factor for stroke in this younger age group, was found in 20% of men (odds ratio [OR], 3.9) and 35% of women (OR, 3.3).

Other notable nontraditional risk factors included heart valve disease in both men and women (OR, 3.1 and OR, 4.2, respectively); renal failure in men (OR, 8.9); and autoimmune diseases in women (OR, 8.8).
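The odds ratios reported above compare the odds of carrying a given risk factor among stroke cases with the odds among matched controls. As a minimal illustration of that arithmetic in Python, using made-up counts rather than the study's data (the published estimates also come from adjusted models, not a crude 2 x 2 table):

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Crude odds ratio from a 2 x 2 case-control table (illustrative only)."""
    odds_in_cases = exposed_cases / unexposed_cases
    odds_in_controls = exposed_controls / unexposed_controls
    return odds_in_cases / odds_in_controls

# Hypothetical counts, not from the study: 200 of 1000 cases had migraine
# vs 60 of 1000 controls, giving an odds ratio of roughly 3.9.
print(round(odds_ratio(200, 800, 60, 940), 1))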
 

An Underestimate?

The contribution of nontraditional risk factors declined with age; after age 44, it was no longer significant. Hypertension was the most important traditional risk factor, and its contribution increased with age.

“There have been many studies demonstrating the association between migraines and strokes, but to our knowledge, this study may be the first to demonstrate just how much stroke risk may be attributable to migraines,” Dr. Leppert said.

Overall, women had significantly more risk factors for stroke than men. Among controls, 52% of women had at least one traditional risk factor and 34% had at least one nontraditional risk factor, compared with 48% and 22% of men, respectively.

The total contribution of nontraditional risk factors was likely an underestimate because some such factors, including the autoimmune disorder antiphospholipid syndrome and patent foramen ovale, “lacked reliable administrative algorithms” and could not be assessed in this study, the researchers noted.

Further research on how nontraditional risk factors affect strokes could lead to better prevention.

“We need to better understand the underlying mechanisms of these nontraditional risk factors to develop targeted interventions,” Dr. Leppert said.

The study was funded by the National Institutes of Health/National Center for Advancing Translational Sciences Colorado Clinical and Translational Science Award. Dr. Leppert reports receiving an American Heart Association Career Development Grant. Other disclosures are included in the original article.

A version of this article appeared on Medscape.com.

Study Shows Nirmatrelvir–Ritonavir No More Effective Than Placebo for COVID-19 Symptom Relief

Paxlovid does not significantly alleviate symptoms of COVID-19 compared with placebo among nonhospitalized adults, a new study published April 3 in The New England Journal of Medicine found. 

The results suggest that the drug, a combination of nirmatrelvir and ritonavir, may not be particularly helpful for patients who are not at high risk for severe COVID-19. However, although the rate of hospitalization and death from any cause was low overall, it was lower in the group that received Paxlovid than in the placebo group, according to the researchers. 

“Clearly, the benefit observed among unvaccinated high-risk persons does not extend to those at lower risk for severe COVID-19,” Rajesh T. Gandhi, MD, and Martin Hirsch, MD, of Massachusetts General Hospital in Boston, wrote in an editorial accompanying the journal article. “This result supports guidelines that recommend nirmatrelvir–ritonavir only for persons who are at high risk for disease progression.”

The time from onset to relief of COVID-19 symptoms — including cough, shortness of breath, body aches, and chills — did not differ significantly between the two study groups, the researchers reported. The median time to sustained alleviation of symptoms was 12 days for the Paxlovid group compared with 13 days in the placebo group (P = .60).

However, the phase 2/3 trial found a 57.6% relative reduction in the risk for hospitalizations or death among people who took Paxlovid and were vaccinated but were at high risk for poor outcomes, according to Jennifer Hammond, PhD, head of antiviral development for Pfizer, which makes the drug, and the corresponding author on the study.

Paxlovid has “an increasing body of evidence supporting the strong clinical value of the treatment in preventing hospitalization and death among eligible patients across age groups, vaccination status, and predominant variants,” Dr. Hammond said. 

She and her colleagues analyzed data from 1250 adults with symptomatic COVID-19. Participants either were fully vaccinated and at high risk for progression to severe disease, or had never been vaccinated (or not within the previous year) and had no risk factors for progression to severe disease.

More than half of participants were women, 78.5% were White and 41.4% identified as Hispanic or Latinx. Almost three quarters underwent randomization within 3 days of the start of symptoms, and a little over half had previously received a COVID-19 vaccination. Almost half had one risk factor for severe illness, the most common of these being hypertension (12.3%). 

In a subgroup analysis of high-risk participants, hospitalization or death occurred in 0.9% of patients in the Paxlovid group and 2.2% of those in the placebo group (95% CI for the difference, -3.3 to 0.7 percentage points). 
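As a back-of-the-envelope check on those subgroup figures (a rough illustration only; the 57.6% relative reduction quoted earlier comes from the investigators' own analysis, not from this simple arithmetic):

# Rounded event rates quoted above for the high-risk subgroup.
paxlovid_rate = 0.9   # % hospitalized or died, Paxlovid group
placebo_rate = 2.2    # % hospitalized or died, placebo group

absolute_difference = placebo_rate - paxlovid_rate        # 1.3 percentage points
relative_reduction = absolute_difference / placebo_rate   # about 0.59, i.e. ~59%

# Because the reported 95% CI for the difference (-3.3 to 0.7) includes zero,
# this subgroup comparison is descriptive rather than statistically conclusive.
print(f"{absolute_difference:.1f} points lower; {relative_reduction:.0%} relative reduction")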

The study’s limitations include that the statistical analysis of COVID-19–related hospitalizations or death from any cause was only descriptive, “because the results for the primary efficacy end point were not significant,” the authors wrote. 

Participants who were vaccinated and at high risk were also enrolled regardless of when they had last had a vaccine dose. Furthermore, Paxlovid has a telltale taste, which may have affected the blinding. Finally, the trial was started when the B.1.617.2 (Delta) variant was predominant.

Dr. Gandhi and Dr. Hirsch pointed out that only 5% of participants in the trial were older than 65 years and that, aside from risk factors such as obesity and smoking, just 2% of participants had heart or lung disease. 

“As with many medical interventions, there is likely to be a gradient of benefit for nirmatrelvir–ritonavir, with the patients at highest risk for progression most likely to derive the greatest benefit,” Dr. Gandhi and Dr. Hirsch wrote in the editorial. “Thus, it appears reasonable to recommend nirmatrelvir–ritonavir primarily for the treatment of COVID-19 in older patients (particularly those ≥ 65 years of age), those who are immunocompromised, and those who have conditions that substantially increase the risk of severe COVID-19, regardless of previous vaccination or infection status.”

The study was supported by Pfizer. 

A version of this article appeared on Medscape.com.

Study Highlights Some Semaglutide-Associated Skin Effects

TOPLINE:

A review of 22 articles found a higher incidence of “altered skin sensations” and alopecia in individuals receiving oral semaglutide than in those receiving placebo.

METHODOLOGY:

  • The Food and Drug Administration’s  has not received reports of semaglutide-related safety events, and few studies have characterized skin findings associated with oral or subcutaneous semaglutide, a glucagon-like peptide 1 agonist used to treat obesity and type 2 diabetes.
  • In this scoping review, researchers included 22 articles (15 clinical trials, six case reports, and one retrospective cohort study) published through January 2024 in which patients received semaglutide or a placebo or comparator; together, these articles reported semaglutide-associated adverse dermatologic events in 255 participants.

TAKEAWAY:

  • Patients who received 50 mg of oral semaglutide daily reported a higher incidence of altered skin sensations, such as dysesthesia (1.8% vs 0%), hyperesthesia (1.2% vs 0%), skin pain (2.4% vs 0%), paresthesia (2.7% vs 0%), and sensitive skin (2.7% vs 0%), than those receiving placebo or comparator.
  • Reports of alopecia (6.9% vs 0.3%) were higher in patients who received 50 mg of oral semaglutide daily than in those on placebo, but only 0.2% of patients on 2.4 mg of subcutaneous semaglutide reported alopecia vs 0.5% of those on placebo.
  • Unspecified dermatologic reactions (4.1% vs 1.5%) were reported in more patients on subcutaneous semaglutide than in those on a placebo or comparator. Several case reports described isolated cases of severe skin-related adverse effects, such as bullous pemphigoid, eosinophilic fasciitis, and leukocytoclastic vasculitis.
  • In contrast, injection site reactions (3.5% vs 6.7%) were less common in patients on subcutaneous semaglutide than in those on a placebo or comparator.

IN PRACTICE:

“Variations in dosage and administration routes could influence the types and severity of skin findings, underscoring the need for additional research,” the authors wrote.

SOURCE:

Megan M. Tran, BS, from the Warren Alpert Medical School, Brown University, Providence, Rhode Island, led this study, which was published online in the Journal of the American Academy of Dermatology.

LIMITATIONS:

This study could not adjust for confounding factors and could not establish a direct causal association between semaglutide and the adverse reactions reported.

DISCLOSURES:

This study did not report any funding sources. The authors declared no conflicts of interest.

A version of this article appeared on Medscape.com.

Genetic Testing of Some Patients With Early-Onset AF Advised


Genetic testing may be considered in patients with early-onset atrial fibrillation (AF), particularly those with a positive family history and lack of conventional clinical risk factors, because specific genetic variants may underlie AF as well as “potentially more sinister cardiac conditions,” a new white paper from the Canadian Cardiovascular Society suggested.

“Given the resources and logistical challenges potentially imposed by genetic testing (that is, the majority of cardiology and arrhythmia clinics are not presently equipped to offer it), we have not recommended routine genetic testing for early-onset AF patients at this time,” lead author Jason D. Roberts, MD, associate professor of medicine at McMaster University in Hamilton, Ontario, Canada, told this news organization.

“We do, however, recommend that early-onset AF patients undergo clinical screening for potential coexistence of a ventricular arrhythmia or cardiomyopathy syndrome through careful history, including family history, and physical examination, along with standard clinical testing, including ECG, echocardiogram, and Holter monitoring,” he said.

The white paper was published online in the Canadian Journal of Cardiology.

Routine Testing Unwarranted

The Canadian Cardiovascular Society reviewed AF research in 2022 and concluded that a guideline update was not yet warranted. One area meriting consideration but lacking sufficient evidence for a formal guideline was the clinical application of AF genetics.

Therefore, the society formed a writing group to assess the evidence linking genetic factors to AF, discuss an approach to using genetic testing for early-onset patients with AF, and consider the potential value of genetic testing in the foreseeable future.

The resulting white paper reviews familial and epidemiologic evidence for a genetic contribution to AF. As an example, the authors pointed to work from the Framingham Heart Study showing a statistically significant risk for AF among first-degree relatives of patients with AF. The overall odds ratio (OR) for AF among first-degree relatives was 1.85. But for first-degree relatives of patients with AF onset at younger than age 75 years, the OR increased to 3.23.

Other evidence included the identification of two rare genetic variants: KCNQ1 in a Chinese family and NPPA in a family with Northern European ancestry. In case-control studies, loss-of-function variants in a single gene, titin (TTN), were found at an increased burden in patients with AF compared with controls and were associated with a 2.2-fold increased risk for AF.

The two main classes of AF variants identified in candidate gene approaches were linked to ion channels and ventricular cardiomyopathy. For example, loss-of-function SCN5A variants are implicated in Brugada syndrome and cardiac conduction system disease, whereas gain-of-function variants cause long QT syndrome type 3 and multifocal ectopic Purkinje-related premature contractions. Each of these conditions was associated with an increased prevalence of AF.

Similarly, genes implicated in various other forms of ventricular channelopathies also have been implicated in AF, as have ion channels primarily expressed in the atria and not the ventricles, such as KCNA5 and GJA5.

Nevertheless, in most cases, AF is diagnosed in the context of older age and established cardiovascular risk factors, according to the authors. The contribution of genetic factors in this population is relatively low, highlighting the limited role for genetic testing when AF develops in the presence of multiple conventional clinical risk factors.


Cardiogenetic Expertise Required

“Although significant progress has been made, additional work is needed before [beginning] routine integration of clinical genetic testing for early-onset AF patients,” Dr. Roberts said. The ideal clinical genetic testing panel for AF is still unclear, and the inclusion of genes for which there is no strong evidence of involvement in AF “creates the potential for harm.”

Specifically, “a genetic variant could be incorrectly assigned as the cause of AF, which could create confusion for the patient and family members and lead to inappropriate clinical management,” said Dr. Roberts.

“Beyond cost, routine introduction of genetic testing for AF patients will require allocation of significant resources, given that interpretation of genetic testing results can be nuanced,” he noted. “This nuance is anticipated to be heightened in AF, given that many genetic variants have low-to-intermediate penetrance and can manifest with variable clinical phenotypes.”

“Traditionally, genetic testing has been performed and interpreted, and results communicated, by dedicated cardiogenetic clinics with specialized expertise,” he added. “Existing cardiogenetic clinics, however, are unlikely to be sufficient in number to accommodate the large volume of AF patients that may be eligible for testing.”

Careful Counseling

Jim W. Cheung, MD, chair of the American College of Cardiology Electrophysiology Council, told this news organization that the white paper is consistent with the latest European Heart Rhythm Association/Heart Rhythm Society/Asia Pacific Heart Rhythm Society/Latin American Heart Rhythm Society expert consensus statement published in 2022.

Overall, the approach suggested for genetic testing “is a sound one, but one that requires implementation by clinicians with access to cardiogenetic expertise,” said Dr. Cheung, who was not involved in the study. “Any patient undergoing genetic testing needs to be carefully counseled about the potential uncertainties associated with the actual test results and their implications on clinical management.”

Variants of uncertain significance that are detected with genetic testing “can be a source of stress for clinicians and patients,” he said. “Therefore, patient education prior to and after genetic testing is essential.”

Furthermore, he said, “in many patients with early-onset AF who harbor pathogenic variants, initial imaging studies may not detect any signs of cardiomyopathy. In these patients, regular follow-up to assess for development of cardiomyopathy in the future is necessary.”

The white paper was drafted without outside funding. Dr. Roberts and Dr. Cheung reported no relevant financial relationships.

A version of this article appeared on Medscape.com.

Tirzepatide Offers Better Glucose Control, Regardless of Baseline Levels


TOPLINE:

Compared with basal insulins, tirzepatide led to greater improvements in A1c and postprandial glucose (PPG) levels in patients with type 2 diabetes (T2D), regardless of baseline PPG or fasting serum glucose (FSG) levels.

METHODOLOGY:

  • Tirzepatide led to better glycemic control than insulin degludec and insulin glargine in the SURPASS-3 and SURPASS-4 trials, respectively, but whether these effects varied by baseline FSG and PPG levels had not been evaluated.
  • In this post hoc analysis, the researchers assessed changes in various glycemic parameters in 3314 patients with T2D who were randomly assigned to receive tirzepatide (5, 10, or 15 mg), insulin degludec, or insulin glargine.
  • Based on the median baseline glucose values, the patients were stratified into four subgroups: low FSG/low PPG, low FSG/high PPG, high FSG/low PPG, and high FSG/high PPG (a brief illustration of this median split follows this list).
  • The outcomes of interest were changes in FSG, PPG, A1c, and body weight from baseline to week 52.
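
As a rough illustration of the median-split step described in the list above, the short Python sketch below assigns hypothetical patients to the four subgroups. The data values, column names, and cutoffs are invented for illustration and are not taken from the SURPASS analyses.

    import pandas as pd

    # Hypothetical patient-level data; values and column names are illustrative only.
    df = pd.DataFrame({
        "patient_id": [1, 2, 3, 4],
        "baseline_fsg": [148.0, 201.5, 132.0, 185.0],   # fasting serum glucose, mg/dL
        "baseline_ppg": [210.0, 265.0, 190.5, 240.0],   # postprandial glucose, mg/dL
    })

    # Split each measure at its median, then combine the two labels into the four
    # subgroups (low FSG/low PPG, low FSG/high PPG, high FSG/low PPG, high FSG/high PPG).
    fsg_label = (df["baseline_fsg"] > df["baseline_fsg"].median()).map({False: "low FSG", True: "high FSG"})
    ppg_label = (df["baseline_ppg"] > df["baseline_ppg"].median()).map({False: "low PPG", True: "high PPG"})
    df["subgroup"] = fsg_label + "/" + ppg_label

    print(df[["patient_id", "subgroup"]])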

TAKEAWAY:

  • Tirzepatide and basal insulins effectively lowered A1c, PPG levels, and FSG levels at 52 weeks across all patient subgroups (all P < .05).
  • All three doses of tirzepatide resulted in greater reductions in both A1c and PPG levels than basal insulins (all P < .05).
  • In the high FSG/high PPG subgroup, a greater reduction in FSG levels was observed with tirzepatide 10- and 15-mg doses vs insulin glargine (both P < .05), and with insulin degludec vs tirzepatide 5 mg (P < .001).
  • Furthermore, at week 52, tirzepatide led to body weight reduction (P < .05), but insulin treatment led to an increase in body weight (P < .05) in all subgroups.

IN PRACTICE:

“Treatment with tirzepatide was consistently associated with more reduced PPG levels compared with insulin treatment across subgroups, including in participants with lower baseline PPG levels, in turn leading to greater A1c reductions,” the authors wrote.

SOURCE:

This study was led by Francesco Giorgino, MD, PhD, of the Section of Internal Medicine, Endocrinology, Andrology, and Metabolic Diseases, University of Bari Aldo Moro, Bari, Italy, and was published online in Diabetes Care.

LIMITATIONS:

The limitations include the post hoc nature of the analysis and the short treatment duration. The trials included only patients with diabetes and overweight or obesity; therefore, the findings may not be generalizable to other populations.

DISCLOSURES:

This study and the SURPASS trials were funded by Eli Lilly and Company. Four authors declared being employees and shareholders of Eli Lilly and Company. The other authors declared having several ties with various sources, including Eli Lilly and Company.

A version of this article appeared on Medscape.com.


Do New Antiobesity Meds Still Require Lifestyle Management?

Article Type
Changed
Thu, 04/04/2024 - 11:58

Is lifestyle counseling needed with the more effective second-generation nutrient-stimulated, hormone-based medications like semaglutide and tirzepatide?

If so, how intensive does the counseling need to be, and what components should be emphasized?

These are the clinical practice questions at the top of mind for healthcare professionals and researchers who provide care to patients who have overweight and/or obesity.

This is what we know. Lifestyle management is considered foundational in the care of patients with obesity.

Because obesity is fundamentally a disease of energy dysregulation, counseling has traditionally focused on dietary caloric reduction, increased physical activity, and strategies to adapt new cognitive and lifestyle behaviors.

On the basis of trial results from the Diabetes Prevention Program and the Look AHEAD studies, provision of intensive behavioral therapy (IBT) is recommended for treatment of obesity by the Centers for Medicare & Medicaid Services and by the US Preventive Services Task Force (Moyer VA; US Preventive Services Task Force).

IBT is commonly defined as consisting of 12-26 comprehensive and multicomponent sessions over the course of a year.

Reaffirming the primacy of lifestyle management, all antiobesity medications are approved by the US Food and Drug Administration as an adjunct to a reduced-calorie diet and increased physical activity.

Trials combining IBT with earlier-generation medications such as naltrexone/bupropion or liraglutide showed that more participants achieved ≥ 10% weight loss with IBT than with the medication alone: 38.4% vs 20% for naltrexone/bupropion and 46% vs 33% for liraglutide.

Although there aren’t trial data for other first-generation medications like phentermine, orlistat, or phentermine/topiramate, it is assumed that patients taking these medications would also achieve greater weight loss when combined with IBT.

The obesity pharmacotherapy landscape was upended, however, with the approval of semaglutide (Wegovy), a glucagon-like peptide-1 (GLP-1) receptor agonist, in 2021; and tirzepatide (Zepbound), a GLP-1 and glucose-dependent insulinotropic polypeptide dual receptor agonist, in 2023.

These highly effective medications harness the effect of naturally occurring incretin hormones that reduce appetite through direct and indirect effects on the brain. Although the study designs differed between the STEP 1 and STEP 3 trials, the addition of IBT to semaglutide increased mean percent weight loss from 15% to 16% after 68 weeks of treatment (Wilding JPH et al; Wadden TA).

A comparable benefit was seen with tirzepatide: across the SURMOUNT-1 and SURMOUNT-3 trials, adding IBT to tirzepatide at the maximal tolerated dose increased mean percent weight loss from 21% to 24% after 72 weeks (Jastreboff AM; Wadden TA). Though multicomponent IBT appears to provide greater weight loss when used with nutrient-stimulated hormone-based therapeutics, the incremental benefit may be smaller than that seen with first-generation medications.

So, how should we view the role and importance of lifestyle management when a patient is taking a second-generation medication? We need to shift the focus from prescribing a calorie-reduced diet to counseling for healthy eating patterns.

Because the second-generation drugs are more biologically effective in suppressing appetite (ie, reducing hunger, food noise, and cravings, and increasing satiation and satiety), it is easier for patients to reduce their food intake without a sense of deprivation. Furthermore, many patients express less desire to consume savory, sweet, and other enticing foods.

Patients should be encouraged to optimize the quality of their diet, prioritizing lean protein sources with meals and snacks; increasing fruits, vegetables, fiber, and complex carbohydrates; and keeping well hydrated. Because of the risk of developing micronutrient deficiencies while consuming a low-calorie diet — most notably calcium, iron, and vitamin D — patients may be advised to take a daily multivitamin supplement. Dietary counseling should be introduced when patients start pharmacotherapy, and if needed, referral to a registered dietitian nutritionist may be helpful in making these changes.

Additional counseling tips to mitigate the gastrointestinal side effects of these drugs that most commonly occur during the early dose-escalation phase include eating slowly; choosing smaller portion sizes; stopping eating when full; not skipping meals; and avoiding fatty, fried, and greasy foods. These dietary changes are particularly important over the first days after patients take the injection.

The increased weight loss achieved also raises concerns about the need to maintain lean body mass and the importance of physical activity and exercise counseling. All weight loss interventions, including dietary restriction, pharmacotherapy, or bariatric surgery, result in loss of fat mass and lean body mass.

The goal of lifestyle counseling is to minimize the loss of muscle mass (a component of lean body mass), which is needed for optimal health, mobility, daily function, and quality of life. Counseling should incorporate both aerobic and resistance training. Aerobic exercise (eg, brisk walking, jogging, dancing, elliptical machine, and cycling) improves cardiovascular fitness, metabolic health, and energy expenditure. Resistance (strength) training (eg, weightlifting, resistance bands, and circuit training) lessens the loss of muscle mass, enhances functional strength and mobility, and improves bone density (Gorgojo-Martinez JJ et al; Oppert JM et al).

Robust physical activity has also been shown to be a predictor of weight loss maintenance. A recently published randomized placebo-controlled trial demonstrated the benefit of supervised exercise in maintaining body weight and lean body mass after discontinuing 52 weeks of liraglutide treatment compared with no exercise.

Rather than minimizing the provision of lifestyle management, using highly effective second-generation therapeutics redirects the focus on how patients with obesity can strive to achieve a healthy and productive life.

A version of this article first appeared on Medscape.com.


How Abdominal Fibrogenesis Affects Adolescents With Obesity

Article Type
Changed
Thu, 04/04/2024 - 11:56

 

TOPLINE:

Insulin resistance and obesity in adolescents may lead to increased abdominal fibrogenesis, impairing the capacity of the abdominal subcutaneous adipose tissue (SAT) to store lipids, which may cause fat accumulation in the visceral adipose tissue (VAT) depot and in other organs such as the liver.

METHODOLOGY:

  • Abdominal fibrogenesis, but not adipose tissue expandability, is known to increase in adults with obesity and reduce insulin sensitivity; however, little is known about fibrogenesis in adolescents with obesity.
  • In this study, researchers investigated if lipid dynamics, fibrogenesis, and abdominal and gluteal adipocyte turnover show dysregulation to a greater extent in insulin-resistant adolescents with obesity than in insulin-sensitive adolescents with obesity.
  • They recruited 14 individuals aged 12-20 years with a body mass index over 30 from the Yale Clinic, of whom seven were classified as insulin resistant.
  • Deuterated water methodologies were used to study the indices of adipocyte turnover, lipid dynamics, and fibrogenesis in abdominal and gluteal fat deposits.
  • A 3-hour oral glucose tolerance test and multisection MRI scan of the abdominal region were used to assess the indices of glucose metabolism, abdominal fat distribution patterns, and liver fat content.

TAKEAWAY:

  • The abdominal and gluteal SAT turnover rate of lipid components (triglyceride production and breakdown as well as de novo lipogenesis contribution) was similar in insulin-resistant and insulin-sensitive adolescents with obesity.
  • The insoluble collagen (type I, subunit alpha2) level was higher in the abdominal adipose tissue of insulin-resistant adolescents than in insulin-sensitive adolescents (difference in fractional synthesis rate, 0.611; P < .001), indicating increased abdominal fibrogenesis.
  • Abdominal insoluble collagen I alpha2 was associated with higher fasting plasma insulin levels (correlation [r], 0.579; P = .015), a higher visceral to total adipose tissue ratio (r, 0.643; P = .007), and a lower whole-body insulin sensitivity index (r, -0.540; P = .023); a brief sketch of this type of correlation calculation follows this list.
  • There was no evidence of increased collagen production in the gluteal adipose tissue, indicating that fibrogenesis was not increased in that depot.
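
For context on how associations like those in the bullets above are typically quantified, the minimal sketch below computes a Pearson correlation coefficient (r) and a two-sided P value from made-up values using SciPy; it is illustrative only and does not use the study’s data.

    from scipy import stats

    # Hypothetical paired measurements for a handful of participants; values are illustrative only.
    collagen_fsr = [0.42, 0.55, 0.61, 0.48, 0.70, 0.66, 0.51]      # insoluble collagen I alpha2 fractional synthesis rate
    fasting_insulin = [12.0, 18.5, 24.0, 15.0, 30.5, 27.0, 16.5]   # fasting plasma insulin

    # Pearson correlation, the statistic reported as r with a P value above.
    r, p = stats.pearsonr(collagen_fsr, fasting_insulin)
    print(f"r = {r:.3f}, P = {p:.3f}")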

IN PRACTICE:

“The increased formation of insoluble collagen observed in insulin-resistant compared with insulin-sensitive individuals contributes to lipid spillover from SAT to VAT and, in turn, serves as a critically important mechanism involved in the complex sequelae of obesity-related metabolic and liver disease pathology,” the authors wrote.

SOURCE:

This study, led by Aaron L. Slusher, Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut, was published online in Obesity.

LIMITATIONS:

The researchers did not measure hepatic collagen synthesis rates. The analysis was performed on a small study population. The authors were also unable to assess potential sex differences.

DISCLOSURES:

The study was funded by the Foundation for the National Institutes of Health and Clara Guthrie Patterson Trust Mentored Research Award. The authors declared no conflicts of interest.

A version of this article appeared on Medscape.com.


The ED Sailed Smoothly in the Early COVID-19 Days

Article Type
Changed
Thu, 04/04/2024 - 09:26

 

TOPLINE: 

There were few cases of SARS-CoV-2 infections among emergency department (ED) healthcare personnel and no substantial changes in the delivery of emergency medical care during the initial phase of the COVID-19 pandemic.

METHODOLOGY:

  • This multicenter prospective cohort study of US ED healthcare personnel called Project COVERED was conducted from May to December 2020 to evaluate the following outcomes:
  • The possibility of infected ED personnel reporting to work
  • The burden of COVID-19 symptoms on ED personnel’s work status
  • The association between SARS-CoV-2 infection levels and ED staffing
  • Project COVERED enrolled 1673 ED healthcare personnel with 29,825 person weeks of observational data from 25 geographically diverse EDs.
  • The presence of any SARS-CoV-2 infection was determined using reverse transcription polymerase chain reaction or IgG antibody testing at baseline, week 2, week 4, and every 4 weeks thereafter through week 20.
  • Investigators also collected weekly data on ED staffing and the incidence of SARS-CoV-2 infections in healthcare facilities.

TAKEAWAY:

  • Despite the absence of widespread natural immunity or COVID-19 vaccine availability during the time of this study, only 4.5% of ED healthcare personnel tested positive for SARS-CoV-2 infections, with more than half (57.3%) not experiencing any symptoms.
  • Most personnel (83%) who experienced symptoms associated with COVID-19 reported working at least one shift in the ED and nearly all of them continued to work until they received laboratory confirmation of their infection.
  • The working time lost as a result of COVID-19 and related concerns was minimal: 89 healthcare personnel reported 90 person weeks of missed work (0.3% of all weeks); the brief check after this list reproduces that arithmetic.
  • During this study, physician staffing levels ranged from 98.7% to 102.0% of normal staffing, with similar values noted for nursing and nonclinical staff. Reduced staffing was rare, even during COVID-19 surges.
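
The 0.3% figure follows directly from the two numbers reported in this summary (90 person weeks of missed work out of 29,825 observed person weeks); the short check below simply reproduces that arithmetic.

    # Reproduce the missed-work percentage from the figures reported above.
    missed_person_weeks = 90
    total_person_weeks = 29_825
    print(f"{missed_person_weeks / total_person_weeks:.1%}")  # 0.3%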

IN PRACTICE:

“Our findings suggest that the cumulative interaction between infected healthcare personnel and others resulted in a negligible risk of transmission on the scale of public health emergencies,” the authors wrote.

SOURCE:

This study was led by Kurt D. Weber, MD, Department of Emergency Medicine, Orlando Health, Orlando, Florida, and published online in Annals of Emergency Medicine.

LIMITATIONS:

Data regarding the Delta variant surges that occurred toward the end of December and the ED status after the advent of the COVID-19 vaccine were not recorded. There may also have been a selection bias risk in this study because the volunteer participants may have exhibited behaviors like social distancing and use of protective equipment, which may have decreased their risk for infections.

DISCLOSURES:

This study was funded by a cooperative agreement from the Centers for Disease Control and Prevention and the Institute for Clinical and Translational Science at the University of Iowa through a grant from the National Center for Advancing Translational Sciences at the National Institutes of Health. The authors declared no conflicts of interest.

A version of this article appeared on Medscape.com.


Immunohistochemistry May Improve Melanoma Diagnosis

Article Type
Changed
Mon, 04/08/2024 - 10:27

A retrospective analysis of Medicare data revealed that between 2000 and 2017, immunohistochemistry (IHC) claims associated with melanoma diagnoses grew from 11% to 51%. Rising utilization — and substantial geographic variation in practice patterns — argue for further research to optimize IHC use in the diagnosis of melanoma, according to the authors.

But with sparse guidance regarding best practices for IHC in melanoma diagnosis, concerns for appropriate use are rising, they wrote in their report, recently published in JAMA Dermatology.

Kenechukwu Ojukwu, MD, MPP, of the department of pathology and laboratory medicine, University of California, Los Angeles, and coinvestigators, searched the Surveillance, Epidemiology, and End Results (SEER)–Medicare database for incident in situ or invasive cutaneous melanoma in patients 65 years and older and accompanying IHC claims made during the month of diagnosis through 14 days afterward.

Among 132,547 melanomas in 116,117 patients, 43,396 (33%) had accompanying IHC claims. Such claims were less common with increasing age, declining from 44% in patients aged 65-74 years to 18% in patients aged 85 years and older. Although melanoma incidence increased throughout the period studied, melanoma mortality rates remained relatively stable.

By summary stage at diagnosis, IHC utilization ranged from 29% of in situ cases to 75% of distant cases. After the researchers controlled for year of diagnosis, IHC use was statistically significantly associated with all demographic, tumor, and geographic characteristics examined, except race and ethnicity. Across all the years of the study, regional usage ranged from a low of 22% in Detroit to a high of 44% in both Louisiana and San Jose-Monterey, California. Figures for 2017 ranged from 39% of cases in Kentucky and Atlanta to 68% in New Mexico.



“Given the extensive use of IHC in clinical practice,” the authors concluded, “studies examining the resulting outcomes of IHC on different domains, such as symptom burden, quality of life, and mortality, are crucial.”

The “notable” regional variation in IHC utilization suggests uncertainty about its optimal employment in clinical practice, and, they wrote, “these findings highlight the need for research to identify where IHC provides the most value and to develop guidelines regarding the appropriate use of IHC.”

In an accompanying JAMA Dermatology editorial, Alexandra Flamm, MD, wrote, “now is an exciting time to practice dermatopathology, with an increased number of ancillary tests, such as IHC, that can be used to diagnose malignant neoplasms more precisely and to more accurately determine prognosis and therapeutic options in this age of precision medicine.”

However, added Dr. Flamm, a dermatologist and dermatopathologist at New York University, New York City, the increasing number of ancillary tests is fueling awareness of appropriate use and the importance of ensuring high-quality, value-based healthcare. “With this increased scrutiny on the appropriateness of ancillary histopathologic testing within dermatopathology,” she wrote, “the need is growing for parameters that can be used to guide when to use IHC testing and other ancillary testing.” And using dermatopathologist-developed tools such as American Society of Dermatopathology guidelines for 11 IHC tests can help ensure that appropriate medical decision-making is taken into account when creating these tools, she added.

IHC Usage Growing

“The paper confirms what I already knew,” said Whitney High, MD, JD, who was not involved with the study and was asked to comment on the results. “Use of IHC in dermatopathology has increased substantially, and probably will continue to increase over time.” The societal burden of IHC costs represents a legitimate concern, said Dr. High, professor of dermatology and pathology and director of dermatopathology at the University of Colorado, Aurora.

“However,” he told this news organization, “the histologic diagnosis of melanoma is sometimes substantially subjective — and all physicians, including pathologists, even though they are not providing care in the physical presence of the patient, are fiduciaries.” If an IHC stain would meaningfully improve a patient’s care, he said, physicians should attempt to provide it, unless strictly disallowed by a payer. Controlling medical-care costs might be better left to professional societies to guide care standards over time, he noted.

IHC has the potential to improve the accuracy and reliability of melanoma assessments by providing additional data, said Dr. High. “To this end, disallowing the use of immunostains simply due to cost, without substantial evidence, has the potential to alter diagnoses and impact care negatively.” This is particularly true for melanoma, he said, where “finding even one additional melanoma with IHC has life-altering consequences for that patient.”

How IHC might impact melanoma overdiagnosis remains unclear without additional study. IHC might allow dermatologists to avoid diagnosing melanoma in borderline cases unsupported by IHC, explained Dr. High, or false-positive results could further fuel melanoma overdiagnosis.

Limitations of the IHC study included the inability to determine whether IHC improved outcomes and the use of an older, SEER-specific population. In addition, because CPT codes are not site-specific, some samples may have come from surgical margins or non-skin locations.

Study authors reported receiving grants from the National Cancer Institute (NCI) during the conduct of the study. The study was funded by the University of California, Los Angeles (UCLA) National Clinician Scholars Program, the UCLA Department of Pathology, the California Department of Public Health, and the NCI. Dr. High reports no relevant financial interests.

Publications
Topics
Sections



FROM JAMA DERMATOLOGY

Regular Exercise Linked to Better Sleep

Article Type
Changed
Wed, 04/03/2024 - 07:31

 

TOPLINE:

Over time, exercising at least twice a week is associated with significantly fewer insomnia symptoms and better sleep duration, new research shows.

METHODOLOGY:

  • The study included 4339 adults aged 39-67 years (48% men) from 21 centers in nine countries participating in the third follow-up to the European Community Respiratory Health Survey (ECRHS III).
  • Participants responded to questions about physical activity, insomnia symptoms, sleep duration, and daytime sleepiness at 10-year follow-up.
  • Being “physically active” was defined as exercising at least twice a week for ≥ 1 hour per week.
  • The main outcome measures were insomnia, sleep time, and daytime sleepiness in relation to physical activity.

TAKEAWAY:

  • From baseline to follow-up, 37% of participants were persistently inactive, 25% were persistently active, 20% became inactive, and 18% became active.
  • After adjustment for age, sex, body mass index, smoking history, and study center, persistently active participants were less likely than persistently nonactive participants to report difficulty initiating sleep (adjusted odds ratio [aOR], 0.60; 95% CI, 0.45-0.78), short sleep duration of ≤ 6 hours/night (aOR, 0.71; 95% CI, 0.59-0.85), or long sleep duration of ≥ 9 hours/night (aOR, 0.53; 95% CI, 0.33-0.84).
  • Those who were persistently active were 22% less likely to report any symptoms of insomnia, 40% less likely to report two symptoms, and 37% less likely to report three symptoms.
  • Daytime sleepiness and difficulty maintaining sleep were not associated with physical activity status.
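
A note on reading the adjusted odds ratios above: an aOR below 1.0 is often summarized as a percent reduction in odds, calculated as (1 - aOR) × 100. The short Python sketch below applies that conversion to the aORs reported in the study; it is purely illustrative, is not part of the paper, and percent lower odds is not identical to percent lower probability.

def percent_lower_odds(aor: float) -> float:
    """Express an odds ratio below 1.0 as a percent reduction in odds."""
    return (1.0 - aor) * 100.0

# aOR values are the study's reported estimates for persistently active vs
# persistently nonactive participants; the conversion itself is only a reading aid.
for outcome, aor in [
    ("Difficulty initiating sleep", 0.60),
    ("Short sleep, <= 6 h/night", 0.71),
    ("Long sleep, >= 9 h/night", 0.53),
]:
    print(f"{outcome}: {percent_lower_odds(aor):.0f}% lower odds")

Run as written, this prints 40%, 29%, and 47% lower odds, respectively, for the three sleep outcomes.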

IN PRACTICE:

“This study has a long follow-up period (10 years) and indicates strongly that consistency in physical activity might be an important factor in optimizing sleep duration and reducing the symptoms of insomnia,” the authors wrote.

SOURCE:

Erla Björnsdóttir, of the Department of Psychology, Reykjavik University, Reykjavik, Iceland, was the co-senior author and corresponding author of the study. It was published online on March 25 in BMJ Open.

LIMITATIONS:

It’s unclear whether individuals who were active at both timepoints had been continuously physically active throughout the study period or only at those two timepoints. Sleep variables were available only at follow-up and were all subjectively reported, meaning the associations between physical activity and sleep may not be longitudinal. Residual confounders (eg, mental health and musculoskeletal disorders or chronic pain) that can influence both sleep and exercise were not explored.

DISCLOSURES:

Financial support for ECRHS III was provided by the National Health and Medical Research Council (Australia); Antwerp South, Antwerp City: Research Foundation Flanders (Belgium); Estonian Ministry of Education (Estonia); and other international agencies. Additional sources of funding were listed on the original paper. The authors reported no relevant financial relationships.

A version of this article appeared on Medscape.com.

