
Two Diets Linked to Improved Cognition, Slowed Brain Aging

Updated: Wed, 07/31/2024 - 13:18

 

An intermittent fasting (IF) diet and a standard healthy living (HL) diet focused on healthy foods both led to weight loss, reduced insulin resistance (IR), and slowed brain aging in older overweight adults with IR, new research showed. However, neither diet had an effect on Alzheimer’s disease (AD) biomarkers.

Although investigators found both diets were beneficial, some outcomes were more robust with the IF diet.

“The study provides a blueprint for assessing brain effects of dietary interventions and motivates further research on intermittent fasting and continuous diets for brain health optimization,” wrote the investigators, led by Dimitrios Kapogiannis, MD, chief, human neuroscience section, National Institute on Aging, and adjunct associate professor of neurology, the Johns Hopkins University School of Medicine.

The findings were published online in Cell Metabolism.
 

Cognitive Outcomes

The prevalence of IR — reduced cellular sensitivity to insulin that’s a hallmark of type 2 diabetes — increases with age and obesity, adding to an increased risk for accelerated brain aging as well as AD and related dementias (ADRD) in older adults with overweight.

Studies have reported that healthy diets promote overall health, but it’s unclear whether, and to what extent, they improve brain health beyond general health enhancement.

Researchers used multiple brain and cognitive measures to assess dietary effects on brain health, including peripherally harvested neuron-derived extracellular vesicles (NDEVs) to probe neuronal insulin signaling; MRI to investigate the pace of brain aging; magnetic resonance spectroscopy (MRS) to measure brain glucose, metabolites, and neurotransmitters; and NDEVs and cerebrospinal fluid to derive biomarkers for AD/ADRD.

The study included 40 cognitively intact overweight participants with IR, mean age 63.2 years, 60% women, and 62.5% White. Their mean body weight was 97.1 kg and mean body mass index (BMI) was 34.4.

Participants were randomly assigned to 8 weeks of an IF diet or an HL diet that emphasized fruits, vegetables, whole grains, lean proteins, and low-fat dairy and limited added sugars, saturated fats, and sodium.

The IF diet involved following the HL diet for 5 days per week and restricting calories to a quarter of the recommended daily intake for 2 consecutive days.

Both diets reduced neuronal IR and had comparable effects in improving insulin signaling biomarkers in NDEVs, reducing brain glucose on MRS, and improving blood biomarkers of carbohydrate and lipid metabolism.

Using MRI, researchers also assessed brain age, an indication of whether the brain appears older or younger than an individual’s chronological age. There was a decrease of 2.63 years with the IF diet (P = .05) and 2.42 years with the HL diet (P < .001) in the anterior cingulate and ventromedial prefrontal cortex.

Both diets improved executive function and memory, with those following the IF diet benefiting more in strategic planning, switching between two cognitively demanding tasks, cued recall, and other areas.
 

Hypothesis-Generating Research

AD biomarkers, including amyloid beta 42 (Aβ42), Aβ40, and plasma phosphorylated tau181, did not change with either diet, a finding that investigators speculated may be due to the study’s short duration. Neurofilament light chain levels increased in both groups, with no differences between the diets.

In other findings, BMI decreased by 1.41 with the IF diet and by 0.80 with the HL diet, and a similar pattern was observed for weight. Waist circumference decreased in both groups with no significant differences between diets.

An exploratory analysis showed executive function improved with the IF diet but not with the HL diet in women, whereas it improved with both diets in men. BMI and apolipoprotein E and SLC16A7 genotypes also modulated diet effects.

Both diets were well tolerated. The most frequent adverse events were gastrointestinal and occurred only with the IF diet.

The authors noted the findings are preliminary and hypothesis generating. Limitations included the study’s short duration and its limited statistical power to detect anything other than moderate to large effects and between-diet differences. The researchers also did not collect data on dietary intake, so lapses in adherence cannot be excluded. However, the large decreases in BMI, weight, and waist circumference with both diets indicated high adherence.

The study was supported by the National Institutes of Health’s National Institute on Aging. The authors reported no competing interests.
 

A version of this article first appeared on Medscape.com.

Heat Waves: A Silent Threat to Older Adults’ Kidneys

Updated: Tue, 08/06/2024 - 02:25

 

TOPLINE:

Older adults show an increase in creatinine and cystatin C levels after exposure to extreme heat in a dry setting despite staying hydrated; however, changes in these kidney function biomarkers are much more modest in a humid setting and in young adults.

METHODOLOGY:

  • Older adults are vulnerable to heat-related morbidity and mortality, with kidney complications accounting for many excess hospital admissions during heat waves.
  • Researchers investigated plasma-based markers of kidney function following extreme heat exposure for 3 hours in 20 young (21-39 years) and 18 older (65-76 years) adults recruited from the Dallas-Fort Worth area.
  • All participants underwent heat exposure in a chamber at 47 °C (116 °F) and 15% relative humidity (dry setting) and 41 °C (105 °F) and 40% relative humidity (humid setting) on separate days. They performed light physical activity mimicking their daily tasks and drank 3 mL/kg body mass of water every hour while exposed to heat.
  • Blood samples were collected at baseline, immediately before the end of heat exposure (end-heating), and 2 hours after heat exposure.
  • Plasma creatinine was the primary outcome, with a change ≥ 0.3 mg/dL considered as clinically meaningful. Cystatin C was the secondary outcome.

TAKEAWAY:

  • The plasma creatinine level showed a modest increase from baseline to end-heating (difference, 0.10 mg/dL; P = .004) and at 2 hours post exposure (difference, 0.17 mg/dL; P < .001) in older adults facing heat exposure in the dry setting.
  • The mean cystatin C levels also increased from baseline to end-heating by 0.29 mg/L (P = .01) and at 2 hours post heat exposure by 0.28 mg/L (P = .004) in older adults in the dry setting.
  • The mean creatinine levels increased by only 0.06 mg/dL (P = .01) from baseline to 2 hours post exposure in older adults facing heat exposure in the humid setting.
  • Young adults didn’t show any significant change in plasma cystatin C levels during or after heat exposure; however, there was a modest increase in plasma creatinine levels at 2 hours after heat exposure (difference, 0.06 mg/dL; P = .004).

IN PRACTICE:

“These findings provide limited evidence that the heightened thermal strain in older adults during extreme heat may contribute to reduced kidney function,” the authors wrote. 

SOURCE:

The study was led by Zachary J. McKenna, PhD, from the Department of Internal Medicine, University of Texas Southwestern Medical Center, Dallas, Texas, and was published online in JAMA.

LIMITATIONS:

The use of plasma-based markers of kidney function, a short laboratory-based exposure, and a small number of generally healthy participants were the main limitations that could affect the generalizability of this study’s findings to broader populations and real-world settings. 

DISCLOSURES:

The National Institutes of Health and American Heart Association funded this study. Two authors declared receiving grants and nonfinancial support from several sources. 

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


Risk Stratification May Work Well for FIT-Based CRC Screening in Elderly

Updated: Wed, 08/07/2024 - 14:59

A risk-stratified upper age limit may be beneficial for colorectal cancer (CRC) screening among patients who are ages 75 and older, according to a study presented at the annual Digestive Disease Week® (DDW).

In particular, interval CRC risk can vary substantially based on the fecal hemoglobin (f-Hb) concentration in the patient’s last fecal immunochemical test (FIT), as well as the number of prior screening rounds.

“Less is known about what happens after the upper age limit has been reached and individuals are not invited to participate in more screening rounds. This is important as life expectancy is increasing, and it is increasingly important to consider the most efficient way of screening the elderly,” said lead author Brenda van Stigt, a PhD candidate focused on cancer screening at Erasmus University Medical Center in Rotterdam, the Netherlands.

In the Netherlands, adults between ages 55 and 75 are invited to participate in stool-based CRC screening every 2 years. Based on a FIT threshold of 47 μg Hb/g, those who test positive are referred for colonoscopy, and those who test negative are invited to participate again after a 2-year period.

FIT can play a major role in risk stratification, Ms. van Stigt noted, along with other factors that influence CRC risk, such as age, sex, and CRC screening history. Although this is documented for ages 55-75, she and colleagues wanted to know more about what happens after age 75.

Ms. Van Stigt and colleagues conducted a population-based study by analyzing Dutch national cancer registry data and FIT results around the final screening at age 75, looking at those who were diagnosed with CRC within 24 months of their last negative FIT. The researchers assessed interval CRC risk and cancer stage, accounting for sex, last f-Hb concentration, and the number of screening rounds.

Among 305,761 people with a complete 24-month follow-up after a negative FIT, 661 patients were diagnosed with interval CRC, indicating an overall interval CRC risk of 21.6 per 10,000 individuals with a negative FIT. There were no significant differences by sex.

However, there were differences by screening rounds, with those who had participated in three or four screening rounds having a lower risk than those who participated only once (HR, 0.49).

In addition, those with detectable f-Hb (>0 μg Hb/g) in their last screening round had a much higher interval CRC risk (HR, 4.87), at 65.8 per 10,000 negative FITs, compared with 13.8 per 10,000 among those without detectable f-Hb. Interval CRC risk also increased over time for those with detectable f-Hb.

About 15% of the total population had detectable f-Hb, whereas 46% of those with interval CRC had detectable f-Hb, Ms. van Stigt said, meaning that nearly half of patients who were diagnosed with interval CRC already had detectable f-Hb in their prior FIT.

In a survival analysis, there was no association between interval CRC risk and sex. However, those who participated in three or four screening rounds were half as likely to be diagnosed as those who participated once or twice, and those with detectable f-Hb were five times as likely to be diagnosed.

For late-stage CRC, there was no association with sex or the number of screening rounds. Detectable f-Hb was associated with not only a higher risk of interval CRC but also a late-stage diagnosis.

“These findings indicate that one uniform age to stop screening is suboptimal,” Ms. van Stigt said. “Personalized screening strategies should, therefore, also ideally incorporate a risk-stratified age to stop screening.”

The US Preventive Services Task Force recommends that clinicians personalize screening for ages 76-85, accounting for overall health, prior screening history, and patient preferences.

“But we have no clear guidance on how to quantify or weigh these factors. This interesting study highlights how one of these factors (prior screening history) and fecal hemoglobin level (an emerging factor) are powerful stratifiers of subsequent colorectal cancer risk,” said Sameer D. Saini, MD, AGAF, director and research investigator at the VA Ann Arbor Healthcare System’s Center for Clinical Management Research. Dr. Saini wasn’t involved with the study.


At the clinical level, Dr. Saini said, sophisticated modeling is needed to understand the interaction with competing risks and identify the optimal screening strategies for patients at varying levels of cancer risk and life expectancy. Models could also help to quantify the population benefits and cost-effectiveness of personalized screening.

“Finally, it is important to note that, in many health systems, access to quantitative FIT may be limited,” he said. “These data may be less informative if colonoscopy is the primary mode of screening.”

Ms. van Stigt and Dr. Saini reported no relevant disclosures.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

A risk-stratified upper age limit may be beneficial for colorectal cancer (CRC) screening among patients who are ages 75 and older, according to a study presented at the annual Digestive Disease Week® (DDW).

In particular, interval CRC risk can vary substantially based on the fecal hemoglobin (f-Hb) concentration in the patient’s last fecal immunochemical test (FIT), as well as the number of prior screening rounds.

“Less is known about what happens after the upper age limit has been reached and individuals are not invited to participate in more screening rounds. This is important as life expectancy is increasing, and it is increasingly important to consider the most efficient way of screening the elderly,” said lead author Brenda van Stigt, a PhD candidate focused on cancer screening at Erasmus University Medical Center in Rotterdam, the Netherlands.

In the Netherlands, adults between ages 55 and 75 are invited to participate in stool-based CRC screening every 2 years. Based on a fecal immunochemical testing (FIT) threshold of 47 μg Hb/g, those who test positive are referred to colonoscopy, and those who test negative are invited to participate again after a 2-year period.

FIT can play a major role in risk stratification, Ms. van Stigt noted, along with other factors that influence CRC risk, such as age, sex, and CRC screening history. Although this is documented for ages 55-75, she and colleagues wanted to know more about what happens after age 75.

Ms. Van Stigt and colleagues conducted a population-based study by analyzing Dutch national cancer registry data and FIT results around the final screening at age 75, looking at those who were diagnosed with CRC within 24 months of their last negative FIT. The researchers assessed interval CRC risk and cancer stage, accounting for sex, last f-Hb concentration, and the number of screening rounds.

Among 305,761 people with a complete 24-month follow-up after a negative FIT, 661 patients were diagnosed with interval CRC, indicating an overall interval CRC risk of 21.6 per 10,000 individuals with a negative FIT. There were no significant differences by sex.

However, there were differences by screening rounds, with those who had participated in three or four screening rounds having a lower risk than those who participated only once (HR, 0.49).

In addition, those with detectable f-Hb (>0 μg Hb/g) in their last screening round had a much higher interval CRC risk (HR, 4.87), at 65.8 per 10,000 negative FITs, compared with 13.8 per 10,000 among those without detectable f-Hb. Interval CRC risk also increased over time for those with detectable f-Hb.
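As a plain-arithmetic sanity check (illustrative only, using the figures reported above), the overall rate and the crude rate ratio can be reproduced in a few lines:

```python
# Sanity check of the interval CRC figures reported above (illustration only).

cases = 661           # interval CRC diagnoses within 24 months of a negative FIT
population = 305_761  # people with complete 24-month follow-up

overall_risk = cases / population * 10_000
print(f"Overall risk: {overall_risk:.1f} per 10,000 negative FITs")  # 21.6

# Crude rate ratio for detectable vs undetectable f-Hb; the adjusted HR
# reported in the study (4.87) is close to this unadjusted ratio.
print(f"Crude rate ratio: {65.8 / 13.8:.2f}")  # 4.77
```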

About 15% of the total population had detectable f-Hb, compared with 46% of those with interval CRC, Ms. van Stigt said, meaning that nearly half of the patients diagnosed with interval CRC already had detectable f-Hb on their prior FIT.

In a survival analysis, there was no association between interval CRC risk and sex. However, those who participated in three or four screening rounds were half as likely to be diagnosed as those who participated once or twice, and those with detectable f-Hb were five times as likely to be diagnosed.

For late-stage CRC, there was no association with sex or the number of screening rounds. Detectable f-Hb was associated with not only a higher risk of interval CRC but also a late-stage diagnosis.

“These findings indicate that one uniform age to stop screening is suboptimal,” Ms. van Stigt said. “Personalized screening strategies should, therefore, also ideally incorporate a risk-stratified age to stop screening.”

The US Preventive Services Task Force recommends that clinicians personalize screening for ages 76-85, accounting for overall health, prior screening history, and patient preferences.

“But we have no clear guidance on how to quantify or weigh these factors. This interesting study highlights how one of these factors (prior screening history) and fecal hemoglobin level (an emerging factor) are powerful stratifiers of subsequent colorectal cancer risk,” said Sameer D. Saini, MD, AGAF, director and research investigator at the VA Ann Arbor Healthcare System’s Center for Clinical Management Research. Dr. Saini wasn’t involved with the study.


At the clinical level, Dr. Saini said, sophisticated modeling is needed to understand the interaction with competing risks and identify the optimal screening strategies for patients at varying levels of cancer risk and life expectancy. Models could also help to quantify the population benefits and cost-effectiveness of personalized screening.

“Finally, it is important to note that, in many health systems, access to quantitative FIT may be limited,” he said. “These data may be less informative if colonoscopy is the primary mode of screening.”

Ms. van Stigt and Dr. Saini reported no relevant disclosures.


Article Source

FROM DDW 2024


Statins, Vitamin D, and Exercise in Older Adults

Article Type
Changed
Mon, 07/29/2024 - 15:09

In this article, I will review several recently published articles and guidelines relevant to the care of older adults in primary care. The articles of interest address statins for primary prevention, vitamin D supplementation and testing, and physical activity for healthy aging.
 

Statins for Primary Prevention of Cardiovascular Disease

A common conundrum in primary care is whether an older adult should be on a statin for primary prevention. This question has been difficult to answer because of the underrepresentation of older adults in clinical trials that examine the effect of statins for primary prevention. A recent study by Xu et al. published in Annals of Internal Medicine sought to address this gap in knowledge, investigating the risks and benefits of statins for primary prevention for older adults.1

This study stratified participants by “old” (aged 75-84 years) and “very old” (85 years or older). In this study, older adults who had an indication for statins were initiated on statins and studied over a 5-year period and compared with age-matched cohorts not initiated on statin therapy. Participants with known cardiovascular disease at baseline were excluded. The outcomes of interest were major cardiovascular disease (CVD) (a composite of myocardial infarction, stroke, or heart failure), all-cause mortality, and adverse effect of drug therapy (myopathy or liver dysfunction).

The study found that among older adults aged 75-84, initiation of statin therapy led to a 1.2% risk reduction in major CVD over a 5-year period. For older adults aged 85 or older, initiation of statins had an even larger impact, leading to a 4.4% risk reduction in major CVD over the same period. There was no significant difference in adverse effects, including myopathy or liver dysfunction, in either age group.

Statins, the study suggests, are appropriate and safe to initiate for primary prevention in older adults and can lead to substantial reductions in CVD. While time to benefit was not explicitly examined in this study, a prior study by Yourman et al. suggested that the time to benefit for statins for primary prevention in adults aged 50-75 is at least 2.5 years.2
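One way to put the figures above in clinical context is the number needed to treat (NNT). The study did not report NNT; the sketch below derives it from the reported risk reductions, assuming they are absolute risk differences:

```python
# Back-of-envelope NNT from the risk reductions reported above (not reported
# in the study itself; assumes the figures are absolute risk differences).

arr_75_84 = 0.012    # 1.2% reduction in major CVD over 5 years, ages 75-84
arr_85_plus = 0.044  # 4.4% reduction over 5 years, ages 85+

print(f"NNT over 5 years, ages 75-84: {1 / arr_75_84:.0f}")   # ~83
print(f"NNT over 5 years, ages 85+:  {1 / arr_85_plus:.0f}")  # ~23
```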

My takeaway from these findings is to discuss statin initiation for primary prevention for older patients who are focused on longevity, have good functional status (often used in geriatrics as a proxy for prognosis), and are willing to accept more medications.
 

Empiric Vitamin D Supplementation in Adults over 75 Years

Vitamin D is one of the most common supplements taken by older adults, but the published evidence supporting supplementation is mixed, as most data come from observational studies. New guidelines from the Endocrine Society focused on developing recommendations for healthy individuals, drawing on randomized controlled trials (RCTs) and, where RCTs were unavailable, large longitudinal observational studies with comparison groups. These guidelines recommend against empiric vitamin D supplementation for healthy adults aged 18-74, excluding pregnant women and patients with high-risk prediabetes.3

For older adults aged 75 or greater, empiric vitamin D supplementation is recommended because of the possible reduction of risk in all-cause mortality in this population. Of note, this was a grade 2 recommendation by the panel, indicating that the benefits of the treatment probably outweigh the risks. The panel stated that vitamin D supplementation could be delivered through fortified foods, multivitamins with vitamin D, or as a separate vitamin D supplement.

The dosage should remain within the recommended daily allowance outlined by the Institute of Medicine of 800 IU daily for adults over 70, and the panel recommends low-dose daily vitamin D supplementation over high-dose interval supplementation. The panel noted that routine screening of vitamin D levels should not be used to guide decision-making on whether to start supplementation, but vitamin D levels should be obtained for patients who have an indication for evaluation.

The reviewers highlight that these guidelines were developed for healthy individuals and are not applicable to those with conditions that warrant vitamin D evaluation. In my clinical practice, many of my patients have bone-mineral conditions and cognitive impairment that warrant evaluation. Based on these guidelines, I will consider empiric vitamin D supplementation more often for healthy patients aged 75 and older.
 

 

 

Sedentary Behaviors and Healthy Aging

Engaging inactive older adults in regular physical activity can be challenging, particularly as the pandemic has led to more pervasive social isolation and affected the availability of in-person exercise activities in the community. Physical activity is a key component of healthy aging and cognition, and its benefits should be a part of routine counseling for older adults.

An interesting recent study published in JAMA Network Open by Shi et al. evaluated the association of health behaviors and aging in female US nurses over a 20-year period.4 Surveys were administered to capture time spent in each behavior, such as being sedentary (TV watching, sitting at home or at work), light activity (walking around the house or at work), and moderate to vigorous activity (walking for exercise, lawn mowing). “Healthy aging” was defined by the absence of chronic conditions such as heart failure, and lack of physical, mental, and cognitive impairment.

The study found that participants who were more sedentary were less likely to age healthfully, with each additional 2 hours of TV watching per day associated with a 12% reduction in likelihood of healthy aging. Light physical activity was associated with a significant increase in healthy aging, with a 6% increase in the likelihood of healthy aging for each additional 2 hours of light activity. Each additional 1 hour of moderate to vigorous activity was associated with a 14% increase in the likelihood of healthy aging. These findings support discussions with patients that behavior change, even in small increments, can be beneficial in healthy aging.
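Scaled to a common per-hour basis, the associations above can be compared directly. This rescaling assumes the per-increment estimates scale linearly, which the study does not claim; it is purely illustrative:

```python
# Per-hour comparison of the reported associations with healthy aging
# (illustrative only; assumes linear scaling of the per-increment estimates).

tv_per_2h = -0.12     # 12% lower likelihood of healthy aging per 2 h/day of TV
light_per_2h = 0.06   # 6% higher per 2 h/day of light activity
mvpa_per_1h = 0.14    # 14% higher per 1 h/day of moderate-to-vigorous activity

print(f"TV watching:          {tv_per_2h / 2:+.0%} per hour")
print(f"Light activity:       {light_per_2h / 2:+.0%} per hour")
print(f"Moderate-to-vigorous: {mvpa_per_1h:+.0%} per hour")
```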
 

References

1. Xu W et al. Ann Intern Med. 2024 Jun;177(6):701-10.

2. Yourman LC et al. JAMA Intern Med. 2021;181:179-85.

3. Demay MB et al. J Clin Endocrinol Metab. August 2024;109(8):1907-47.

4. Shi H et al. JAMA Netw Open. 2024;7(6):e2416300.


Atogepant May Prevent Rebound Headache From Medication Overuse in Chronic Migraine

Article Type
Changed
Mon, 07/29/2024 - 15:15

The oral calcitonin gene-related peptide receptor antagonist atogepant is effective in preventing rebound headache related to medication overuse in patients with chronic migraine (CM), new research suggested.

Results of a subgroup analysis of a phase 3, 12-week randomized, double-blind, placebo-controlled trial showed up to a 62% reduction in the proportion of atogepant-treated participants who met acute medication overuse criteria.

“Based on our findings, treatment with atogepant may potentially decrease the risk of developing rebound headache by reducing the use of pain medications,” principal investigator Peter Goadsby, MD, PhD, of King’s College London, London, England, said in a news release.

The study was published online in Neurology.
 

Effective Prevention Needed

Acute treatments for migraine can mitigate symptoms and reduce disability but can also be ineffective and even result in increased dosing and overuse of these medications, the investigators noted.

Acute medication overuse is defined as “taking simple analgesics for ≥ 15 days per month or taking triptans, ergots, opioids, or combinations of medications for ≥ 10 days per month.”

“There is a high prevalence of pain medication overuse among people with migraine as they try to manage what are often debilitating symptoms,” Dr. Goadsby said. “However, medication overuse can lead to more headaches, called rebound headaches, so more effective preventive treatments are needed.”

Atogepant was developed for migraine prevention in adults. It had been studied in the phase 3 PROGRESS trial, which showed it significantly reduced monthly migraine days (MMDs) compared with placebo during the 12-week trial.

The new subgroup analysis of the study focused specifically on the efficacy and safety of atogepant vs placebo in participants with CM with, and without, medication overuse.

Participants (mean age, 42.1 years; 87.6% women) were randomized to receive either atogepant 30 mg twice daily (n = 253), atogepant 60 mg once daily (n = 256), or placebo (n = 240), with baseline demographics and clinical characteristics similar across all treatment arms. A total of 66.2% met baseline acute medication overuse criteria.

Participants were asked to record migraine and headache experiences in an electronic diary.
 

‘Effective and Safe’

Participants in both atogepant groups experienced fewer monthly migraine days than those in the placebo group, with a least squares mean difference (LSMD) of −2.7 (95% confidence interval [CI], −4.0 to −1.4) in the atogepant 30 mg twice daily group and −1.9 (95% CI, −3.2 to −0.6) in the atogepant 60 mg once daily group.

Monthly headache days were also reduced in both treatment groups, with LSMDs of −2.8 (95% CI, −4.0 to −1.5) and −2.1 (95% CI, −3.3 to −0.8), respectively. Mean acute medication use days were lower in both treatment groups, with LSMDs of −2.8 (95% CI, −4.1 to −1.6) and −2.6 (95% CI, −3.9 to −1.3), respectively.

A higher proportion of participants achieved a ≥ 50% reduction in MMDs with atogepant 30 mg twice daily (odds ratio [OR], 2.5; 95% CI, 1.5-4.0) and atogepant 60 mg once daily (OR, 2.3; 95% CI, 1.4-3.7).

Notably, the researchers found a 52.1%-61.9% reduction in the proportion of atogepant-treated participants meeting acute medication overuse criteria during the study period vs 38.3% in the placebo group.
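To make those percentages concrete, they can be applied to the 66.2% of participants who met acute medication overuse criteria at baseline. This is one plausible reading of the result (the study's exact denominators are not given here), so the figures below are illustrative only:

```python
# Illustration of the relative reductions reported above, applied to the
# baseline proportion meeting acute medication overuse criteria (66.2%).
# Assumes the reported percentages are relative reductions from baseline.

baseline = 0.662
reductions = {
    "atogepant (low end)": 0.521,
    "atogepant (high end)": 0.619,
    "placebo": 0.383,
}
for arm, reduction in reductions.items():
    remaining = baseline * (1 - reduction)
    print(f"{arm}: ~{remaining:.1%} still meeting overuse criteria")
```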

Similar results were observed in the subgroup without acute medication overuse.

Treatment-emergent adverse events were reported by 55.8% of participants treated with atogepant 30 mg twice daily, 66.1% with atogepant 60 mg once daily, and 48.5% with placebo in the acute medication overuse subgroup, with similar reports in the non-overuse subgroup.

A limitation cited by the authors was that participants’ self-report of migraines and headaches via electronic diaries might have been inaccurate.

Nevertheless, they concluded that the results showed atogepant to be an “effective and safe” preventive treatment for patients with CM with, and without, acute medication overuse.

AbbVie funded this study and participated in the study design, research, analysis, data collection, interpretation of data, reviewing, and approval of the publication. No honoraria or payments were made for authorship. Dr. Goadsby received personal fees from AbbVie during the conduct of the study, and over the last 36 months, he received a research grant from Celgene; personal fees from Aeon Biopharma, Amgen, CoolTechLLC, Dr. Reddy’s, Eli Lilly and Company, Epalex, Lundbeck, Novartis, Pfizer, Praxis, Sanofi, Satsuma, ShiraTronics, Teva Pharmaceuticals, and Tremeau; personal fees for advice through Gerson Lehrman Group, Guidepoint, SAI Med Partners, and Vector Metric; fees for educational materials from CME Outfitters; and publishing royalties or fees from Massachusetts Medical Society, Oxford University Press, UpToDate, and Wolters Kluwer. The other authors’ disclosures are listed on the original paper.

A version of this article first appeared on Medscape.com.


FROM NEUROLOGY


TBI Significantly Increases Mortality Rate Among Veterans With Epilepsy

Article Type
Changed
Thu, 07/18/2024 - 10:11

Veterans diagnosed with epilepsy have a significantly higher mortality rate if they experience a traumatic brain injury either before or within 6 months of an epilepsy diagnosis, according to recent research published in Epilepsia.

In a retrospective cohort study, Ali Roghani, PhD, of the division of epidemiology at the University of Utah School of Medicine in Salt Lake City, and colleagues evaluated 938,890 veterans between 2000 and 2019 in the Defense Health Agency and the Veterans Health Administration who served in the US military after the September 11 attacks. Overall, 27,436 veterans met criteria for a diagnosis of epilepsy, 264,890 had received a diagnosis for a traumatic brain injury (TBI), and the remaining patients had neither epilepsy nor TBI.

Among the veterans with no epilepsy, 248,714 veterans had a TBI diagnosis, while in the group of patients with epilepsy, 10,358 veterans experienced a TBI before their epilepsy diagnosis, 1598 were diagnosed with a TBI within 6 months of epilepsy, and 4310 veterans had a TBI 6 months after an epilepsy diagnosis. The researchers assessed all-cause mortality in each group, calculating cumulative mortality rates compared with the group of veterans who had no TBI and no epilepsy diagnosis.

Dr. Roghani and colleagues found a significantly higher mortality rate among veterans who developed epilepsy than in the control group with neither epilepsy nor TBI (6.26% vs. 1.12%; P < .01); most of the veterans who died were White (67.4%) men (89.9%). Compared with deceased veterans, nondeceased veterans were significantly more likely to have a history of deployment (70.7% vs. 64.8%; P < .001), less likely to have served in the Army (52.2% vs. 55.0%; P < .001), and more likely to have reached the rank of officer or warrant officer (8.1% vs. 7.6%; P = .014).

There were also significant differences in clinical characteristics between nondeceased and deceased veterans, including a higher rate of substance abuse disorder, smoking history, cardiovascular disease, stroke, transient ischemic attack, cancer, liver disease, kidney disease, or other injury as well as overdose, suicidal ideation, and homelessness. “Most clinical conditions were significantly different between deceased and nondeceased in part due to the large cohort size,” the researchers said.

After performing Cox regression analyses, the researchers found a higher mortality risk among veterans who developed a TBI within 6 months of an epilepsy diagnosis (hazard ratio [HR], 5.02; 95% CI, 4.21-5.99), those who had a TBI prior to epilepsy (HR, 4.25; 95% CI, 3.89-4.58), those who had epilepsy alone (HR, 4.00; 95% CI, 3.67-4.36), those who had a TBI more than 6 months after an epilepsy diagnosis (HR, 2.49; 95% CI, 2.17-2.85), and those who had a TBI alone (HR, 1.30; 95% CI, 1.25-1.36), all compared with veterans who had neither epilepsy nor a TBI.

“The temporal relationship with TBI that occurred within 6 months after epilepsy diagnosis may suggest an increased vulnerability to accidents, severe injuries, or TBI resulting from seizures, potentially elevating mortality risk,” Dr. Roghani and colleagues wrote.

The researchers said the results “raise concerns” about the subgroup of patients who are diagnosed with epilepsy close to experiencing a TBI.

“Our results provide information regarding the temporal relationship between epilepsy and TBI regarding mortality in a cohort of post-9/11 veterans, which highlights the need for enhanced primary prevention, such as more access to health care among people with epilepsy and TBI,” they said. “Given the rising incidence of TBI in both the military and civilian populations, these findings suggest close monitoring might be crucial to develop effective prevention strategies for long-term complications, particularly [post-traumatic epilepsy].”

Reevaluating the Treatment of Epilepsy

Juliann Paolicchi, MD, a neurologist and member of the epilepsy team at Northwell Health in New York, who was not involved with the study, said in an interview that TBIs have been studied more closely since the beginning of conflicts in the Middle East, particularly in Iraq and Afghanistan, where “newer artillery causes more diffuse traumatic injury to the brain and the body than the effects of more typical weaponry.”

The study by Roghani and colleagues, she said, “is groundbreaking in that it looks at the connection and timing of these two disruptive forces, epilepsy and TBI, on the brain. The study reveals that timing is everything: The combination of two disrupting circuitry effects in proximity can have a deadly effect. The summation is greater than either alone in veterans, and has significant effects on the brain’s ability to sustain the functions that keep us alive.”

The 6 months following either a diagnosis of epilepsy or TBI is “crucial,” Dr. Paolicchi noted. “Military and private citizens should be closely monitored during this period, and the results suggest they should refrain from activities that could predispose to further brain injury.”

In addition, current standards for treatment of epilepsy may need to be reevaluated, she said. “Patients are not always treated with a seizure medication after a first seizure, but perhaps, especially in patients at higher risk for brain injury such as the military and athletes, that policy warrants further examination.”

The findings by Roghani and colleagues may also extend to other groups, such as evaluating athletes after a concussion, patients after they are in a motor vehicle accident, and infants with traumatic brain injury, Dr. Paolicchi said. “The results suggest a reexamining of the proximity [of TBI] and epilepsy in these and other areas,” she noted.

The authors reported personal and institutional relationships in the form of research support and other financial compensation from AbbVie, Biohaven, CURE, Department of Defense, Department of Veterans Affairs (VA), Eisai, Engage, National Institutes of Health, Sanofi, SCS Consulting, Sunovion, and UCB. This study was supported by funding from the Department of Defense, VA Health Systems, and the VA HSR&D Informatics, Decision Enhancement, and Analytic Sciences Center of Innovation. Dr. Paolicchi reports no relevant conflicts of interest.


FROM EPILEPSIA


Study Estimates Global Prevalence of Seborrheic Dermatitis

Article Type
Changed
Wed, 07/17/2024 - 10:52

 

TOPLINE:

Seborrheic dermatitis affects an estimated 4% of the global population, with significant variations across age groups, settings, and regions, according to a meta-analysis that also found a higher prevalence in adults than in children.

METHODOLOGY:

  • Researchers conducted a meta-analysis of 121 studies, which included 1,260,163 people with clinician-diagnosed seborrheic dermatitis.
  • The included studies represented nine countries; most were from India (n = 18), Turkey (n = 13), and the United States (n = 8).
  • The primary outcome was the pooled prevalence of seborrheic dermatitis.

TAKEAWAY:

  • The overall pooled prevalence of seborrheic dermatitis was 4.38%: 4.08% in clinical settings and 4.71% in studies conducted in the general population.
  • The prevalence of seborrheic dermatitis was higher among adults (5.64%) than in children (3.7%) and neonates (0.23%).
  • A significant variation was observed across countries, with South Africa having the highest prevalence at 8.82%, followed by the United States at 5.86% and Turkey at 3.74%, while India had the lowest prevalence at 2.62%.

IN PRACTICE:

The global prevalence in this meta-analysis was “higher than previous large-scale global estimates, with notable geographic and sociodemographic variability, highlighting the potential impact of environmental factors and cultural practices,” the authors wrote.

SOURCE:

The study was led by Meredith Tyree Polaskey, MS, Chicago Medical School, Rosalind Franklin University of Medicine and Science, North Chicago, Illinois, and was published online on July 3, 2024, in JAMA Dermatology.

LIMITATIONS:

Interpretation of the findings is limited by research gaps in Central Asia, much of Sub-Saharan Africa, Eastern Europe, Southeast Asia, Latin America (excluding Brazil), and the Caribbean, along with potential underreporting in regions with restricted healthcare access and significant heterogeneity across studies.

DISCLOSURES:

Funding information was not available. One author reported serving as an advisor, consultant, speaker, and/or investigator for multiple pharmaceutical companies, including AbbVie, Amgen, and Pfizer.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.


 



Most Potentially Hepatotoxic Meds Revealed: Real-World Data Analysis

Changed: Mon, 07/22/2024 - 22:49

 

TOPLINE:

An analysis of real-world evidence identified 17 medications, many not previously regarded as potentially hepatotoxic, that have high incidence rates of patient hospitalization for acute liver injury (ALI), offering insights on how to better determine which drugs carry the most significant risk and warrant liver monitoring.

METHODOLOGY:

  • Without a systematic approach to classifying medications’ hepatotoxic risk, researchers have used case reports published on the National Institutes of Health’s LiverTox, which doesn’t account for the number of people exposed, to categorize drugs’ likelihood of causing ALI. The objective was to identify the most potentially hepatotoxic medications using real-world incidence rates of severe ALI.
  • Researchers analyzed US Department of Veterans Affairs electronic health record data for almost 7.9 million individuals (mean age, 64.4 years; 92.5% men) without preexisting liver or biliary disease who were initiated in an outpatient setting on any one of 194 medications with four or more published reports of hepatotoxicity. Drugs delivered by injection or intravenously, prescribed for alcohol use disorder or liver disease treatment, or used as an anticoagulant were not included in the study.
  • The primary outcome measured was hospitalization for severe ALI, defined by alanine aminotransferase levels > 120 U/L and total bilirubin levels > 2.0 mg/dL or the international normalized ratio ≥ 1.5 and total bilirubin levels > 2.0 mg/dL within the first 2 days of admission.
  • Researchers organized the medications into groups on the basis of observed rates of severe ALI per 10,000 person-years and classified drugs with 10 or more hospitalizations (group 1) and 5-9.9 hospitalizations (group 2) as the most potentially hepatotoxic. The study period was October 2000 through September 2021.

TAKEAWAY:

  • Among the study population, 1739 hospitalizations for severe ALI were identified. Incidence rates of severe ALI varied widely by medication, from 0 to 86.4 events per 10,000 person-years.
  • Seventeen medications were classified as the most potentially hepatotoxic (groups 1 and 2). Seven of them (stavudine, erlotinib, lenalidomide or thalidomide, chlorpromazine, metronidazole, prochlorperazine, and isoniazid) had incidence rates of ≥ 10 events per 10,000 person-years. The other 10 medications (moxifloxacin, azathioprine, levofloxacin, clarithromycin, ketoconazole, fluconazole, captopril, amoxicillin-clavulanate, sulfamethoxazole-trimethoprim, and ciprofloxacin) showed incidence rates of 5-9.9 events per 10,000 person-years.
  • Of the 17 most hepatotoxic medications, 11 (64%) were not classified as highly hepatotoxic in the published case reports, suggesting a discrepancy between real-world data and case report categorizations.
  • Similarly, several medications, including some statins, identified as low-risk in this study were classified as among the most hepatotoxic in the published case reports.

IN PRACTICE:

“Categorization of hepatotoxicity based on the number of published case reports did not accurately reflect observed rates of severe ALI (acute liver injury),” the researchers wrote. “This study represents a systematic, reproducible approach to using real-world data to measure rates of severe ALI following medication initiation among patients without liver or biliary disease…Patients initiating a medication with a high rate of severe ALI might require closer monitoring of liver-related laboratory tests to detect evolving hepatic dysfunction earlier, which might improve prognosis.”

The study illustrates the potential to use electronic health record data to “revolutionize how we characterize drug-related toxic effects,” not just on the liver but other organs, Grace Y. Zhang, MD, and Jessica B. Rubin, MD, MPH, of the University of California, San Francisco, wrote in an accompanying editorial. “If curated and disseminated effectively…such evidence will undoubtedly improve clinical decision-making and allow for more informed patient counseling regarding the true risks of starting or discontinuing medications.”

SOURCE:

The study, led by Jessie Torgersen, MD, MHS, MSCE, of the Division of Infectious Diseases, Department of Medicine, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, was published online in JAMA Internal Medicine.

LIMITATIONS:

The researchers listed several limitations, including the possibility that reliance on laboratory tests for ascertainment of acute liver injuries could introduce surveillance bias. The study focused on a population predominantly consisting of men without preexisting liver or biliary disease, so the findings may not be generalizable to women or individuals with liver disease. Additionally, researchers did not perform a causality assessment of all outcomes, did not study medications with fewer than four published case reports, and did not evaluate the influence of dosage.

DISCLOSURES:

This study was partly funded by several grants from the National Institutes of Health. Some authors declared receiving grants and personal fees from some of the funding agencies and other sources outside of this work.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.


 



What Are the Ethics of Sex and Romance for Older Adults in Nursing Homes?

Changed: Wed, 07/17/2024 - 15:16

This transcript has been edited for clarity. 

I had a case a couple years ago in which I found myself completely at odds with the person complaining. A daughter came to me and said [paraphrasing], look, my dad is in a nursing home, and he’s just there for care that he needs, but he’s mentally competent. He’s enjoying watching television, playing games. He plays bridge and does many things. The nursing home is letting him have a romantic relationship with a woman who’s also in the nursing home. I think you, ethicist, should both intervene and try to stop that, and write more about the immorality of facilities like nursing homes or other long-term care settings permitting romance or sexual relations to take place. 

I was reminded of that case because a report recently appeared that sexually transmitted diseases are on the rise among the elderly, both in nursing homes and in other settings. This obviously is linked up to another technological advance: the erectile dysfunction drugs. 

I’m sure there are many men who, at one point in their lives, could not engage in sexual activity due to impotence. We have found a treatment for erectile dysfunction. Loads and loads of men are using it, and we forget that some of them are going to be older. The rate of impotence goes up directly with aging. If you’re in a nursing home, home care, or wherever you are, you may find yourself able to engage in sex in a way that your dad or your granddad may not have been. 

We also know — and I found this out when I was tracking sales of erectile dysfunction drugs — that some of these older men are going to visit prostitutes. That’s another route, unsafe sex, for sexual diseases to be spreading into various older communities. 

Morally, I think every individual who is competent and wishes to engage in a romantic or sexual relationship should be able to do so. If they’re within a marriage and they want to resume sexual activity because they get better or they can use these drugs, well, that’s great. If they’re single and they’re just living with others and they form an interesting romantic relationship, why shouldn’t they be allowed to engage in sex? 

It is not only something that I didn’t agree with the complaining daughter about, but I also think some of these facilities should make more room for privacy and more opportunity for intimacy. It’s not like we should tell granddad that he’s living in a college dorm and try to make sure that his roommate doesn’t come in if he’s going to have his girlfriend over. 

We can do better and we ought to do better. We ought to make sexuality and romance part of the possibility of enjoying your older years, if that’s what you wish to do. 

Are there ethical issues? Sure. Obviously, we should remember, if we have older patients, to talk to them about sexually transmitted diseases as part of a discussion of their sex life. We shouldn’t presume that they’re not doing something. We should presume that they might be, and then remind them about safe sex, particularly if they’re going to use third parties like prostitutes. 

Competency becomes important. It’s one thing to have a mutually agreed upon romantic relationship. It’s another thing if somebody is taking advantage of someone who has Alzheimer’s or severe mental dysfunction and they’re not consenting. 

How do we determine that and how do we manage that? I think people who are incompetent need to be protected from sexual advances unless they have a relative or someone who says they can engage if they enjoy it and it brings them pleasure. I wouldn’t want vulnerable people to be exploited, or allow others to act in a predatory way toward them. 

As I said, we need to rethink the design of where older people are living, whether it’s assisted living, nursing home living, or wherever, just to give them the opportunity to have a full life, as any individual would have once they’re past the age of majority, no matter who they want to have romance with and what they want to do in terms of how far that intimacy goes. 

Sadly, I didn’t agree with the daughter who came to me and asked me to stop it. I wouldn’t stop it nor would I publish against it. There are risks that we ought to be aware of, including exploiting vulnerable people if they can’t consent, and the danger of transmission of disease, as would be true in any group that might engage in high-risk behavior. 

Another risk may be injury if someone is frail and can’t physically sustain sexual intimacy because they’re just too frail to do it. We also need to be sure to address the issue of sexuality with patients to make sure they know what’s going on, what risks there are, what rights they have, and so on. 

At the end of the day, I’m not in the camp that says, “Just say no” when it comes to sex among the elderly. 

Dr. Caplan is director, Division of Medical Ethics, New York University Langone Medical Center, New York. He has served as a director, officer, partner, employee, advisor, consultant, or trustee for Johnson & Johnson’s Panel for Compassionate Drug Use (unpaid position); he also serves as a contributing author and advisor for Medscape.

A version of this article first appeared on Medscape.com.


This transcript has been edited for clarity. 

I had a case a couple years ago in which I found myself completely at odds with the person complaining. A daughter came to me and said [paraphrasing], look, my dad is in a nursing home, and he’s just there for care that he needs, but he’s mentally competent. He’s enjoying watching television, playing games. He plays bridge and does many things. The nursing home is letting him have a romantic relationship with a woman who’s also in the nursing home. I think you, ethicist, should both intervene and try to stop that, and write more about the immorality of facilities like nursing homes or other long-term care settings permitting romance or sexual relations to take place. 

I was reminded of that case because a report recently appeared that sexually transmitted diseases are on the rise among the elderly, both in nursing homes and in other settings. This obviously is linked up to another technological advance: the erectile dysfunction drugs. 

I’m sure there are many men who, at one point in their lives, could not engage in sexual activity due to impotence. We have found a treatment for erectile dysfunction. Loads and loads of men are using it, and we forget that some of them are going to be older. The rate of impotence goes up directly with aging. If you’re in a nursing home, home care, or wherever you are, you may find yourself able to engage in sex in a way that your dad or your granddad may not have been. 

We also know — and I found this out when I was tracking sales of erectile dysfunction drugs — that some of these older men are going to visit prostitutes. That’s another route, unsafe sex, for sexual diseases to be spreading into various older communities. 

Morally, I think every individual who is competent and wishes to engage in a romantic or sexual relationship should be able to do so. If they’re within a marriage and they want to resume sexual activity because they get better or they can use these drugs, well, that’s great. If they’re single and they’re just living with others and they form an interesting romantic relationship, why shouldn’t they be allowed to engage in sex? 

It is not only something that I didn’t agree with the complaining daughter about, but also I think some of these facilities should make more rooms for privacy and more opportunity for intimacy. It’s not like we should tell granddad that he’s living in a college dorm and try to make sure that his roommate doesn’t come in if he’s going to have his girlfriend over. 

We can do better and we ought to do better. We ought to make sexuality and romance part of the possibility of enjoying your older years, if that’s what you wish to do. 

Are there ethical issues? Sure. Obviously, we should remember, if we have older patients, to talk to them about sexually transmitted diseases as part of a discussion of their sex life. We shouldn’t presume that they’re not doing something. We should presume that they might be, and then remind them about safe sex, particularly if they’re going to use third parties like prostitutes. 

Competency becomes important. It’s one thing to have a mutually agreed upon romantic relationship. It’s another thing if somebody is taking advantage of someone who has Alzheimer’s or severe mental dysfunction and they’re not consenting. 

How do we determine that and how do we manage that? I think people who are incompetent need to be protected from sexual advances unless they have a relative or someone who says they can engage if they enjoy it and it brings them pleasure. I wouldn’t want people who are vulnerable to be exploited or to have others acting in a predatory way toward them. 

As I said, we need to rethink the design of where older people are living, whether it’s assisted living, nursing home living, or wherever, just to give them the opportunity to have a full life, as any individual would have once they’re past the age of majority, no matter who they want to have romance with and what they want to do in terms of how far that intimacy goes. 

Sadly, I didn’t agree with the daughter who came to me and asked me to stop it. I wouldn’t stop it nor would I publish against it. There are risks that we ought to be aware of, including exploiting vulnerable people if they can’t consent, and the danger of transmission of disease, as would be true in any group that might engage in high-risk behavior. 

Another risk may be injury if someone is too frail to physically sustain sexual intimacy. We also need to be sure to address the issue of sexuality with patients to make sure they know what’s going on, what risks there are, what rights they have, and so on. 

At the end of the day, I’m not in the camp that says, “Just say no” when it comes to sex among the elderly. 

Dr. Caplan is director, Division of Medical Ethics, New York University Langone Medical Center, New York. He has served as a director, officer, partner, employee, advisor, consultant, or trustee for Johnson & Johnson’s Panel for Compassionate Drug Use (unpaid position); he also serves as a contributing author and advisor for Medscape.

A version of this article first appeared on Medscape.com.

This transcript has been edited for clarity. 



Guidance on How Best to Manage Opioid Risks in Older Adults

Article Type
Changed
Wed, 07/17/2024 - 15:17

Polypharmacy and slow metabolism of drugs create a high risk for substance use disorder among older adults, raising the odds of intentional and unintentional overdoses. However, screening, assessment, and treatment for substance use disorder occur less often in older adults than in younger adults.

Rates of overdose from opioids increased the most among people aged 65 years and older from 2021 to 2022, compared with younger age groups. Meanwhile, recent data show fewer than half of older adults with opioid use disorder (OUD) receive care for the condition.

“Nobody is immune to developing some kind of use disorder, so don’t just assume that because someone’s 80 years old that there’s no way that they have a problem,” said Sara Meyer, PharmD, a medication safety pharmacist at Novant Health in Winston-Salem, North Carolina. “You never know who’s going to potentially have an issue.”

Clinicians and health systems like Novant are spearheading efforts to best manage older adults who may need opioids because of conditions like chronic pain in an effort to reduce addiction and overdoses.
 

Older Adults Have Unique Needs

A major challenge of treating older adults is their high prevalence of chronic pain and multiple complex chronic conditions. As a result, some of the nonopioid medications clinicians might otherwise prescribe, like nonsteroidal anti-inflammatory drugs, cannot be used, according to Caroline Goldzweig, MD, chief medical officer of the Cedars-Sinai Medical Network in Los Angeles, California.

“Before you know it, the only thing left is an opiate, so you can sometimes be between a rock and a hard place,” she said.

But for adults older than 65 years, opioids can carry problematic side effects, including sedation, cognitive impairment, falls, and fractures.

With those factors in mind, part of a yearly checkup or wellness visit should include time to discuss how a patient is managing their chronic pain, according to Timothy Anderson, MD, an assistant professor of medicine at the University of Pittsburgh, Pittsburgh, Pennsylvania, and codirector of the Prescribing Wisely Lab, a research collaboration between that institution and Beth Israel Deaconess Medical Center in Boston.

When considering a prescription for pain medication, Dr. Anderson said he evaluates the potential worst, best, and average outcomes for a patient. Nonopioid options should always be considered first-line treatment. Patients and physicians often struggle with balancing an option that meets a patient’s goals for pain relief but does not put them at a risk for adverse outcomes, he said.
 

Greater Risk

Older adults experience neurophysiologic effects different from younger people, said Benjamin Han, MD, a geriatrician and addiction medicine specialist at the University of California, San Diego.

Seniors also absorb, metabolize, and excrete drugs differently, sometimes affected by decreased production of gastric acid, lean body mass, and renal function. Coupled with complications of other chronic conditions or medications, diagnosing problematic opioid use or OUD can be one of the most challenging experiences in geriatrics, Dr. Han said.

As a result, OUD is often underdiagnosed in these patients, he said. Single-item screening tools like the TAPS and OWLS can be used to assess if the benefits of an opioid outweigh a patient’s risk for addiction.

Dr. Han finds medications like buprenorphine to be relatively safe and effective, along with nonpharmacologic interventions like physical therapy. He also advised clinicians to provide patients with opioid-overdose reversal agents.

“Naloxone is only used for reversing an opioid overdose, but it is important to ensure that any patient at risk for an overdose, including anyone on chronic opioids, is provided naloxone and educated on preventing opioid overdoses,” he said.

Steroid injections and medications that target specific pathways, such as neuropathic pain, can be helpful in primary care for these older patients, according to Pooja Lagisetty, MD, an internal medicine physician at Michigan Medicine and a research scientist at VA Ann Arbor Health Care, Ann Arbor, Michigan.

She often recommends to her patients online programs that help them maintain strength and mobility, as well as low-impact exercises like tai chi, for pain management.

“This will ensure a much more balanced, patient-centered conversation with whatever decisions you and your patient come to,” Dr. Lagisetty said.

New Protocols for Pain Management in Older Adults

At the health system level, clinicians can use treatment agreements for patients taking opioids. At Novant, patients must attest they agree to take the medications only as prescribed and from a specified pharmacy. They promise not to seek opioids from other sources, to submit to random drug screenings, and to communicate regularly with their clinician about any health issues.

If a patient violates any part of this agreement, their clinician can stop the treatment. The system encourages clinicians to help these patients find additional care for substance use disorder or pain management if needed.

Over the past 2 years, Novant also developed an AI prediction model, which generates a score for the risk a patient has in developing substance use disorder or experiencing an overdose within a year of initial opioid prescription. The model was validated by an internal team at the system but has not been independently certified.

If a patient has a high-risk score, their clinician considers additional risk mitigation strategies, such as seeing the patient more frequently or using an abuse-deterrent formulation of an opioid. They also have the option of referring the patient to specialists in addiction medicine or neurology. Opioids are not necessarily withheld, according to Dr. Meyer. The tool is now used by clinicians during Medicare annual wellness visits.

And coming later this year are new protocols for pain management in patients aged 80 years and older. Clinicians will target a 50% dose reduction compared with what a younger patient might receive, to account for physiologic differences.

“We know that especially with some opioids like morphine, they’re not going to metabolize that the same way a young person with a young kidney will, so we’re trying to set the clinician up to select a lower starting dose for patients that are older,” Dr. Meyer said.

In 2017, the system implemented a program to limit opioid prescriptions to less than 350 morphine milligram equivalents (MME) per order following any kind of surgery. The health system compared the number of prescriptions written among surgical colleagues and met with them to discuss alternative approaches. Novant said it continues to monitor the data and follow up with surgeons who are not in alignment with the goal.

Between 2017 and 2019, patients switching to lower doses after surgeries rose by 20%.

Across the country at Cedars-Sinai Medical Network, leadership in 2016 made the move to deprescribe opioids or lower doses of the drugs to less than 90 MME per day, in accordance with Centers for Disease Control and Prevention guidelines established that year. Patients were referred to their pain program for support and for nonopioid interventions. Pharmacists worked closely with clinicians on safely tapering these medications in patients taking high doses.

The program worked, according to Dr. Goldzweig, who could find only two patients currently taking high-dose opioids in the system’s database of more than 7000 patients with Medicare Advantage insurance coverage.

“There will always be some patients who have no alternative to opioids, but we established some discipline with urine tox screens and pain agreements, and over time, we’ve been able to reduce the number of high-risk opioid prescriptions,” she said.

A version of this article first appeared on Medscape.com.
