Weight Loss Surgery, Obesity Drugs Achieve Similar Results but Have Different Safety Profiles

Roux-en-Y gastric bypass (RYGB) produces the greatest weight loss in patients with obesity when compared with other surgical procedures and with weight loss drugs, according to a meta-analysis of the efficacy and safety of the different treatment options.

However, tirzepatide, a long-acting glucose-dependent insulinotropic polypeptide (GIP) receptor agonist and glucagon-like peptide 1 receptor agonist (GLP-1 RA), produces comparable weight loss and has a favorable safety profile, reported principal investigator Jena Velji-Ibrahim, MD, MSc, from Prisma Health–Upstate/University of South Carolina School of Medicine in Greenville. 

In addition, there was “no significant difference in percentage total body weight loss between tirzepatide when comparing it to one-anastomosis gastric bypass (OAGB), as well as laparoscopic sleeve gastrectomy,” she said. 

All 11 interventions studied exerted weight loss effects, and side-effect profiles were also deemed largely favorable, particularly for endoscopic interventions, she added. 

“When we compare bariatric surgery to bariatric endoscopy, endoscopic sleeve gastroplasty and transpyloric shuttle offer a minimally invasive alternative with good weight loss outcomes and fewer adverse events,” she said.

Velji-Ibrahim presented the findings at the annual meeting of the American College of Gastroenterology (ACG).

Comparing Weight Loss Interventions

Many of the studies comparing weight loss interventions to date have been limited by relatively small sample sizes, observational designs, and inconsistent results. This prompted Velji-Ibrahim and her colleagues to conduct what they believe to be the first-of-its-kind meta-analysis on this topic. 

They began by conducting a systematic search of the literature to identify randomized controlled trials (RCTs) that compared the efficacy of Food and Drug Administration–approved bariatric surgeries, bariatric endoscopies, and medications — against each other or with placebo — in adults with a body mass index of 25-45, with or without concurrent type 2 diabetes. 

A network meta-analysis was then performed to assess the various interventions’ impact on percentage total weight loss and side-effect profiles. P-scores were calculated to rank the treatments and identify the preferred interventions. The duration of therapy was 52 weeks. 
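
For readers curious about the mechanics, a P-score is the frequentist analog of a SUCRA value: each treatment's score is its average probability of outperforming every competitor. The Python sketch below illustrates the calculation using the percentage total weight loss figures reported (or derivable) in this article; the standard errors are invented placeholders, and a real network meta-analysis would use the full covariance structure of the network rather than assuming independent estimates.

```python
# Illustrative P-score ranking (Ruecker & Schwarzer, 2015): each treatment's
# P-score is its average probability of beating every other treatment.
# %TWL values are taken or derived from the reported results; the standard
# errors are hypothetical placeholders.
import numpy as np
from scipy.stats import norm

treatments = ["RYGB", "OAGB", "tirzepatide 15 mg", "sleeve gastrectomy"]
twl = np.array([19.29, 18.15, 15.18, 14.75])  # % total weight loss vs placebo
se = np.array([1.0, 1.2, 0.6, 0.9])           # hypothetical standard errors

p_scores = []
for i in range(len(twl)):
    probs = [
        # One-sided probability that treatment i beats treatment j, assuming
        # independent normal estimates (a simplification of the network).
        norm.cdf((twl[i] - twl[j]) / np.hypot(se[i], se[j]))
        for j in range(len(twl)) if j != i
    ]
    p_scores.append(np.mean(probs))

for name, score in sorted(zip(treatments, p_scores), key=lambda t: -t[1]):
    print(f"{name}: P-score = {score:.2f}")
```

With these inputs, RYGB ranks first, consistent with the reported ordering.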

In total, 34 eligible RCTs with 15,660 patients were included. Overall, the RCTs analyzed 11 weight loss treatments, spanning bariatric surgeries (four), bariatric endoscopies (three), and medications (four).

Specifically, the bariatric surgeries included RYGB, laparoscopic sleeve gastrectomy, OAGB, and laparoscopic adjustable gastric banding; bariatric endoscopies included endoscopic sleeve gastroplasty, transpyloric shuttle, and intragastric balloon; and medications included tirzepatide, semaglutide, and liraglutide.

Although all interventions led to greater percentage total weight loss than placebo, RYGB produced the greatest weight loss (19.29%) and was ranked as the first preferred treatment (97% probability). It was followed in the rankings by OAGB, tirzepatide 15 mg, laparoscopic sleeve gastrectomy, and semaglutide 2.4 mg.

Tirzepatide 15 mg had a slightly lower percentage total weight loss (15.18%) but a favorable safety profile. There was no significant difference in percentage total weight loss between tirzepatide 15 mg and OAGB (mean difference, 2.97%) or laparoscopic sleeve gastrectomy (mean difference, 0.43%). 

There was also no significant difference in percentage total weight loss between semaglutide 2.4 mg and either endoscopic sleeve gastroplasty or transpyloric shuttle.

Endoscopic sleeve gastroplasty, transpyloric shuttle, and intragastric balloon all resulted in total weight loss greater than 5%.

When compared with bariatric surgery, “endoscopic interventions had a better side-effect profile, with no increased odds of mortality and intensive care needs,” Velji-Ibrahim said. 

When it came to the medications, “the most common side effects were gastrointestinal in nature, which included nausea, vomiting, diarrhea, and constipation,” she said.

Combining, Rather Than Comparing, Therapies

Following the presentation, session co-moderator Shivangi T. Kothari, MD, assistant professor of medicine and associate director of endoscopy at the University of Rochester Medical Center in New York, shared her thoughts on what the future of obesity management research might look like.

It’s not just going to be about percentage total weight loss, she said, but about how well the effect is sustained following the intervention. 

And we might move “away from comparing one modality to another” and instead study combination therapies, “which would be ideal,” said Kothari.

This was the focus of another meta-analysis presented at ACG 2024, in which Nihal Ijaz I. Khan, MD, and colleagues compared the efficacy of endoscopic bariatric treatment alone vs its combined use with GLP-1 RAs.

The researchers identified three retrospective studies with 266 patients, of whom 143 underwent endoscopic bariatric treatment alone (either endoscopic sleeve gastroplasty or intragastric balloon) and 123 had it combined with GLP-1 RAs, specifically liraglutide. 

They reported that superior absolute weight loss was achieved in the group of patients receiving GLP-1 RAs in combination with endoscopic bariatric treatment. The standardized mean difference in body weight loss at treatment follow-up was 0.61 (P <.01). 
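
As context for that figure, a standardized mean difference expresses the between-group difference in units of the pooled standard deviation. Here is a minimal sketch of the calculation; the group sizes match the reported arms, but the means and SDs are hypothetical, since the pooled analysis does not report them here.

```python
# Minimal sketch of a standardized mean difference (Cohen's d with a pooled
# SD), the metric behind the reported 0.61. Means and SDs are illustrative.
import numpy as np

def standardized_mean_difference(mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl):
    """Cohen's d using the pooled standard deviation of the two groups."""
    pooled_var = ((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2) / (n_tx + n_ctrl - 2)
    return (mean_tx - mean_ctrl) / np.sqrt(pooled_var)

# Hypothetical weight loss (kg) in the combination vs endoscopy-alone arms.
d = standardized_mean_difference(mean_tx=16.0, sd_tx=7.0, n_tx=123,
                                 mean_ctrl=11.8, sd_ctrl=7.0, n_ctrl=143)
print(f"SMD = {d:.2f}")  # ~0.60 with these illustrative inputs
```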

“Further studies are required to evaluate the safety and adverse events comparing these two treatment modalities and to discover differences between comparing the two endoscopic options to various GLP-1 receptor agonists,” Khan noted. 

Neither study had specific funding. Velji-Ibrahim and Khan reported no relevant financial relationships. Kothari reported serving as a consultant for Boston Scientific and Olympus, as well as serving as an advisory committee/board member for Castle Biosciences.

A version of this article first appeared on Medscape.com.

Breath Gas Patterns Predict Response to Low FODMAP Diet

Non-fasting breath gas patterns may help identify patients with irritable bowel syndrome (IBS) who are most likely to respond to a low fermentable oligo-, di-, monosaccharides and polyols (FODMAP) diet, according to a new study.

The low FODMAP diet is the most evidence-based dietary therapy for patients with IBS, but we know that “only about 50% of our patients respond to it,” said principal investigator Prashant Singh, MD, assistant professor at the University of Michigan in Ann Arbor, Michigan. “Exhaled breath gases represent bacterial fermentation of dietary carbohydrates. These measurements could provide a simple biomarker for response to low FODMAP diets.”

Even before starting the low FODMAP diet, “you could see notable differences in breath test patterns between responders and nonresponders,” he said. “We saw that low FODMAP responders had higher hydrogen (H2) and lower methane (CH4) at baseline than nonresponders and had a greater drop in hydrogen following FODMAP restriction vs nonresponders.”

He added that these results imply that responders to this diet may exhibit differences in baseline microbiota composition regarding saccharolytic capacity and/or methanogens. 

Singh presented the findings at the American College of Gastroenterology (ACG) 2024 Annual Scientific Meeting.

Breaths That Can Predict Response

To determine if pre-intervention non-fasting breath patterns are associated with a clinical response to low FODMAP diets, Singh and colleagues enrolled 284 self-selected participants (mean age, 45.2 years) with mild to moderate gastrointestinal (GI) symptoms. Participants used an app-connected breath analyzer to record hourly, non-fasting H2 and CH4 levels during waking hours, in addition to logging meal content and symptom severity (bloating, abdominal pain, and flatulence) on a 0-10 scale. 

Patients were directed to consume their habitual diet for 1 week, before following an app-directed low FODMAP diet for 1 week. Responders were defined as those with a ≥ 30% reduction in at least one mean symptom score. The researchers then compared average hourly H2 and CH4 levels and symptom scores at baseline between low FODMAP diet responders and nonresponders.
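
A minimal sketch of that responder rule, with hypothetical symptom scores, might look like this:

```python
# Responder rule as described above: a >= 30% drop in at least one mean
# symptom score after the low FODMAP week. Scores here are hypothetical.
baseline = {"bloating": 5.2, "abdominal_pain": 4.1, "flatulence": 6.0}
on_diet  = {"bloating": 3.9, "abdominal_pain": 2.6, "flatulence": 5.1}

def is_responder(baseline, on_diet, threshold=0.30):
    for symptom, pre in baseline.items():
        post = on_diet[symptom]
        # Relative reduction from baseline for this symptom.
        if pre > 0 and (pre - post) / pre >= threshold:
            return True
    return False

print(is_responder(baseline, on_diet))  # True: abdominal pain fell ~37%
```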

Of the participants, 111 were classified as responders and 173 as nonresponders. There were no significant differences between the groups in gender, age, body mass index, or FODMAP intake per calorie.

Following FODMAP restriction, responders had consistently lower abdominal pain throughout the day and lower bloating and flatulence predominantly in the latter part of the day. Nonresponders experienced no significant changes in key abdominal symptoms after adopting the low FODMAP diet. 

The researchers found that breath tests taken at baseline revealed predictive trends between the groups, even though average FODMAP consumption did not significantly differ between them. Baseline H2 levels were higher among responders than among nonresponders, especially in the morning and evening. However, responders had lower baseline CH4 levels throughout the day. 

Following FODMAP restriction, responders had a significant drop in non-fasting H2 but not CH4, whereas nonresponders had no significant drop in either.

The study was limited by the fact that participants were not clinically diagnosed with IBS, their GI symptoms were mild overall, and no data were available on stool consistency/frequency or fecal microbiome composition for correlation with exhaled breath gas levels.

A Potential New Biomarker

Session co-moderator Kyle Staller, MD, MPH, director of the Gastrointestinal Motility Laboratory at Mass General and associate professor of medicine at Harvard Medical School in Boston, Massachusetts, said in an interview that if validated, these findings provide hope for better directing low FODMAP diets to those patients who may benefit. 


Some patients respond to a low FODMAP diet and others do not, for reasons we don’t yet know, possibly related to fermentation and gas production, and it’s helpful to know this before starting treatment, he said. It may help us with more of “a precision medicine approach before we really torture people with diets that can be very difficult to adhere to.” 

Staller, who was not involved in the study, added that, “People tend to really focus on small intestinal bacterial overgrowth when it comes to hydrogen and methane production, but in reality, this is really a very agile day-to-day, meal-to-meal responsiveness. 

“It’s a different paradigm,” he continued. “I’d also like to see more data as to why we see the diurnal rhythm” and whether potential factors such as intestinal transit times are playing a role. 

Singh reported receiving royalties from UpToDate. Staller reported receiving research support from Ardelyx and Restasis and serving as a consultant to Anji, Ardelyx, GI Supply, Mahana, Restasis, and Sanofi. Information on funding for the study was not available at the time of publication.

A version of this article appeared on Medscape.com.

GLP-1 RAs Reduce Early-Onset CRC Risk in Patients With Type 2 Diabetes

The use of glucagon-like peptide 1 receptor agonists (GLP-1 RAs) is associated with a significant decrease in the risk for early-onset colorectal cancer (EO-CRC) in patients with type 2 diabetes (T2D), according to the results of a retrospective study.

“This is the first large study to investigate the impact of GLP-1 RA use on EO-CRC risk,” principal investigator Temitope Olasehinde, MD, resident physician at the University Hospitals Cleveland Medical Center, Case Western Reserve University in Cleveland, Ohio, said in an interview.

The results indicate that GLP-1 RAs may have a protective role to play in combating EO-CRC, the incidence of which is rising notably in younger adults, with a corresponding increase in associated mortality.

Previous studies investigating the link between GLP-1 RAs and CRC did not capture patients aged younger than 50 years; thus, it was unknown if these results could be extrapolated to a younger age group, said Olasehinde.

The researcher presented the findings at the annual meeting of the American College of Gastroenterology.

Retrospective Database Analysis

Olasehinde and colleagues analyzed data from TriNetX, a large federated deidentified health research network, to identify patients (age ≤ 49 years) with diagnosed T2D who were subsequently prescribed antidiabetic medications and had no prior diagnosis of CRC. Patients were then stratified on the basis of first-time GLP-1 RA use.

They identified 2,025,034 drug-naive patients with T2D; of these, 284,685 were subsequently prescribed GLP-1 RAs, and 1,740,349 remained in the non–GLP-1 RA cohort. Following propensity score matching, there were 86,186 patients in each cohort.
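
Propensity score matching is a generic technique rather than anything specific to TriNetX. Below is a minimal sketch of 1:1 nearest-neighbor matching on a simulated cohort; the covariates, data, and matching details (with replacement, no caliper) are hypothetical, since this report does not list the study's actual matching variables.

```python
# Minimal sketch of 1:1 nearest-neighbor propensity score matching on
# simulated data. Covariates and exposure model are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))            # e.g., age, BMI, HbA1c (standardized)
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # GLP-1 RA exposure

# 1. Model the probability of treatment given covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. For each treated patient, find the control with the closest propensity.
#    Matching here is with replacement and without a caliper, a simplification.
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control_idx].reshape(-1, 1))
_, matches = nn.kneighbors(ps[treated_idx].reshape(-1, 1))
matched_controls = control_idx[matches.ravel()]

print(f"{len(treated_idx)} treated patients matched to "
      f"{len(set(matched_controls))} unique controls")
```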

Patients who received GLP-1 RAs had significantly lower odds of developing EO-CRC than those who did not receive GLP-1 RAs (0.6% vs 0.9%; P < .001; odds ratio [OR], 0.61; 95% CI, 0.54-0.68).
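
As a worked check, the odds ratio can be approximated from the rounded event rates and the matched cohort size; because the rates are rounded to one decimal, the result lands near, but not exactly on, the reported 0.61.

```python
# Odds ratio from a 2x2 table, reconstructed from the rounded event rates
# (0.6% vs 0.9%) and the matched cohort size of 86,186 per arm.
import math

n_per_arm = 86_186
events_glp1 = round(0.006 * n_per_arm)   # ~517 EO-CRC cases
events_ctrl = round(0.009 * n_per_arm)   # ~776 EO-CRC cases

odds_glp1 = events_glp1 / (n_per_arm - events_glp1)
odds_ctrl = events_ctrl / (n_per_arm - events_ctrl)
or_ = odds_glp1 / odds_ctrl
print(f"OR ~= {or_:.2f}")  # ~0.66 with these rounded inputs

# 95% CI on the log-odds-ratio scale (Woolf method).
se_log_or = math.sqrt(1/events_glp1 + 1/(n_per_arm - events_glp1)
                      + 1/events_ctrl + 1/(n_per_arm - events_ctrl))
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"95% CI ~ ({lo:.2f}, {hi:.2f})")
```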

Furthermore, a subanalysis revealed that patients with obesity who were taking GLP-1 RAs had significantly lower odds of developing EO-CRC than patients with obesity who were not (0.7% vs 1.1%; P < .001; OR, 0.58; 95% CI, 0.50-0.67).

A Proposed Protective Effect

Although GLP-1 RAs are indicated for the treatment of T2D and obesity, recent evidence suggests that they may play a role in reducing the risk for CRC as well. This protective effect may be produced not only by addressing T2D and obesity — both important risk factors for CRC — but also via cellular mechanisms, Olasehinde noted.

“GLP-1 receptors are widely expressed throughout the gastrointestinal tract, with various effects on tissues in the stomach, small intestine, and colon,” she explained. Specifically, activation of these receptors in the proximal and distal colon promotes the release of “important factors that protect and facilitate healing of the intestinal epithelium” and “regulate the gut microbiome.”

This is particularly relevant in EO-CRC, she added, given its greater association with T2D and obesity, both factors that “have been shown to create dysbiosis in the gut microbiome and low-grade inflammation via release of free radicals/inflammatory cytokines.”

These results provide more evidence that EO-CRC “is clinically and molecularly distinct from late-onset colorectal cancer,” which is important for both clinicians and patients to understand, said Olasehinde.

“It is imperative that we are all aware of the specific signs and symptoms this population presents with and the implications of this diagnosis in younger age groups,” she added. “Patients should continue making informed dietary and lifestyle modifications/choices to help reduce the burden of EO-CRC.”

Hypothesis-Generating Results

Aasma Shaukat, MD, MPH, who was not affiliated with the research, called the results promising but — at this stage — primarily useful for stimulating future research. 

"We do need more studies such as this to generate hypotheses that can be studied prospectively," Shaukat, professor of medicine and population health, and director of GI Outcomes Research at NYU Langone Health in New York City, told Medscape Medical News. 

She referred to another study, published in JAMA Oncology, that also used the TriNetX research network, which showed that GLP-1 RAs were associated with reduced CRC risk in drug-naive patients with T2D. 

Shaukat also noted that the current analysis has limitations that should be considered. "The study is retrospective, and confounding is a possibility,” she said. 

“How the groups that did and did not receive GLP-1 RAs differ in other risk factors that could be the drivers of the cancers is not known. Whether cancers were detected through screening or symptoms, stage, and other features that may differ are not known. Finally, since we don’t know who did or did not have colonoscopy, undiagnosed cancers are not known," she explained. 

Shaukat, who was the lead author of the ACG 2021 Colorectal Cancer Screening Guidelines, added that the field would benefit from studies providing "biological plausibility information, such as animal studies to understand how GLP-1 RAs may modulate risk of colon cancer; other population-based cohort studies on the incidence of colon cancer among GLP-1 RA users and non-users; and prospective trials on chemoprevention." 

The study had no specific funding. Olasehinde reported no relevant financial relationships. Shaukat reported serving as a consultant for Freenome, Medtronic, and Motus GI, as well as an advisory board member for Iterative Scopes Inc.

A version of this article appeared on Medscape.com.

Mortality Rates From Early-Onset CRC Have Risen Considerably Over Last 2 Decades

The mortality rate of early-onset colorectal cancer (EO-CRC) has increased considerably across the United States over the past 2 decades, with the effects most pronounced in those aged 20-44 years, according to a new analysis of the two largest US mortality databases. 

Data from the Centers for Disease Control and Prevention’s National Center for Health Statistics (NCHS) and the Surveillance, Epidemiology, and End Results (SEER) databases provide yet more evidence of the increasing prevalence of EO-CRC, which is defined as a diagnosis of CRC in patients younger than 50 years.

Furthermore, the researchers reported that mortality increased across all patients included in the study (aged 20-54 years), regardless of tumor stage at diagnosis.

These findings “prompt tailoring further efforts toward raising awareness of colorectal cancer symptoms and keeping a low threshold for clinical suspicion in younger patients presenting with anemia, gastrointestinal bleeding, or change in bowel habits,” Yazan Abboud, MD, internal medicine PGY-3, assistant chief resident, and chair of resident research at Rutgers New Jersey Medical School, Newark, said in an interview.

Abboud presented the findings at the American College of Gastroenterology (ACG) 2024 Annual Scientific Meeting.

Analyzing NCHS and SEER 

Rising rates of EO-CRC had prompted US medical societies to recommend reducing the screening age to 45 years. The US Preventive Services Task Force officially lowered it to this age in 2021. This shift is supported by real-world evidence, which shows that earlier screening leads to a significantly reduced risk for colorectal cancer. However, because colorectal cancer cases are decreasing overall in older adults, there is considerable interest in discovering why young adults are experiencing a paradoxical uptick in EO-CRC, and what impact this is having on associated mortality.

Abboud and colleagues collected age-adjusted mortality rates for EO-CRC between 2000 and 2022 from the NCHS database. In addition, stage-specific incidence-based mortality rates from 2004 to 2020 were obtained from the SEER 22 database. The NCHS database covers approximately 100% of the US population, whereas the SEER 22 database, which is included within the NCHS, covers 42%.
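
For readers unfamiliar with the term, age-adjusted rates weight each age band's crude rate by a fixed standard population, so that trends are not driven by shifts in the age mix. A minimal sketch with invented numbers:

```python
# Direct age standardization, the adjustment behind "age-adjusted mortality
# rates." Age bands, counts, and standard-population weights are illustrative.
age_groups  = ["20-29", "30-39", "40-49"]
deaths      = [120, 340, 980]        # hypothetical EO-CRC deaths
population  = [40e6, 42e6, 41e6]     # hypothetical person-years at risk
std_weights = [0.35, 0.33, 0.32]     # standard population weights (sum to 1)

# Weighted sum of the band-specific crude rates, per 100,000.
adjusted = sum(w * (d / p) * 100_000
               for w, d, p in zip(std_weights, deaths, population))
print(f"Age-adjusted rate ~= {adjusted:.2f} per 100,000")
```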

The researchers divided patients into two cohorts based on age (20-44 years and 45-54 years) and tumor stage at diagnosis (early stage and late stage), and compared the annual percentage change (APC) and the average APC between the two groups. They also assessed trends for the entire cohort of patients aged 20-54 years. 
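
The APC itself comes from a log-linear trend model of the form ln(rate) = b0 + b1·year, with APC = 100·(e^b1 − 1); joinpoint-style software fits this model piecewise over the study period. A minimal sketch on invented rates:

```python
# Annual percentage change (APC) from a log-linear fit: ln(rate) = b0 + b1*year,
# APC = 100 * (exp(b1) - 1). The rate series here is invented.
import numpy as np

years = np.arange(2005, 2023)
rates = 1.20 * 1.009 ** (years - 2005)   # hypothetical mortality per 100,000

b1, b0 = np.polyfit(years, np.log(rates), 1)  # slope first, then intercept
apc = 100 * (np.exp(b1) - 1)
print(f"APC = {apc:.2f}% per year")      # recovers ~0.9% by construction
```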

In the NCHS database, there were 147,026 deaths from EO-CRC in total across all ages studied, of which 27% (39,746) occurred in those aged 20-44 years. Although associated mortality rates decreased from 2000 to 2005 across all ages studied (APC, –1.56), they increased from 2005 to 2022 (APC, 0.87). 

In the cohort aged 45-54 years, mortality decreased between 2000 and 2005 and increased thereafter, whereas in the cohort aged 20-44 years, mortality increased steadily over the entire 2000-2022 follow-up period (APC, 0.93). A comparison of the age cohorts confirmed that those aged 20-44 years had the greater increase in mortality (average APC, 0.85; P < .001).

In the SEER 22 database, there were 4652 deaths in those with early-stage tumors across all age groups studied (average APC, 12.17). Mortality increased in patients aged 45-54 years (average APC, 11.52) with early-stage tumors, but there were insufficient numbers in those aged 20-44 years to determine this outcome. 

There were 42,120 deaths in those with late-stage tumors across all age groups (average APC, 10.05) in the SEER 22 database. And increased mortality was observed in those with late-stage tumors in both age cohorts: 45-54 years (average APC, 9.58) and 20-44 years (average APC, 11.06).

“When evaluating the SEER database and stratifying the tumors by stage at diagnosis, we demonstrated increasing mortality of early-onset colorectal cancer in both early- and late-stage tumors on average over the study period,” Abboud said. 

Identifying At-Risk Patients

In a comment, David A. Johnson, MD, professor of medicine and chief of gastroenterology at Eastern Virginia Medical School in Norfolk, said the findings speak to the need for evidence-based means of identifying younger individuals at a higher risk for EO-CRC.

“I suspect many of the younger patients with CRC had their cancer detected when it was more advanced due to delayed presentation and diagnostic testing,” said Johnson, who was not involved in the study. 

But it would be interesting to evaluate whether the cancers in the cohort aged 20-44 years were biologically more aggressive or whether these patients dismissed early signs or symptoms, he said. 

Younger patients may dismiss “alarm” features that warrant CRC testing, said Johnson. “In particular, overt bleeding and iron deficiency need a focused evaluation in these younger cohorts.”

“Future research is needed to investigate the role of neoadjuvant chemotherapy in younger patients with early-stage colorectal cancer and evaluate patients’ outcomes,” Abboud added. 

The study had no specific funding. Abboud reported no relevant financial relationships. Johnson reported serving as an adviser to ISOTHRIVE. He is also on the Medscape Gastroenterology editorial board.

A version of this article first appeared on Medscape.com.

GI Docs Will Need to Forge a ‘Human-Computer Cooperative’

Article Type
Changed
Sun, 10/13/2024 - 22:52

Several artificial intelligence (AI) technologies are emerging that will change the management of gastrointestinal (GI) diseases sooner rather than later. One of the leading researchers working toward that AI-driven future is Ryan W. Stidham, MD, MS, AGAF, associate professor of gastroenterology and computational medicine and bioinformatics at the University of Michigan, Ann Arbor.

Stidham’s work focuses on leveraging AI to develop automated systems that better quantify disease activity and aid gastroenterologists in their decision-making. He also serves as a member of AGA's AI Task Force. He spoke with this news organization about his efforts to shape AI into a tool with practical applications in gastroenterology, what the technology may do to improve physician efficiency, and why gastroenterologists shouldn’t be worried about being replaced by machines any time soon.

How did you first become involved in studying AI applications for GI conditions?

My medical training coincided with the emergence of electronic health records (EHRs), which made enormous amounts of data, ranging from laboratory results to diagnostic codes and billing records, readily accessible.

I quickly contracted data analytics fever, but a major problem became apparent: EHRs and medical claims data alone only weakly describe a patient. Researchers in the field were excited to use machine learning for personalizing treatment decisions for GI conditions, including inflammatory bowel disease (IBD). But no matter how large the dataset, the EHRs lacked the most rudimentary descriptions: What was the patient’s IBD phenotype? Where exactly was the disease located?

I could see machine learning had the potential to learn and reproduce expert decision-making. Unfortunately, we were fueling this machine-learning rocket ship with crude data unlikely to take us very far. Gastroenterologists rely on data in progress notes, emails, interpretations of colonoscopies, and radiologists’ and pathologists’ reviews of imaging to make treatment decisions, but that information is not well organized in any dataset.

I wanted to use AI to retrieve that key information in text, images, and video that we use every day for IBD care, automatically interpreting the data like a seasoned gastroenterologist. Generating higher-quality data describing patients could take our AI models from interesting research to useful and reliable tools in clinical care.
 

How did your early research go about trying to solve that problem?

My GI career began amid the IBD field shifting from relying on symptoms alone to objective biomarkers for IBD assessment, with a particular focus on standardized scoring of endoscopic mucosal inflammation. However, these scores were challenged by interobserver variability, prompting the need for centralized reading. More importantly, these scores are qualitative and do not capture all the visual findings an experienced physician appreciates when assessing severity, phenotype, and therapeutic effect. As a result, even experts could disagree on the degree of endoscopic severity, and patients with obvious differences in the appearance of the mucosa could have the same endoscopic score.

I asked myself: Are we really using these measures to make treatment decisions and determine the effectiveness of investigational therapies? I thought we could do better and aimed to improve endoscopic IBD assessments using then-emerging digital image analysis techniques.

Convolutional neural network (CNN) modeling was just becoming feasible as computing performance increased. CNNs are well suited for complex medical image interpretation, using an associated “label,” such as the presence or grade of disease, to decipher the complex set of image feature patterns characterizing an expert’s determination of disease severity.
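
As a concrete, much-simplified sketch of what such label-driven training looks like, the snippet below fine-tunes a pretrained ResNet to map endoscopic still frames to four expert severity grades (for example, Mayo endoscopic subscores 0-3 for ulcerative colitis). The folder layout, class count, and hyperparameters are illustrative assumptions, not the group's actual pipeline.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 4  # expert labels, e.g., Mayo endoscopic subscores 0-3

# Standard ImageNet-style preprocessing applied to endoscopic frames.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: images/train/<grade>/<frame>.png
train_set = datasets.ImageFolder("images/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Fine-tune a pretrained CNN, swapping its final layer for a 4-way head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for images, labels in loader:  # a single epoch shown for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```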

How did you convert the promise of CNN into tangible results?

The plan was simple: Collect endoscopic images from patients with IBD, find some experts to grade IBD severity on the images, and train a CNN model using the images and expert labels.

In 2016, developing a CNN wasn’t easy. There was no database of endoscopic images or simple methods for image labeling. The CNN needed tens of thousands of images. How were we to collect enough images with a broad range of IBD severity? I also reached some technical limits and needed help solving computational challenges.

Designing our first IBD endoscopic CNN took years of reading, coursework, additional training, and a new host of collaborators.

Failure was frequent, and my colleagues and I spent a lot of nights and weekends looking at thousands of individual endoscopic images. But we eventually had a working model for grading endoscopic severity, and its performance exceeded our expectations.

To our surprise, the CNN model grading of ulcerative colitis severity almost perfectly matched the opinion of IBD experts. We introduced the proof of concept that AI could automate complex disease measurement for IBD.
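
Agreement between model grades and expert reads on an ordinal scale like this is commonly summarized with a quadratic-weighted kappa, which gives partial credit for near-misses. A minimal sketch, with made-up grades standing in for real reads:

```python
from sklearn.metrics import cohen_kappa_score

# Placeholder ordinal grades (e.g., Mayo subscores) for ten images.
expert_grades = [0, 1, 1, 2, 3, 2, 0, 3, 1, 2]
model_grades  = [0, 1, 2, 2, 3, 2, 0, 3, 1, 1]

# Quadratic weights penalize a 0-vs-3 disagreement far more than 1-vs-2.
kappa = cohen_kappa_score(expert_grades, model_grades, weights="quadratic")
print(f"Quadratic-weighted kappa: {kappa:.2f}")
```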

What took us 3 years in 2016 would take about 3 weeks today.
 

You have said that AI could help reduce the substantial administrative burdens in medicine today. What might an AI-assisted future look like for time-strapped gastroenterologists?

We will be spending more time on complex decision-making and developing treatment plans, with less time needed to hunt for information in the chart and administrative tasks.

The practical applications of AI will chip away at tedious mechanical tasks, soon to be done by machines, reclaiming time for gastroenterologists.

For example, automated documentation is almost usable, and audio recordings in the clinic could be leveraged to generate office notes.

Computer vision analysis of endoscopic video is generating draft procedural notes and letters to patients in a shared language, as well as recommending surveillance intervals based on the findings.
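
As a toy illustration of the findings-to-surveillance-interval step, the function below encodes a few simplified rules loosely patterned on the 2020 US Multi-Society Task Force post-polypectomy guidance. The thresholds are illustrative only and omit many branches of the real recommendations.

```python
def surveillance_interval_years(num_adenomas: int,
                                largest_mm: float,
                                villous_or_hgd: bool) -> int:
    """Very simplified post-polypectomy surveillance rules (illustrative
    only; actual guidelines have many more branches and caveats)."""
    if villous_or_hgd or largest_mm >= 10 or num_adenomas >= 5:
        return 3
    if 3 <= num_adenomas <= 4:
        return 5   # guidance gives a 3- to 5-year range here
    if num_adenomas >= 1:
        return 7   # often quoted as 7-10 years for low-risk adenomas
    return 10      # normal examination

print(surveillance_interval_years(2, 6.0, False))  # -> 7
```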

Text processing is already being used to automate billing and manage health maintenance like vaccinations, laboratory screening, and therapeutic drug monitoring.

Unfortunately, I don’t think that AI will immediately help with burnout. These near-term AI administrative assistant advantages, however, will help us manage the increasing patient load, address physician shortages, and potentially improve access to care in underserved areas.
 

Were there any surprises in your work?

I must admit, I was certain AI would put us gastroenterologists to shame. Over time, I have reversed that view.

AI really struggles to understand the holistic patient context when interpreting disease and predicting what to do for an individual patient. Humans anticipate gaps in data and customize the weighting of information when making decisions for individuals. An experienced gastroenterologist can incorporate risks, harms, and costs in ways AI is several generations from achieving.

With certainty, AI will outperform gastroenterologists for tedious and repetitive tasks, and we should gladly expect AI to assume those responsibilities. However, many unknowns remain in the daily management of GI conditions. We will continue to rely on the clinical experience, creativity, and improvisation of gastroenterologists for years to come.

Has there been a turning-point moment when it felt like this technology moved from being more theoretical to something with real-world clinical applications?

Last spring, I saw a lecture by Peter Lee, who is president of Microsoft Research and a leader in developing AI-powered applications in medicine and scientific research, demonstrating how a large language model (LLM) could “understand” medical text and generate responses to questions. My jaw dropped.

We watched an LLM answer American Board of Internal Medicine questions with perfect explanations and rationale. He demonstrated how an audio recording of a clinic visit could be used to automatically generate a SOAP (subjective, objective, assessment, and plan) note. It was better than anything I would have drafted. He also showed how the LLM could directly ingest EHR data, without any modification, and provide a great diagnosis and treatment plan. Finally, LLM chatbots could carry on an interactive conversation with a patient that would be difficult to distinguish from one with a human physician.
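
A minimal sketch of that transcript-to-SOAP-note idea, here using the OpenAI Python SDK's chat-completions interface (the model name, prompt, and input file are assumptions; any comparable LLM API would do, and real clinical use would require privacy safeguards and clinician review):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical transcript of a recorded clinic visit.
transcript = open("clinic_visit_transcript.txt").read()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You are a clinical scribe. Draft a SOAP note "
                    "(Subjective, Objective, Assessment, Plan) from the "
                    "visit transcript, flagging anything uncertain."},
        {"role": "user", "content": transcript},
    ],
)
print(response.choices[0].message.content)
```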

The inevitability of AI-powered transformations in gastroenterology care became apparent.

Documentation, billing, and administrative work will be handled by AI. AI will collect and organize information for me. Chart reviews and even telephone/email checkups on patients will be a thing of the past. AI chatbots will be able to discuss an individual patient’s condition and test results. Our GI-AI assistants will proactively collect information from patients after hospitalization or react to a change in labs.

AI will soon be an amazing diagnostician and will know more than me. So do we need to polish our resumes for new careers? No, but we will need to adapt to changes, which I believe on the whole will be better for gastroenterologists and patients.
 

What does adaptation look like for gastroenterologists over the next handful of years?

Like any other tool, gastroenterologists will be figuring out how to use AI prediction models, chatbots, and imaging analytics. Value, ease of use, and information-gain will drive which AI tools are ultimately adopted.

Memory, information recall, calculations, and the repetitive tasks that gastroenterologists occasionally get wrong or find tiresome will become the job of machines. We will still be the magicians, now aided by machines, applying our human strengths of contextual awareness, judgment, and creativity to find customized solutions for more patients.

That, I think, is the future that we are reliably moving toward over the next decade — a human-computer cooperative throughout gastroenterology (including IBD) and, frankly, all of medicine.

A version of this article appeared on Medscape.com.

Alcohol-Associated Liver Disease’s Changing Demographics

Article Type
Changed
Fri, 08/30/2024 - 10:56

Alcohol-associated liver disease (ALD) is a significant global health concern, accounting for approximately 5% of all disease and injury. In the United States, the prevalence of ALD has increased since 2014, and the trajectory accelerated during the COVID-19 pandemic.

ALD encompasses a spectrum of diseases that includes steatosis, fibrosis, cirrhosis, and hepatocellular carcinoma, as well as related complications. Although earlier stages of ALD may be asymptomatic, hepatologists and gastroenterologists rarely see patients at this point.

“Unfortunately, patients with ALD more often present in late stages of disease (decompensated cirrhosis) as compared with other chronic liver diseases, such as metabolic dysfunction-associated steatotic liver disease or hepatitis C,” Doug A. Simonetto, MD, associate professor of medicine and director of the Gastroenterology and Hepatology Fellowship Program at the Mayo Clinic, Rochester, Minnesota, told this news organization.

Recent data have identified three demographic groups that are experiencing higher rates of ALD than in previous periods and may therefore require special attention. Understanding what makes these groups increasingly susceptible to ALD may allow for improved screening, earlier diagnosis, and potentially the prevention of its most dire consequences.
 

As Women Consume More Alcohol, ALD Follows

Historically, men have had higher rates of alcohol use, heavy drinking, and alcohol use disorders than women. But this gender gap has begun to narrow.

Men born in the early 1900s were 2.2 times more likely to drink alcohol and 3.6 times more likely to experience alcohol-related harms than women, according to a 2016 meta-analysis. By the end of the 1990s, however, women’s drinking had begun to catch up. Men still led in these categories, but only by 1.1 and 1.3 times, respectively.

Rates of binge drinking (defined as at least five drinks in men or at least four drinks in women in an approximately 2-hour period) are also converging between the sexes. The authors of a longitudinal analysis hypothesized that an uptick in young women reporting drinking for social reasons — from 53% in 1987 to 87% in 2020 — was a possible cause.
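
That definition is simple enough to state as code; a one-function sketch, with the sex-specific thresholds taken from the definition above:

```python
def is_binge_episode(drinks: int, sex: str) -> bool:
    """Binge drinking: at least 5 drinks (men) or 4 drinks (women)
    within roughly a 2-hour period, per the definition above."""
    threshold = 5 if sex == "male" else 4
    return drinks >= threshold

print(is_binge_episode(4, "female"))  # True
print(is_binge_episode(4, "male"))    # False
```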

Greater alcohol consumption among women has translated into higher rates of ALD. Analyzing data from the Global Burden of Disease Study 2019, which looked at hundreds of diseases across 204 countries and territories, researchers reported that the worldwide prevalence of ALD among young women (15-49 years) rose within the past decade. Those in the 20- to 24-year-old age group had the most significant increases in ALD prevalence rates.

Recent US statistics highlight the relative imbalance in ALD’s impact on women, according to George F. Koob, PhD, director of the National Institute on Alcohol Abuse and Alcoholism (NIAAA).

“The age-adjusted death rate from alcohol-associated liver cirrhosis increased by 47% between 2000 and 2019, with larger increases for females than for males (83.5% compared to 33%),” Dr. Koob told this news organization. “Larger increases for women are consistent with a general increase in alcohol use among adult women and larger increases in alcohol-related emergency department visits, hospitalizations, and deaths.”

Physiologically, women have a higher risk than men of developing ALD and more severe disease, even at lower levels of alcohol exposure. According to a 2021 review, several proposed mechanisms might play a role, including differences in alcohol metabolism and first-pass metabolism, hormones, and endotoxin and Kupffer cell activation.

Crucially, women are less likely than men to receive in-person therapy or approved medications for alcohol use disorder, according to a 2019 analysis of over 66,000 privately insured adult patients.

Certain Ethnic, Racial Minorities Have Higher Rates of ALD

In the United States, rates of ALD and associated complications are higher among certain minority groups, most prominently Hispanic and Native American individuals.

A 2021 analysis of three large US databases found that Hispanic ethnicity was associated with a 17% increased risk for acute-on-chronic liver failure in patients with ALD-related admissions.

Data also show that Hispanic and White patients have a higher proportion of alcoholic hepatitis than African American patients. Hispanic patients admitted for alcoholic hepatitis also incur significantly higher total hospital costs despite having mortality rates similar to those of White patients.

ALD-related mortality appears higher within certain subgroups of Hispanic patient populations. NIAAA surveillance reports track deaths resulting from cirrhosis in the White, Black, and Hispanic populations. From 2000 to 2019, these statistics show that although death rates from cirrhosis decreased for Hispanic White men, they increased for Hispanic White women, Dr. Koob said.

The latest data show that Native American populations are experiencing ALD at relatively higher rates than other racial/ethnic groups as well. An analysis of nearly 200,000 cirrhosis-related hospitalizations found that ALD, including alcoholic hepatitis, was the most common etiology in American Indian/Alaska Native patients. A separate analysis of the National Inpatient Sample database revealed that discharges resulting from ALD were disproportionately higher among Native American women.

As with Hispanic populations, ALD-associated mortality rates are also higher in Native American populations. The death rate from ALD increased for all racial and ethnic groups by 23.4% from 2019 to 2020, but the biggest increase occurred in the American Indian or Alaska Native populations (34.3% increase, from 20.1 to 27 per 100,000 people). Additionally, over the first two decades of the 21st century, mortality rates resulting from cirrhosis were highest among the American Indian and Alaska Native populations, according to a recently published systematic analysis of US health disparities across five racial/ethnic groups.
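
The 34.3% figure for American Indian or Alaska Native populations follows directly from the two rates cited; a one-line check:

```python
# ALD death rate per 100,000: 20.1 (2019) -> 27.0 (2020)
rate_2019, rate_2020 = 20.1, 27.0
print(f"{(rate_2020 - rate_2019) / rate_2019:.1%} increase")  # 34.3% increase
```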

Discrepancies in these and other minority groups may be due partly to genetic mechanisms, such as the relatively higher frequency of the PNPLA3 G/G polymorphism, a known risk factor for the development of advanced ALD, among those with Native American ancestry. A host of complex socioeconomic factors, such as income discrepancies and access to care, likely contribute too.

Evidence suggests that alcohol screening interventions are not applied equally across various racial and ethnic groups, Dr. Koob noted.

“For instance, Subbaraman and colleagues reported that, compared to non-Hispanic White patients, those who identify as Hispanic, Black, or other race or ethnicity were less likely to be screened for alcohol use during visits to healthcare providers. This was particularly true for those with a high school education or less,” he told this news organization. “However, other studies have not found such disparities.”
 

ALD Rates High in Young Adults, but the Tide May Be Changing

Globally, the prevalence of ALD has increased among both adolescents and young adults since the beginning of the 21st century. The global incidence of alcohol-associated hepatitis in recent years has been greatest among those aged 15-44 years.

In the United States, the increasing rate of ALD-related hospitalizations is primarily driven by the rise in cases of alcoholic hepatitis and acute-on-chronic liver failure among those aged 35 years and younger.

ALD is now the most common indication for liver transplant in those younger than 40 years of age, having increased fourfold between 2003 and 2018.

From 2009 to 2016, people aged 25-34 years experienced the highest average annual increase in cirrhosis-related mortality (10.5%), a trend the authors noted was “driven entirely by alcohol-related liver disease.”

Younger adults may be more susceptible to ALD due to the way they drink.

In a 2021 analysis of the National Health and Nutrition Examination Survey database, the weighted prevalence of harmful alcohol use was 29.3% in those younger than 35 years, compared with 16.9% in those aged 35-64 years. Higher blood alcohol levels resulting from binge drinking may make patients more susceptible to bacterial translocation and liver fibrosis and can increase the likelihood of cirrhosis in those with an underlying metabolic syndrome.

Yet, Dr. Koob said, thinking of “young adults” as one cohort may be misguided because he’s found very different attitudes toward alcohol within that population. Cross-sectional survey data obtained from more than 180,000 young adults indicated that alcohol abstinence increased between 2002 and 2018. Young adults report various reasons for not drinking, ranging from lack of interest to financial and situational barriers (eg, not wanting to interfere with school or work).

“The tide is coming in and out at the same time,” he said. “Younger people under the age of 25 are drinking less each year, are increasingly interested in things like Dry January, and more than half view moderate levels of consumption as unhealthy. People who are 26 years and older are drinking more, are not as interested in cutting back or taking breaks, and are less likely to consider 1 or 2 drinks per day as potentially unhealthy.”

Dr. Koob would like to believe the positive trends around alcohol in the under-25 set prove not only resilient, but someday, dominant.

“We have seen historic increases in alcohol consumption in the last few years — the largest increases in more than 50 years. But we are hopeful that, as the younger cohorts age, we will see lower levels of drinking by adults in mid-life and beyond.”
 

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

Alcohol-associated liver disease (ALD) is a significant global health concernaccounting for approximately 5% of all disease and injury. In the United States, the prevalence of ALD has increased since 2014, and the trajectory accelerated during the COVID-19 pandemic.

ALD encompasses a spectrum of diseases that includes steatosis, fibrosis, cirrhosis, and hepatocellular carcinoma, as well as related complications. Although earlier stages of ALD may be asymptomatic, hepatologists and gastroenterologists rarely see patients at this point.

“Unfortunately, patients with ALD more often present in late stages of disease (decompensated cirrhosis) as compared with other chronic liver diseases, such as metabolic dysfunction-associated steatotic liver disease or hepatitis C,” Doug A. Simonetto, MD, associate professor of medicine and director of the Gastroenterology and Hepatology Fellowship Program at the Mayo Clinic, Rochester, Minnesota, told this news organization.

Recent data have identified three demographic groups experiencing higher rates of ALD relative to previous periods and who may therefore require special attention. Understanding what makes these groups increasingly susceptible to ALD may allow for improved screening, earlier diagnosis, and potentially the prevention of its most dire consequences.
 

As Women Consume More Alcohol, ALD Follows

Historically, men have had higher rates of alcohol use, heavy drinking, and alcohol disorders than women. But this gender gap has begun to narrow.

Men born in the early 1900s were 2.2 times more likely to drink alcohol and 3.6 times more likely to experience alcohol-related harms than women, according to a 2016 meta-analysis. By the end of the 1990s, however, women’s drinking had begun to catch up. Men still led in these categories, but only by 1.1 and 1.3 times, respectively.

Rates of binge drinking (defined as at least five drinks in men or at least four drinks in women in an approximately 2-hour period) are also converging between the sexes. The authors of a longitudinal analysis hypothesized that an uptick in young women reporting drinking for social reasons — from 53% in 1987 to 87% in 2020 — was a possible cause.

Greater alcohol consumption among women has translated into higher rates of ALD. Analyzing data from the Global Burden of Disease Study 2019, which looked at hundreds of diseases across 204 countries and territories, researchers reported that the worldwide prevalence of ALD among young women (15-49 years) rose within the past decade. Those in the 20- to 24-year-old age group had the most significant increases in ALD prevalence rates.

Recent US statistics highlight the relative imbalance in ALD’s impact on women, according to George F. Koob, PhD, director of the National Institute on Alcohol Abuse and Alcoholism (NIAAA).

“The age-adjusted death rate from alcohol-associated liver cirrhosis increased by 47% between 2000 and 2019, with larger increases for females than for males (83.5% compared to 33%),” Dr. Koob told this news organization. “Larger increases for women are consistent with a general increase in alcohol use among adult women and larger increases in alcohol-related emergency department visits, hospitalizations, and deaths.”

Physiologically, women have a higher risk than men of developing ALD and more severe disease, even at lower levels of alcohol exposure. According to a 2021 review, several proposed mechanisms might play a role, including differences in alcohol metabolism and first-pass metabolism, hormones, and endotoxin and Kupffer cell activation.

Crucially, women are less likely than men to receive in-person therapy or approved medications for alcohol use disorder, according to a 2019 analysis of over 66,000 privately insured adult patients.
 

 

 

Certain Ethnic, Racial Minorities Have Higher Rates of ALD

In the United States, rates of ALD and associated complications are higher among certain minority groups, most prominently Hispanic and Native American individuals.

A 2021 analysis of three large US databases found that Hispanic ethnicity was associated with a 17% increased risk for acute-on-chronic liver failure in patients with ALD-related admissions.

Data also show that Hispanic and White patients have a higher proportion of alcoholic hepatitis than African American patients. Hispanic patients admitted for alcoholic hepatitis also incur significantly higher total hospital costs than White patients, despite similar mortality rates.

ALD-related mortality appears higher within certain subgroups of Hispanic patient populations. NIAAA surveillance reports track deaths resulting from cirrhosis in the White, Black, and Hispanic populations. From 2000 to 2019, these statistics show that although death rates from cirrhosis decreased for Hispanic White men, they increased for Hispanic White women, Dr. Koob said.

The latest data show that Native American populations are experiencing ALD at relatively higher rates than other racial/ethnic groups as well. An analysis of nearly 200,000 cirrhosis-related hospitalizations found that ALD, including alcoholic hepatitis, was the most common etiology in American Indian/Alaska Native patients. A separate analysis of the National Inpatient Sample database revealed that discharges resulting from ALD were disproportionately higher among Native American women.

As with Hispanic populations, ALD-associated mortality rates are also higher in Native American populations. The death rate from ALD increased for all racial and ethnic groups by 23.4% from 2019 to 2020, but the biggest increase occurred in the American Indian or Alaska Native populations (34.3% increase, from 20.1 to 27 per 100,000 people). Additionally, over the first two decades of the 21st century, mortality rates resulting from cirrhosis were highest among the American Indian and Alaska Native populations, according to a recently published systematic analysis of US health disparities across five racial/ethnic groups.
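
As a quick arithmetic check, the cited per-100,000 rates do imply the cited percentage:

$$\frac{27 - 20.1}{20.1} = \frac{6.9}{20.1} \approx 0.343 = 34.3\%$$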

Discrepancies in these and other minority groups may be due partly to genetic mechanisms, such as the relatively higher frequency of the PNPLA3 G/G polymorphism, a known risk factor for the development of advanced ALD, among those with Native American ancestry. A host of complex socioeconomic factors, such as income discrepancies and access to care, likely contribute too.

Evidence suggests that alcohol screening interventions are not applied equally across various racial and ethnic groups, Dr. Koob noted.

“For instance, Subbaraman and colleagues reported that, compared to non-Hispanic White patients, those who identify as Hispanic, Black, or other race or ethnicity were less likely to be screened for alcohol use during visits to healthcare providers. This was particularly true for those with a high school education or less,” he told this news organization. “However, other studies have not found such disparities.”
 

ALD Rates High in Young Adults, but the Tide May Be Changing

Globally, the prevalence of ALD has increased among both adolescents and young adults since the beginning of the 21st century. The global incidence of alcohol-associated hepatitis in recent years has been greatest among those aged 15-44 years.

In the United States, the increasing rate of ALD-related hospitalizations is primarily driven by the rise in cases of alcoholic hepatitis and acute-on-chronic liver failure among those aged 35 years and younger.

ALD is now the most common indication for liver transplant in those younger than 40 years of age, having increased fourfold between 2003 and 2018.

From 2009 to 2016, people aged 25-34 years experienced the highest average annual increase in cirrhosis-related mortality (10.5%), a trend the authors noted was “driven entirely by alcohol-related liver disease.”

Younger adults may be more susceptible to ALD due to the way they drink.

In a 2021 analysis of the National Health and Nutrition Examination Survey database, the weighted prevalence of harmful alcohol use was 29.3% in those younger than 35 years, compared with 16.9% in those aged 35-64 years. Higher blood alcohol levels resulting from binge drinking may make patients more susceptible to bacterial translocation and liver fibrosis and can increase the likelihood of cirrhosis in those with an underlying metabolic syndrome.

Yet, Dr. Koob said, thinking of “young adults” as one cohort may be misguided because he’s found very different attitudes toward alcohol within that population. Cross-sectional survey data obtained from more than 180,000 young adults indicated that alcohol abstinence increased between 2002 and 2018. Young adults report various reasons for not drinking, ranging from lack of interest to financial and situational barriers (eg, not wanting to interfere with school or work).

“The tide is coming in and out at the same time,” he said. “Younger people under the age of 25 are drinking less each year, are increasingly interested in things like Dry January, and more than half view moderate levels of consumption as unhealthy. People who are 26 years and older are drinking more, are not as interested in cutting back or taking breaks, and are less likely to consider 1 or 2 drinks per day as potentially unhealthy.”

Dr. Koob hopes the positive trends around alcohol in the under-25 set will prove not only resilient but, someday, dominant.

“We have seen historic increases in alcohol consumption in the last few years — the largest increases in more than 50 years. But we are hopeful that, as the younger cohorts age, we will see lower levels of drinking by adults in mid-life and beyond.”
 

A version of this article first appeared on Medscape.com.

 


Three AI Technologies Poised to Transform IBD Care

Article Type
Changed
Mon, 07/15/2024 - 15:47

By now, it is widely accepted that artificial intelligence (AI) will reshape contemporary medicine. The question is simply when that prospect will become an everyday reality. For gastroenterologists involved in the management of inflammatory bowel disease (IBD), the waiting period may be ending.

AI “is the next step in clinical care,” Jacob Kurowski, MD, medical director of pediatric inflammatory bowel diseases at Cleveland Clinic Children’s in Cleveland, Ohio, said in an interview.

“In terms of technological breakthroughs, this is like going from some of the more rigid endoscopies to high-definition and white-light endoscopy or the upgrade from paper charts to the electronic medical record (EMR), but instead of making your life more difficult, it will actually make it a lot easier,” said Dr. Kurowski, who has researched and lectured on AI applications in IBD.

Simply put, “AI is when algorithms use data to simulate human intelligence,” said Seth A. Gross, MD, clinical chief in the Division of Gastroenterology and Hepatology at NYU Langone Health and a professor at NYU Grossman School of Medicine, New York City, who has studied the use of AI for polyp detection.

IBD is ideally served by AI because to diagnose and manage the disease, gastroenterologists must gather, analyze, and weave together a particularly heterogeneous mix of information — from blood tests and imaging to patient-reported symptoms and family history — often stored in different places or formats. And to ensure patient participation in their care plans, gastroenterologists also need to help them understand this complex disease.

Because of their potential to aid gastroenterologists with these tasks, three core AI technologies — some of which already have commercial applications — are likely to become foundational in clinical practice in the coming years: Image analysis and processing, natural language processing (NLP), and generative AI, according to experts familiar with AI research in IBD.
 

Image Analysis and Processing

One of AI’s most promising applications for IBD care is in medical image and video processing and analysis. Emerging AI tools convert the essential elements of medical images into mathematical features, which they then use to train and refine themselves. The ultimate goal is to provide fast, accurate, and granular results without inter- and intraobserver variation and human potential for bias.

Today’s techniques don’t quantify IBD very well because they’re qualitative and subjective, Ryan Stidham, MD, associate professor of gastroenterology and computational medicine and bioinformatics at the University of Michigan, Ann Arbor, and a leading researcher in AI applications in IBD, said in an interview.

“Even standardized scoring systems used by the US Food and Drug Administration and the European Medicines Agency to assess disease severity and measure therapeutic response are still pretty crude systems — not because of the gastroenterologists interpreting them, who are smart — but because it’s a very difficult task to quantify these features on imaging,” he said.

Another appeal of the technology in IBD care is that it has capabilities, including complex pattern recognition, beyond those of physicians.

“What we can’t do is things such as tediously measure every single ulcer, count how many different disease features are seen throughout the entire colon, where they are and how they’re spatially correlated, or what are their color patterns,” Dr. Stidham said. “We don’t have the time, feasibility, or, frankly, the energy and cognitive attention span to be able to do that for one patient, let alone every patient.”

AI-based disease activity assessments have yielded promising results across multiple imaging systems. The technology has advanced rapidly in the last decade and is beginning to demonstrate the ability to replicate, almost perfectly, the endoscopic interpretation of human experts.

In separate studies, AI models had high levels of agreement with experienced reviewers on Mayo endoscopic scores and ulcerative colitis endoscopic index of severity scores, and they reduced the review time of pan-enteric capsule endoscopy among patients with suspected Crohn’s disease from a range of 26-39 minutes to 3.2 minutes per patient.
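
To make the general approach concrete, here is a rough, self-contained sketch — ours, not the architecture of any study cited here — of how an ImageNet-pretrained network can be repurposed to emit a standardized endoscopic score, with a hypothetical four-class head standing in for Mayo subscores 0-3:

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 4  # hypothetical Mayo endoscopic subscores 0-3

# Start from an ImageNet-pretrained backbone and replace the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Standard preprocessing for ImageNet-pretrained backbones.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def score_frame(image) -> int:
    """Return the predicted subscore (0-3) for one endoscopic frame.
    `image` is a PIL image; in practice the model would first be
    fine-tuned on expert-labeled endoscopy frames."""
    model.eval()
    with torch.no_grad():
        logits = model(preprocess(image).unsqueeze(0))
    return int(logits.argmax(dim=1).item())
```

Scoring a full-motion video is then, conceptually, a matter of running every frame (or a sample of frames) through such a model and aggregating the outputs.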

A report from the PiCaSSO study showed that an AI-guided system could distinguish remission/inflammation using histologic assessments of ulcerative colitis biopsies with an accuracy rate close to that of human reviewers.

In Crohn’s disease, research indicates that cross-sectional enterography imaging could potentially be made more precise with AI, providing hope that radiologists will be freed from this time-consuming task.

“As of today, several commercial companies are producing tools that can take an endoscopic image or a full-motion video and more or less give you a standardized score that would be akin to what an expert would give you on review of a colonoscopy,” Dr. Stidham said.

This is not to say there isn’t room for improvement.

“There’s probably still a bit of work to do when looking for the difference between inflammation and adenoma,” said Dr. Kurowski. “But it’s coming sooner rather than later.”
 

 

 

NLP

NLP — a subset of applied machine learning that essentially teaches computers to read — enables automated systems to go through existing digital information, including text like clinical notes, and extract, interpret, and quantify it in a fraction of the time required by clinicians.

One area this type of AI can help in IBD care is by automating EMR chart reviews. Currently, clinicians often must conduct time-consuming reviews to gather and read all the information they need to manage the care of patients with the disease.

Evidence suggests that this task takes a considerable toll. In a 2023 report, gastroenterologists cited hassles with EMRs and too much time spent at work among the main contributors to burnout.

Applied across entire EMR systems, NLP could improve overall IBD care.

“We have 30-40 years of EMRs available at our fingertips. These reams of clinical data are just sitting out there and provide a longitudinal narrative of what’s happened to every patient and the changes in their treatment course,” Dr. Stidham said.

Results from several studies involving NLP are promising. Automated chart review models enhanced with NLP have been shown to be better at identifying patients with Crohn’s disease or ulcerative colitis and at detecting and inferring the activity status of extraintestinal manifestations of IBD than models using only medical codes.
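
To illustrate the input/output shape of such extraction, here is a deliberately simple, hypothetical sketch using keyword rules; production systems use trained clinical language models rather than regexes, and the keyword map and negation handling below are ours:

```python
import re

# Hypothetical keyword map for a few extraintestinal manifestations (EIMs)
# of IBD; a real system would use a trained clinical NLP model.
EIM_PATTERNS = {
    "erythema nodosum": r"\berythema\s+nodosum\b",
    "uveitis": r"\buveitis\b",
    "primary sclerosing cholangitis": r"\b(psc|primary\s+sclerosing\s+cholangitis)\b",
}

# Crude negation cue: skip mentions shortly preceded by a negating word.
NEGATION = r"\b(no|denies|negative\s+for|without)\b[^.]{0,40}"

def extract_eims(note: str) -> dict[str, bool]:
    """Return, for each EIM, whether the note mentions it affirmatively."""
    found = {}
    for name, pattern in EIM_PATTERNS.items():
        affirmative = re.search(pattern, note, re.IGNORECASE)
        negated = re.search(NEGATION + pattern, note, re.IGNORECASE)
        found[name] = bool(affirmative and not negated)
    return found

note = "Denies uveitis. Exam notable for erythema nodosum on both shins."
print(extract_eims(note))
# {'erythema nodosum': True, 'uveitis': False,
#  'primary sclerosing cholangitis': False}
```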

Additional examples of NLP applications that could save physicians’ time and energy in everyday practice include automatically generating clinical notes, summarizing patient interactions, and flagging important information for follow-up.

For time-strapped, overburdened clinicians, NLP may even restore the core aspects of care that first attracted them to the profession, Dr. Kurowski noted.

“It might actually be the next best step to get physicians away from the computer and back to being face to face with the patient, which I think is one of the biggest complaints of everybody in the modern EMR world in that we live in,” he said.
 

Generative AI

Patient education likely will be reshaped by emerging AI applications that can generate digital materials in a conversational tone. These generative AI tools, including advanced chatbots, are powered by large language models, a type of machine learning model trained on vast amounts of text data to understand and generate natural language.

This technology will be familiar to anyone who has interacted with OpenAI’s ChatGPT, which after getting a “prompt” — a question or request — from a user provides a conversational-sounding reply.
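
As a minimal sketch of that prompt-and-reply loop — assuming the OpenAI Python client (v1.x), with a model name, system prompt, and function that are illustrative rather than the configuration of any deployed system described here:

```python
from openai import OpenAI  # assumes the `openai` Python package, v1.x

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You answer common pre-colonoscopy preparation questions in plain "
    "language and tell patients to contact their care team for anything "
    "urgent or specific to their case."
)

def answer(question: str) -> str:
    """Send one patient question as the prompt and return the reply."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("Can I drink black coffee the morning of my colonoscopy?"))
```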

“Chatbots have been around for a while, but what’s new is that they now can understand and generate language that’s far more realistic,” Dr. Stidham said. “Plus, they can be trained on clinical scenarios so that it can put individual patients into context when having that digital, AI-powered conversation.”

In IBD, chatbots are being used to educate patients, for example, by answering their questions before they undergo colonoscopy. In a recent analysis, the best performer of three chatbots answered 91.4% of common precolonoscopy questions accurately. Other research determined that chatbot responses to colonoscopy questions were comparable with those provided by gastroenterologists.

Dr. Stidham and colleagues have seen the technology’s potential firsthand at the University of Michigan, where they’ve successfully deployed commercial chatbots to interact with patients prior to colonoscopy.

“It’s a force multiplier, in that these chatbots are essentially allowing us to expand our staff without bringing in more humans,” he said.

Despite fears that AI will threaten healthcare jobs, that isn’t an issue in today’s environment where “we can’t hire enough help,” Dr. Stidham said.

However, this technology isn’t fully ready for large-scale implementation, he added.

“ChatGPT may be ready for general medicine, but it’s not taking care of my gastroenterology patients (yet),” Dr. Stidham and coauthors wrote in a recent article. Among their concerns was the inability of ChatGPT versions 3 and 4 to pass the American College of Gastroenterology’s self-assessment test.
 

 

 

Preparing for the Future of AI

AI technology is advancing rapidly, and gastroenterologists need to be prepared for its integration into clinical practice. One proactive step is engaging with professional societies and initiatives aimed at guiding AI implementation.

One such initiative is the American Society for Gastrointestinal Endoscopy’s AI Task Force, which is led by Dr. Gross.

“The AI Task Force, which has recently evolved into an AI institute, believes in responsible AI,” Dr. Gross said. “The group highlights the importance of transparency and partnership with all key stakeholders to ensure that AI development and integration deliver improved care to GI patients.”

Dr. Kurowski, for one, believes that as AI gets even better at quantifying patient data, it will usher in the long-sought era of personalized care.

“I think it actually moves us into the realm of talking about a cure for certain people with IBD, for certain subtypes of the disease,” he said. “AI is going to be much more your friend and less of your foe than anything else you’ve seen in the modern era of medicine.”

A version of this article first appeared on Medscape.com.


Celiac Disease: Five Things to Know

Article Type
Changed
Fri, 06/07/2024 - 16:34

Celiac disease is a chronic, immune-mediated, systemic disorder caused by intolerance to gluten — a protein present in rye, barley, and wheat grains — that affects genetically predisposed individuals.

Due to its wide spectrum of clinical manifestations, celiac disease resembles a multisystemic disorder. Its most common gastrointestinal (GI) symptoms include chronic diarrhea, weight loss, and abdominal distention. However, celiac disease can also manifest in myriad extraintestinal symptoms, ranging from headache and fatigue to delayed puberty and psychiatric disorders, with differing presentations in children and adults.

To date, the only treatment is adopting a gluten-free diet (GFD). Although key to preventing persistent villous atrophy, the main cause of complications in celiac disease, lifelong adherence to GFD is challenging and may not resolve all clinical issues. These shortcomings have driven recent efforts to develop novel therapeutic options for patients with this disease.

Here are five things to know about celiac disease.
 

1. Rising Prevalence of Celiac Disease and Other Autoimmune Disorders Suggests Environmental Factors May Be at Play

Gluten was first identified as the cause of celiac disease in the 1950s. At that time, the condition was thought to be a relatively rare GI disease of childhood that primarily affected people of European descent, but it is now known to be a common disease affecting those of various ages, races, and ethnicities.

A 2018 meta-analysis found the pooled global prevalence of celiac disease was 1.4%. Incidence has increased by as much as 7.5% annually over the past several decades.

Increased awareness among clinicians and improved detection likely play a role in the trend. However, the growth in celiac disease is consistent with that seen for other autoimmune disorders, according to a 2024 update of evidence surrounding celiac disease. Shared environmental factors have been proposed as triggers for celiac disease and other autoimmune diseases and appear to be influencing their rise, the authors noted. These factors include migration and population growth, changing dietary patterns and food processing practices, and altered wheat consumption.
 

2. No-Biopsy Diagnosis Is Accepted for Children and Shows Promise for Adults

It is estimated that almost 60 million people worldwide have celiac disease, but most remain undiagnosed or misdiagnosed, or they experience significant diagnostic delays.

Prospective data indicate that children with first-degree relatives with celiac disease are at a significantly higher risk of developing the condition, which should prompt screening efforts in this population.

The 2023 updated guidelines from the American College of Gastroenterology (ACG) state that serology plays a central role in screening. This commonly involves testing for serological markers of the disease, including total immunoglobulin A (IgA), anti-tissue transglutaminase IgA (tTG-IgA), anti-deamidated gliadin peptide antibodies, or endomysial antibodies.

To confirm diagnosis, clinicians have relied on intestinal biopsy since the late 1950s. The ACG still recommends esophagogastroduodenoscopy with multiple duodenal biopsies for confirmation of diagnosis in both children and adults with suspicion of celiac disease. However, recent years have seen a shift toward a no-biopsy approach.

For more than a decade in Europe, a no-biopsy approach has been established practice in pediatric patients, for whom the burden of obtaining a histological confirmation is understandably greater. Most guidelines now permit children to be diagnosed with celiac disease in the absence of a biopsy under specific circumstances (eg, characteristic symptoms of celiac disease and tTG-IgA levels > 10 times the upper limit of normal). The ACG guidelines state that “this approach is a reasonable alternative to the standard approach to a [celiac disease] diagnosis in selected children.”

The ACG does not recommend a no-biopsy approach in adults, noting that, in comparison with children, there is a relative lack of data indicating that serology is predictive in this population. However, it does recognize that physicians may encounter patients for whom a biopsy diagnosis may not be safe or practical. In such cases, an “after-the-fact” diagnosis of likely celiac disease can be given to symptomatic adult patients with a ≥ 10-fold elevation of tTG-IgA and a positive endomysial antibody in a second blood sample.

A 2024 meta-analysis of 18 studies involving 12,103 adult patients from 15 countries concluded that a no-biopsy approach using tTG-IgA antibody levels ≥ 10 times the upper limit of normal was highly specific and predictive of celiac disease.
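
Expressed as decision logic, the adult "after-the-fact" criterion described above looks roughly like the following simplified sketch; the variable names are ours, and real diagnosis weighs far more clinical context:

```python
def likely_celiac_no_biopsy(ttg_iga: float, uln: float,
                            ema_positive_second_sample: bool,
                            symptomatic: bool) -> bool:
    """Simplified sketch of the ACG 'after-the-fact' criterion for adults:
    symptoms, plus tTG-IgA >= 10x the upper limit of normal (ULN), plus a
    positive endomysial antibody (EMA) on a second blood sample.
    Illustrative only; not a clinical algorithm."""
    return symptomatic and ttg_iga >= 10 * uln and ema_positive_second_sample

# Example: tTG-IgA of 120 U/mL with a ULN of 10 U/mL is a 12-fold elevation.
print(likely_celiac_no_biopsy(120.0, 10.0, True, True))  # True
```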
 

 

 

3. Celiac Disease Is Associated With Several Life-Threatening Conditions

Emerging data indicate that gastroenterologists should be vigilant in screening patients with celiac disease for several other GI conditions.

Inflammatory bowel disease and celiac disease have a strong bidirectional association, suggesting a possible genetic link between the conditions and indicating that physicians should consider the alternate diagnosis when symptoms persist after treatment.

Given the hypervigilance around food and diet inherent to celiac disease, patients are at an increased risk of developing avoidant/restrictive food intake disorder, according to a 2022 retrospective study.

In 2023, Italian investigators showed that children with celiac disease have an elevated prevalence of functional GI disorders even after adopting a GFD for a year, regardless of whether they consumed processed or natural foods. It was unclear whether this was due to a chronic inflammatory process or to nutritional factors.

Complications resulting from celiac disease are not limited to GI disorders. For a variety of underlying pathophysiological reasons, including intestinal permeability, hyposplenism, and malabsorption of nutrients, patients with celiac disease may be at a higher risk for non-GI conditions, such as osteopenia, women's health disorders (eg, ovarian failure, endometriosis, or pregnancy loss), juvenile idiopathic arthritis in children and rheumatoid arthritis in adults, certain forms of cancer, infectious diseases, and cardiomyopathy.
 

4. GFD Is the Only Treatment, but It’s Imperfect and Frustrating for Patients

GFD is the only treatment for celiac disease and must be adhered to without deviation throughout a patient’s life.

Maintaining unwavering adherence reaps considerable benefits: Improved clinical symptoms, robust mucosal healing, and normalization of serological markers. Yet it also takes a considerable toll on patients. Patients with celiac disease struggle with a host of negative physical, psychological, and social impacts. They also report a higher treatment burden than those with gastroesophageal reflux disease or hypertension, and one comparable with that of end-stage renal disease.

GFD also poses financial challenges. Although the price of gluten-free products has decreased in recent years, they still cost significantly more than items with gluten.

Adherence to GFD does not always equate to complete mucosal recovery. While mucosal recovery is achieved in 95% of children within 2 years of the diet’s adoption, only 34% and 66% of adults obtain it within 2 and 5 years, respectively.

GFD may lead to nutrient imbalances because gluten-free foods are typically low in alimentary fiber, micronutrients (eg, vitamin D, vitamin B12, or folate), and minerals (eg, iron, zinc, magnesium, or calcium). With higher sugar and fat content, GFD may leave patients susceptible to unwanted weight gain.

The pervasiveness of gluten in the food production system makes the risk for cross-contamination high. Gluten is often found in both naturally gluten-free foods and products labeled as such. Gluten-sensing technologies, some of which can be used via smartphone apps, have been developed to help patients identify possible cross-contamination. However, the ACG guidelines recommend against the use of these technologies until there is sufficient evidence supporting their ability to improve adherence and clinical outcomes.
 

5. Novel Therapies for Celiac Disease Are in the Pipeline

The limitations of GFD as the standard treatment for celiac disease have led to an increased focus on developing novel therapeutic interventions. They can be sorted into five key categories: Modulation of the immunostimulatory effects of toxic gluten peptides, elimination of toxic gluten peptides before they reach the intestine, induction of gluten tolerance, modulation of intestinal permeability, and restoration of gut microbiota balance.

Three therapies designed to block antigen presentation by HLA-DQ2/8, the gene alleles that predispose people to celiac disease, show promise: TPM502, an agent that contains three gluten-specific antigenic peptides with overlapping T-cell epitopes for the HLA-DQ2.5 gene; KAN-101, designed to induce gluten tolerance by targeting receptors on the liver; and DONQ52, a multi-specific antibody that targets HLA-DQ2. The KAN-101 therapy received Fast Track designation by the US Food and Drug Administration in 2022.

These and several other agents in clinical and preclinical development are discussed in detail in a 2024 review article. Although no therapy has yet reached phase 3 testing, when one does, it will undoubtedly be welcomed by those with celiac disease.

A version of this article first appeared on Medscape.com.


Speedy Eating and Late-Night Meals May Take a Toll on Health

Article Type
Changed
Fri, 04/19/2024 - 11:19

You are what you eat, as the adage goes. But a growing body of evidence indicates that it’s not just what and how much you eat that influence your health. How fast and when you eat also play a role.

Research now indicates that these two factors may affect the risk for gastrointestinal problems, obesity, and type 2 diabetes (T2D). Because meal timing and speed of consumption are modifiable, they present new opportunities to change patient behavior to help prevent and perhaps address these conditions.

Not So Fast

Most people are well acquainted with the short-term gastrointestinal effects of eating too quickly, which include indigestion, gas, bloating, and nausea. But regularly eating too fast can cause long-term consequences.

Obtaining a sense of fullness is key to staving off overeating and excess caloric intake. However, it takes approximately 20 minutes for the stomach to alert the brain to feelings of fullness. Eat too quickly and the fullness signal might not register until you have already consumed more calories than intended. Research links this habit to excess body weight.

The practice also can lead to gastrointestinal diseases over the long term because overeating causes food to remain in the stomach longer, thus prolonging the time that the gastric mucosa is exposed to gastric acids.

A study of 10,893 adults in Korea reported that those with the fastest eating speed (< 5 min/meal) had a 1.7 times greater likelihood of endoscopic erosive gastritis than those with the slowest times (≥ 15 min/meal). Faster eating also was linked to increased risk for functional dyspepsia in a study involving 89 young-adult female military cadets in Korea with relatively controlled eating patterns.

On the extreme end of the spectrum, researchers who performed an assessment of a competitive speed eater speculated that the observed physiological accommodation required for the role (expanding the stomach to form a large flaccid sac) makes speed eaters vulnerable to morbid obesity, gastroparesis, intractable nausea and vomiting, and the need for gastrectomy.

The risk for metabolic changes and the eventual development of T2D also appears to be linked to how quickly food is consumed.

Two clinical studies conducted in Japan (a cohort study of 2050 male factory workers and a nationwide study with 197,825 participants) identified a significant association between faster eating and both T2D and insulin resistance. A case-control study involving 234 patients with new-onset T2D and 468 controls from Lithuania linked faster eating to a greater than twofold risk for T2D. And a Chinese cross-sectional study of 7972 adults indicated that faster eating significantly increased the risk for metabolic syndrome, elevated blood pressure, and central obesity.

Various hypotheses have been proposed to explain why fast eating may upset metabolic processes, including a delayed sense of fullness contributing to spiking postprandial glucose levels, lack of time for mastication causing higher glucose concentrations, and the triggering of specific cytokines (eg, interleukin-1 beta and interleukin-6) that lead to insulin resistance. It is also possible that the association is the result of people who eat quickly having relatively higher body weights, which translates to a higher risk for T2D.

However, there’s an opportunity in the association of rapid meal consumption with gastrointestinal and metabolic diseases, as people can slow the speed at which they eat so they feel full before they overeat.

A 2019 study in which 21 participants were instructed to eat a 600-kcal meal at a “normal” or “slow” pace (6 minutes or 24 minutes) found that the latter group reported feeling fuller while consuming fewer calories.
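As a back-of-the-envelope illustration (not part of the study itself), the two paced conditions imply a fourfold difference in the rate of energy intake. A minimal Python sketch of that arithmetic:

```python
# Eating-rate arithmetic for the paced-meal study described above:
# the same 600-kcal meal eaten over 6 minutes vs 24 minutes.
def eating_rate_kcal_per_min(meal_kcal: float, minutes: float) -> float:
    """Average energy intake rate for a meal."""
    return meal_kcal / minutes

normal_pace = eating_rate_kcal_per_min(600, 6)   # 100.0 kcal/min
slow_pace = eating_rate_kcal_per_min(600, 24)    # 25.0 kcal/min

# The "normal" pace delivers energy 4x faster, finishing well before
# the ~20-minute satiety signal described earlier has time to register.
print(normal_pace, slow_pace, normal_pace / slow_pace)
```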

This approach may not work for all patients, however. There’s evidence to suggest that tactics to slow down eating may not limit the energy intake of those who are already overweight or obese.

Patients with obesity may physiologically differ in their processing of food, according to Michael Camilleri, MD, consultant in the Division of Gastroenterology and Hepatology at Mayo Clinic in Rochester, Minnesota.

“We have demonstrated that about 20%-25% of people with obesity actually have rapid gastric emptying,” he told this news organization. “As a result, they don’t feel full after they eat a meal and that might impact the total volume of food that they eat before they really feel full.”

 

 

The Ideal Time to Eat

It’s not only the speed at which individuals eat that may influence outcomes but when they take their meals. Research indicates that eating earlier in the day to align meals with the body’s circadian rhythms in metabolism offers health benefits.

“The focus would be to eat a meal that syncs during those daytime hours,” Collin Popp, PhD, MS, RD, a research scientist at the NYU Grossman School of Medicine in New York, told this news organization. “I typically suggest patients have their largest meal in the morning, whether that’s a large or medium-sized breakfast, or a big lunch.”

A recent cross-sectional study of 2050 participants found that having the largest meal at lunch protected against obesity (odds ratio [OR], 0.71), whereas having it at dinner increased the risk for obesity (OR, 1.67) and led to a higher body mass index.
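Odds ratios are easy to misread as risk ratios. As a hedged illustration of what the reported ORs imply, the sketch below converts them into probabilities; the 30% baseline prevalence is an assumed figure for illustration, not a number from the study.

```python
# Convert an odds ratio into an implied probability, given a baseline
# probability. The ORs come from the study above; the baseline
# prevalence is purely an assumption for illustration.
def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    baseline_odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds_ratio * baseline_odds
    return new_odds / (1 + new_odds)

baseline = 0.30  # assumed baseline obesity prevalence
print(apply_odds_ratio(baseline, 0.71))  # ~0.23 (largest meal at lunch)
print(apply_odds_ratio(baseline, 1.67))  # ~0.42 (largest meal at dinner)
```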

Consuming the majority of calories in meals earlier in the day may have metabolic health benefits, as well.

A 2015 randomized controlled trial involving 18 adults with obesity and T2D found that eating a high-energy breakfast and a low-energy dinner led to reduced hyperglycemia throughout the day compared with eating a low-energy breakfast and a high-energy dinner.

Time-restricted eating (TRE), a form of intermittent fasting, also can improve metabolic health depending on the time of day.

A 2023 meta-analysis found that TRE was more effective at reducing fasting glucose levels in participants with overweight and obesity if done earlier rather than later in the day. Similarly, a 2022 study involving 82 healthy patients without diabetes or obesity found that early TRE was more effective than mid-day TRE at improving insulin sensitivity, and that it improved fasting glucose and reduced total body mass and adiposity, while mid-day TRE did not.
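To make the distinction concrete, here is a minimal sketch of how an "early" versus "mid-day" eating window might be encoded. The specific window times are illustrative assumptions, not protocol definitions from the studies above.

```python
from datetime import time

# Hypothetical eating windows for illustration only; the cited
# studies define their own protocols.
EARLY_TRE = (time(7, 0), time(15, 0))     # eg, 07:00-15:00
MIDDAY_TRE = (time(11, 0), time(19, 0))   # eg, 11:00-19:00

def within_window(meal: time, window: tuple[time, time]) -> bool:
    """True if a meal time falls inside the eating window."""
    start, end = window
    return start <= meal <= end

dinner = time(18, 30)
print(within_window(dinner, EARLY_TRE))   # False: outside the early window
print(within_window(dinner, MIDDAY_TRE))  # True
```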

A study that analyzed the effects of TRE in eight adult men with overweight and prediabetes found “better insulin resistance when the window of food consumption was earlier in the day,” noted endocrinologist Beverly Tchang, MD, an assistant professor of clinical medicine at Weill Cornell Medicine with a focus on obesity medicine.

Patients May Benefit From Behavioral Interventions

Patients potentially negatively affected by eating too quickly or at late hours may benefit from adopting behavioral interventions to address these tendencies. To determine if a patient is a candidate for such interventions, Dr. Popp recommends starting with a simple conversation.

“When I first meet patients, I always ask them to describe to me a typical day for how they eat — when they’re eating, what they’re eating, the food quality, who are they with — to see if there’s social aspects to it. Then try and make the recommendations based on that,” said Dr. Popp, whose work focuses on biobehavioral interventions for the treatment and prevention of obesity, T2D, and other cardiometabolic outcomes.

Dr. Tchang said she encourages her patients to be mindful of hunger and fullness cues.

“Eat if you’re hungry; don’t force yourself to eat if you’re not hungry,” she said. “If you’re not sure whether you’re hungry or not, speak to a doctor because this points to an abnormality in your appetite-regulation system, which can be helped with GLP-1 [glucagon-like peptide 1] receptor agonists.”

Adjusting what patients eat can help them improve their meal timing.

“For example, we know that a high-fiber diet or a diet that has a large amount of fat in it tends to empty from the stomach slower,” Dr. Camilleri said. “That might give a sensation of fullness that lasts longer and that might prevent, for instance, the ingestion of the next meal.”

Those trying to eat more slowly are advised to seek out foods that are hard in texture and minimally processed.

A study involving 50 patients with healthy weights found that hard foods are consumed more slowly than soft foods and that energy intake is lowest with hard, minimally processed foods. Combining hard-textured foods with explicit instructions to reduce eating speed has also been shown to be an effective strategy. For those inclined to seek out a technology-based solution, evidence suggests that a self-monitoring wearable device can slow the eating rate.

Although the evidence is mounting that the timing and duration of meals have an impact on certain chronic diseases, clinicians should remember that these two factors are far from the most important contributors, Dr. Popp said.

“We also have to consider total caloric intake, food quality, sleep, alcohol use, smoking, and physical activity,” he said. “Meal timing should be considered as under the umbrella of health that is important for a lot of folks.”

A version of this article appeared on Medscape.com.


Eosinophilic Esophagitis: 5 Things to Know

Article Type
Changed
Thu, 02/15/2024 - 15:36

Eosinophilic esophagitis (EoE) is a chronic inflammatory disease of the esophagus that affects both children and adults. EoE is defined by symptoms of esophageal dysfunction (eg, dysphagia, vomiting, difficulty in feeding), with presentation varying depending on patient age.

The global incidence of EoE has increased in recent decades. In the United States alone, EoE is estimated to affect approximately 150,000 people and result in as much as $1.4 billion in annual healthcare costs.

There currently is no clear treatment hierarchy for EoE, and long delays between symptom onset and diagnosis are common.

Still, the knowledge base surrounding the disease is growing, and existing interventions have shown tremendous success at curbing symptoms and disease progression. The recent approvals of a monoclonal antibody and the first oral agent for EoE treatment have suddenly expanded medication options — at a time when other promising therapies are being investigated too.

To help clinicians stay up to date on the latest information on this debilitating disease, here are five things to know about EoE.

1. EoE prevalence is increasing although not consistently around the globe.

EoE was first recognized as a distinct clinical entity in the early 1990s, when it was considered a relatively rare disease. Now, the incidence and prevalence rates of EoE are escalating at rates that cannot be explained by increased disease awareness and detection. 

Although EoE has been diagnosed in Latin America, the Middle East, and Asia, such instances are relatively uncommon in comparison with the spiking rates noted in the United States; in Western Europe, including Denmark, the Netherlands, and Switzerland; and in Australia.

Emerging data suggest that climate and location may be a factor in the varying incidence rates of EoE. An analysis of 233,649 patients in a US pathology database reported that EoE was more common in cold and arid climate zones than in tropical zones. Another study suggests that EoE is more common in low-density, rural environments compared with urban settings. 

2. Environmental and food exposures may trigger EoE, and genetics probably play a role.

The unequal geographic distribution of EoE lends credence to the theory that external triggers, which naturally differ in various locales, play an outsized role in its development.

Mouse studies have indicated that the inhalation of allergens induces notable eosinophil infiltration and degranulation, and a pilot study conducted in New York City found that EoE symptoms peaked during the July-to-September period, when grass pollen counts were at their highest.

Early-life factors that can result in alteration to the microbiome have also been identified as possibly influencing EoE development. They include cesarean delivery, preterm delivery, admission to a neonatal intensive care unit, infant formula use, and maternal or infant use of antibiotics. Conversely, evidence suggests that Helicobacter pylori infection may be protective against EoE due to immunomodulating effects that have not yet been sufficiently identified in the literature.

Yet, the clearest association between EoE and outside triggers is found with food exposures. In one analysis of pediatric patients, the items that were most commonly associated with elevated food-specific serum immunoglobulin E antibodies in patients with EoE were milk (78%), wheat (69%), eggs (64%), peanuts (54%), and soy (51%). Food allergies are also on the uptick in countries with rising EoE rates, suggesting that the two trends may be interrelated.

From a genetic standpoint, EoE is more likely to develop in those with first-degree relatives with the disease than in the general population. Thirty independent genes thought to be associated with EoE have been identified. EoE is also significantly more common in men than in women.

3. Diagnosis requires knowing the symptoms, excluding other disorders, and performing biopsy.

EoE can occur early in life, with approximately one third of children with the disease presenting under age 5 years. The prevalence rises with age, eventually peaking in those aged 35-45 years. 

The presentation of EoE can be quite variable depending on patient age. Pediatric patients are significantly more likely to experience failure to thrive, vomiting, and heartburn, whereas their adult counterparts more often present with food impaction and dysphagia. 

At the 2018 AGREE international consensus conference, researchers defined the diagnostic criteria as the presence of esophageal dysfunction symptoms; the exclusion of non-EoE disorders, such as gastroesophageal reflux disease and achalasia; and esophageal biopsy findings of at least 15 eosinophils per high-power field (or approximately 60 eosinophils per mm²).
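The per-field and per-area thresholds are linked by the area of a high-power field; the figures above imply a field area of roughly 0.25 mm². A minimal sketch of that conversion, with the field area treated as an assumption (it varies by microscope):

```python
# Convert an eosinophil count per high-power field (hpf) into a
# density per mm^2. The hpf area depends on the microscope; the
# ~0.25 mm^2 default is inferred from the thresholds cited above.
def eos_per_mm2(eos_per_hpf: float, hpf_area_mm2: float = 0.25) -> float:
    return eos_per_hpf / hpf_area_mm2

print(eos_per_mm2(15))  # 60.0 eosinophils/mm^2, the diagnostic threshold
```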

Endoscopic findings can also be crucial in diagnosing EoE because patients with this disease often present with inflammatory patterns recognizable in the form of exudates, furrows, and edema and/or fibrotic phenotypes, such as the presence of rings and stenosis. Clinicians are advised to refer to the Endoscopic Reference Score proposed by Hirano and colleagues.

4. Treatment approaches rely on the ‘3 Ds.’

Although there is currently no leading strategy for the primary treatment of EoE, clinicians can avail themselves of suggested pathways.

The lack of a treatment hierarchy means that patients typically are very involved in selecting the therapy that works best for them. Physicians should be aware that patients researching EoE on their own might not find the information they need. A recent study found that the artificial intelligence tool ChatGPT was highly inaccurate when it came to providing answers about EoE.

The treatment strategies that clinicians and their patients can choose from revolve around the “3 Ds”: diet, drugs, and dilation. 

Diet:

Three dietary interventions are available for EoE treatment: 

  • Elemental diet, in which patients consume only an amino-acid based formula that does not include any intact proteins
  • Empiric elimination diet, which removes foods more commonly associated with food allergy regardless of whether there has been a positive allergy testing result
  • Allergy testing-directed food elimination, which involves avoidance of all foods for which specific antibodies were detected or that tested positive on skin-prick tests

Each of these dietary interventions has clear advantages and drawbacks that should be discussed with patients. Elemental diets achieve robust histologic responses, yet their highly restrictive nature makes compliance difficult and can greatly impair patients’ quality of life. 

Empiric elimination diets are the most popular choice and have shown high response rates. A common approach is to begin by removing six common foods (milk, wheat, egg, soy, nuts, and fish/seafood), which are then gradually reintroduced to identify the culprits. However, patients must be motivated to follow this process, and the likelihood it will be successful is greatly enhanced with assistance from a dietitian, which may not always be possible.
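Purely as a schematic of the stepwise logic described above (not a clinical protocol), the reintroduction phase can be thought of as a loop over the eliminated foods, with symptoms or histology reassessed after each food is added back:

```python
# Schematic of six-food elimination and stepwise reintroduction.
# `is_tolerated` stands in for the real-world reassessment step
# (symptoms and/or repeat endoscopy); it is a placeholder, not a
# clinical decision rule.
SIX_FOODS = ["milk", "wheat", "egg", "soy", "nuts", "fish/seafood"]

def identify_triggers(is_tolerated) -> list[str]:
    """Reintroduce eliminated foods one at a time; collect culprits."""
    triggers = []
    for food in SIX_FOODS:
        if not is_tolerated(food):
            triggers.append(food)  # remove again; food is a trigger
    return triggers

# Example with a stubbed assessment, for illustration only:
print(identify_triggers(lambda food: food not in {"milk", "wheat"}))
```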

Last, allergy testing-guided food elimination diets have been reported to produce remission rates of just under 50%, and the skin allergy tests they primarily rely on have been criticized for being unreliable.

Drugs:

The treatment of EoE experienced a significant advance in 2022 when dupilumab, a monoclonal antibody that binds to the interleukin (IL)–4 receptor alpha, became the first drug approved by the US Food and Drug Administration (FDA) for treating EoE in adults and pediatric patients aged 12 years or older. The drug was approved by the European Commission in 2023. In late January 2024, the FDA expanded dupilumab’s approval to children aged 1-11 years and weighing ≥ 15 kg after positive histologic remission and safety results were reported in the two-part phase 3 EoE KIDS trial. 

In addition, in February 2024 the FDA approved budesonide oral suspension, the first oral treatment for EoE.

These approvals have expanded treatment options beyond proton pump inhibitors (PPIs) and topical glucocorticosteroids, both of which received only nuanced recommendations for use under US and UK clinical guidelines. 

A recent meta-analysis found that PPIs, off-label and EoE-specific topical steroids, and biologics had greater efficacy than did placebo in achieving histological remission. However, significant heterogeneity in the included studies’ eligibility criteria and outcome measures prevented development of a “solid therapeutic hierarchy,” the authors noted. 

In addition, researchers are investigating therapies targeting IL-5 (eg, mepolizumabreslizumab, and benralizumab) and other key inflammatory mediators in EoE, such as Siglec-8 (lirentelimab), IL-13 (cendakimab), and the sphingosine 1–phosphate receptor (etrasimod)

Dilation:

Finally, patients with significant strictures can benefit from dilation performed via through-the-scope balloons or Savary-Gilliard bougies, which can significantly and immediately improve symptoms even if they cannot address the underlying inflammation. Concerns that dilation would lead to increased complications, such as perforation and mucosal tears, do not appear to be borne out by recent data

5. Reducing diagnosis delays is crucial for limiting EoE-associated morbidity.

Despite efforts to bring attention to EoE, evidence suggests that delays between symptom onset and diagnosis are common, and result in treatment delays. One study found a median lag time of 6 years. 

The longer the delay in treatment, the more likely patients are to develop esophageal rings, a long narrowing in the esophageal caliber, or focal strictures. For example, diagnostic delays of more than 20 years result in prevalence rates of 70.8% for esophageal strictures, compared with 17.2% with delays of 0-2 years. 

Simply put, the sooner one can identify EoE and begin treatment, the more likely patients are to be spared its worst effects. 
 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Eosinophilic esophagitis (EoE) is a chronic inflammatory disease of the esophagus that affects both children and adults. EoE is defined by symptoms of esophageal dysfunction (eg, dysphagia, vomiting, difficulty in feeding), with presentation varying depending on patient age.

The global incidence of EoE has increased in recent decades. In the United States alone, EoE is estimated to affect approximately 150,000 people and result in as much as $1.4 billion in annual healthcare costs.

There currently is no clear treatment hierarchy for EoE, and long delays between symptom onset and diagnoses are common.

Still, the knowledge base surrounding the disease is growing, and existing interventions have shown tremendous success at curbing symptoms and disease progression. The recent approvals of a monoclonal antibody and the first oral agent for EoE treatment have suddenly expanded medication options — at a time when other promising therapies are being investigated too.

To help clinicians stay up to date on the latest information on this debilitating disease, here are five things to know about EoE.

1. EoE prevalence is increasing, although not consistently, around the globe.

EoE was first recognized as a distinct clinical entity in the early 1990s, when it was considered a relatively rare disease. Now, the incidence and prevalence of EoE are escalating at rates that cannot be explained by increased disease awareness and detection alone. 

Although EoE has been diagnosed in Latin America, the Middle East, and Asia, such instances are relatively uncommon in comparison with the spiking rates noted in the United States; in Western Europe, including Denmark, the Netherlands, and Switzerland; and in Australia. 

Emerging data suggest that climate and location may be factors in the varying incidence rates of EoE. An analysis of 233,649 patients in a US pathology database reported that EoE was more common in cold and arid climate zones than in tropical zones. Another study suggests that EoE is more common in low-density, rural environments than in urban settings. 

2. Environmental and food exposures may trigger EoE, and genetics probably play a role.

The unequal geographic distribution of EoE lends credence to the theory that external triggers, which naturally differ in various locales, play an outsized role in its development.

Studies in mice have indicated that inhalation of allergens induces notable eosinophil infiltration and degranulation, and a pilot study conducted in New York City found that EoE symptoms peaked during the July-to-September period, when grass pollen counts were at their highest.

Early-life factors that can alter the microbiome have also been identified as possible influences on EoE development. They include cesarean delivery, preterm delivery, admission to a neonatal intensive care unit, infant formula use, and maternal or infant use of antibiotics. Conversely, evidence suggests that Helicobacter pylori infection may protect against EoE through immunomodulating effects that have not yet been fully characterized in the literature.

Yet, the clearest association between EoE and outside triggers is found with food exposures. In one analysis of pediatric patients, the items that were most commonly associated with elevated food-specific serum immunoglobulin E antibodies in patients with EoE were milk (78%), wheat (69%), eggs (64%), peanuts (54%), and soy (51%). Food allergies are also on the uptick in countries with rising EoE rates, suggesting that the two trends may be interrelated.

From a genetic standpoint, EoE is more likely to develop in those with first-degree relatives with the disease than in the general population. Thirty independent genes thought to be associated with EoE have been identified. EoE is also significantly more common in men than in women.

3. Diagnosis requires knowing the symptoms, excluding other disorders, and performing biopsy.

EoE can occur early in life, with approximately one third of children with the disease presenting under age 5 years. The prevalence rises with age, eventually peaking in those aged 35-45 years. 

The presentation of EoE can be quite variable depending on patient age. Pediatric patients are significantly more likely to experience failure to thrive, vomiting, and heartburn, whereas their adult counterparts more often present with food impaction and dysphagia. 

At the 2018 AGREE international consensus conference, researchers defined diagnostic criteria as presence of esophageal dysfunction symptoms; exclusion of non-EoE disorders, such as gastroesophageal reflux disease and achalasia; and esophageal biopsy findings of at least 15 eosinophils per high-power field (or approximately 60 eosinophils per mm²). 
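
For orientation, the per-field and per-area thresholds agree arithmetically if one assumes a representative high-power field (HPF) area of roughly 0.25 mm². That figure is an illustrative assumption, as the actual field area varies with the microscope used, which is why the per-mm² value is stated only as an approximation:

$$ \frac{15\ \text{eosinophils}}{1\ \text{HPF}} \times \frac{1\ \text{HPF}}{0.25\ \text{mm}^2} = 60\ \text{eosinophils per mm}^2 $$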

Endoscopic findings can also be crucial in diagnosing EoE because patients with this disease often present with inflammatory patterns recognizable in the form of exudates, furrows, and edema and/or fibrotic phenotypes such as rings and stenosis. Clinicians are advised to refer to the Endoscopic Reference Score (EREFS) proposed by Hirano and colleagues.

4. Treatment approaches rely on the ‘3 Ds.’

Although there is currently no leading strategy for the primary treatment of EoE, clinicians can avail themselves of suggested pathways.

The lack of a treatment hierarchy means that patients typically are very involved in selecting the therapy that works best for them. Physicians should be aware that patients researching EoE on their own might not find the information they need. A recent study found that the artificial intelligence tool ChatGPT was highly inaccurate when it came to providing answers about EoE.

The treatment strategies that clinicians and their patients can choose from revolve around the “3 Ds”: diet, drugs, and dilation. 

Diet:

Three dietary interventions are available for EoE treatment: 

  • Elemental diet, in which patients consume only an amino acid-based formula containing no intact proteins
  • Empiric elimination diet, which removes foods commonly associated with food allergy, regardless of allergy test results
  • Allergy testing-directed food elimination, which involves avoiding all foods for which specific antibodies were detected or that tested positive on skin-prick tests

Each of these dietary interventions has clear advantages and drawbacks that should be discussed with patients. Elemental diets achieve robust histologic responses, yet their highly restrictive nature makes compliance difficult and can greatly impair patients’ quality of life. 

Empiric elimination diets are the most popular choice and have shown high response rates. A common approach is to begin by removing six common foods (milk, wheat, egg, soy, nuts, and fish/seafood), which are then gradually reintroduced to identify the culprits. However, patients must be motivated to follow this process, and its likelihood of success is greatly enhanced by assistance from a dietitian, which may not always be available.

Lastly, allergy testing-guided food elimination diets have been reported to produce remission rates of just under 50%, and the skin allergy tests on which they primarily rely have been criticized as unreliable. 

Drugs:

The treatment of EoE experienced a significant advance in 2022 when dupilumab, a monoclonal antibody that binds to the interleukin (IL)–4 receptor alpha, became the first drug approved by the US Food and Drug Administration (FDA) for treating EoE in adults and pediatric patients aged 12 years or older. The drug was approved by the European Commission in 2023. In late January 2024, the FDA expanded dupilumab’s approval to children aged 1-11 years and weighing ≥ 15 kg after positive histologic remission and safety results were reported in the two-part phase 3 EoE KIDS trial. 

In addition, the FDA approved budesonide oral suspension, the first oral treatment for EoE, in February 2024. 

These approvals have expanded treatment options beyond proton pump inhibitors (PPIs) and topical glucocorticosteroids, both of which carry only conditional recommendations for use in US and UK clinical guidelines. 

A recent meta-analysis found that PPIs, off-label and EoE-specific topical steroids, and biologics had greater efficacy than did placebo in achieving histological remission. However, significant heterogeneity in the included studies’ eligibility criteria and outcome measures prevented development of a “solid therapeutic hierarchy,” the authors noted. 

In addition, researchers are investigating therapies targeting IL-5 (eg, mepolizumab, reslizumab, and benralizumab) and other key inflammatory mediators in EoE, such as Siglec-8 (lirentelimab), IL-13 (cendakimab), and the sphingosine 1–phosphate receptor (etrasimod).

Dilation:

Finally, patients with significant strictures can benefit from dilation performed with through-the-scope balloons or Savary-Gilliard bougies, which can significantly and immediately improve symptoms even though dilation cannot address the underlying inflammation. Concerns that dilation would lead to increased complications, such as perforation and mucosal tears, do not appear to be borne out by recent data.

5. Reducing diagnosis delays is crucial for limiting EoE-associated morbidity.

Despite efforts to bring attention to EoE, evidence suggests that delays between symptom onset and diagnosis are common and result in delayed treatment. One study found a median lag time of 6 years. 

The longer the delay in treatment, the more likely patients are to develop esophageal rings, long-segment narrowing of the esophageal caliber, or focal strictures. For example, diagnostic delays of more than 20 years are associated with a stricture prevalence of 70.8%, compared with 17.2% for delays of 0-2 years. 

Simply put, the sooner one can identify EoE and begin treatment, the more likely patients are to be spared its worst effects. 

A version of this article appeared on Medscape.com.
