AGA Clinical Guideline Stresses Patient Preferences in Barrett’s Treatment


The American Gastroenterological Association (AGA) has released updated evidence-based recommendations on the endoscopic eradication therapy (EET) of Barrett’s esophagus (BE) and related neoplasms.

Published in Gastroenterology, the clinical practice guideline makes five main recommendations — one strong and four conditional — based on very low to moderate certainty of evidence. It also stresses that providers should practice shared decision making according to patient preferences and risk perception.


For the most part, the new guideline is not a significant departure from the way expert endoscopists are currently practicing EET for BE and related neoplasia, gastroenterologist Joel H. Rubenstein, MD, MSc, AGAF, of the Barrett’s Esophagus Program in the Division of Gastroenterology at University of Michigan Medical School at Ann Arbor, said in an interview. One of three first authors of the guideline, Dr. Rubenstein added, “There is, however, considerable variability in how endoscopists practice, and we hope this guidance will serve as a useful resource to refer to for best practices.”

Added gastroenterologist Tarek Sawas, MD, MPH, assistant professor of internal medicine at UT Southwestern Medical Center in Dallas, “We hope the update will provide some clarity for practice and for implementation, while allowing gastroenterologists the freedom to decide what is best for patients based on lesion characteristics.”



Dr. Sawas added that one of the differences in the new guideline relates to the approach to low-grade dysplasia. While earlier guidance favored treatment over surveillance, patient preferences should now be factored into management. “Some patients are risk-averse and prefer to wait and watch, while others place more value on treatment and just want to get on with it,” he said.

When this guideline was circulated for public comment, the areas prompting the most feedback were “our current suggestions against the routine use of EET in non-dysplastic BE and for the use of either endoscopic mucosal resection [EMR] or endoscopic submucosal dissection [ESD] for resection — with the expectation that the vast majority may be managed with EMR,” Dr. Rubenstein said.

“We felt that ESD would work best for larger lesions,” explained Dr. Sawas. “There aren’t a lot of data in this area, just some observational studies, but we should have more data for comparison in the next few years.”

The incidence of esophageal adenocarcinoma continues to rise, and an update was deemed in order: the AGA’s last formal guidance on this subject using the systematic GRADE (Grading of Recommendations Assessment, Development, and Evaluation) methodology was issued in 2011. “In the following time span, there’s been a lot of research, particularly with regard to management of low-grade dysplasia and endoscopic resection techniques,” Dr. Rubenstein said.
 

Key Recommendations

The 14 guideline panelists made the following suggestions for treatment and implementation based on different levels of certainty of evidence (CoE):

1. If high-grade dysplasia (HGD) is present, EET is recommended over surveillance, with subsequent surveillance performed at 3, 6, and 12 months, and annually thereafter. (Strong recommendation, moderate CoE).

Surveillance endoscopies should obtain targeted tissue samples of visible lesions and random biopsies of the cardia and distal 2 cm of the tubular esophagus.

2. In patients with low-grade dysplasia, EET is also preferred to surveillance. But for those placing a higher value on the certain harms and a lower value on the uncertain benefits of EET for reducing mortality, surveillance endoscopy is a reasonable option. (Conditional recommendation, low CoE).

Following EET, clinicians should perform surveillance at years 1 and 3 after complete eradication of intestinal metaplasia, then revert to the surveillance intervals used in non-dysplastic BE.

3. For non-dysplastic BE, the AGA advises against the routine use of EET. (Conditional recommendation, low CoE).

4. Patients undergoing EET should have resection of visible lesions followed by ablation of the remaining BE segment rather than resection of the entire segment.

In patients with only a small area of BE beyond the visible lesion, endoscopic resection is acceptable and may be preferred over repeated ablation. Radiofrequency ablation is the preferred ablative modality. (Conditional recommendation, very low CoE).

5. For treating visible neoplastic lesions, the AGA suggests either EMR or ESD based on lesion characteristics. (Conditional recommendation, very low CoE).

Patients with suspected T1 esophageal adenocarcinoma (EAC) should be considered for EET. Endoscopic resection is recommended over endoscopic ultrasound for distinguishing EAC from HGD and for staging depth of invasion.

The vast majority of neoplastic lesions may be managed with EMR rather than ESD. Patients who have bulky lesions, or lesions highly suspicious for at least T1b invasion, and who are deemed candidates for endoscopic resection might benefit from ESD over EMR. Those with previously failed EMR might also benefit from ESD.

As to the generally low quality of the supporting evidence, Dr. Rubenstein said, “Unfortunately, very few decisions we make in medicine are supported by high certainty of evidence, but we still have to make a decision.” He pointed out that the guideline highlights areas for future research that could help strengthen or change the guideline’s recommendations.

Considering benefits and harms, the panelists concluded that overall CoE across critical desirable outcomes of disease progression to EAC was moderate. Patient-important outcomes informing the harms were strictures, major bleeding, perforation, and serious adverse events.
 

Lifestyle

The guidance also urges providers to counsel BE patients on tobacco cessation and weight loss if needed, and notes the specter of cancer may incentivize patients to make lifestyle changes.

The most common causes of death in EET patients are cardiovascular disease and other cancers, for which tobacco use and obesity are also major risk factors, and tobacco is associated with strictures, the panelists wrote. “The prospect of progression to cancer in patients with dysplastic BE often holds greater valence than prior counseling attempts, and patients may re-commit to such efforts following consultation for EET.”
 

Going Forward

Areas for future attention include:

  • Identifying populations with non-dysplastic BE whose risk warrants EET
  • Balancing risk and benefit of EET in low-grade dysplasia
  • Randomized controlled trials comparing EMR and ESD in higher-risk lesions
  • Optimal management of post-EET pain
  • Stricture prevention and control
  • Managing resistant/recurrent disease beyond reflux control
  • Optimal surveillance and biopsy strategies following EET

This guideline was supported by the National Institutes of Health, the Department of Defense, the Veterans Administration Health Services and Research Division, and the Katy O. and Paul M. Rady Endowed Chair in Esophageal Cancer Research at the University of Colorado.

Dr. Sawas had no competing interests to disclose. Dr. Rubenstein reported research funding from Lucid Diagnostics.

Several other panelists reported research funding or consultation fees from various pharmaceutical and biotechnology companies.


High-Alcohol Intake in MASLD Increases Risk of Cirrhosis


One in nine patients with steatotic liver disease (more than 11%) reported concurrent high-risk alcohol use, a national study of more than a million US veterans found.

Moreover, the combination of steatotic liver disease and high-risk alcohol intake carried a more than 43% higher long-term risk of liver cirrhosis compared with no alcohol use, according to researchers led by Robert J. Wong, MD, MS, of the Division of Gastroenterology and Hepatology, Veterans Affairs Healthcare System Palo Alto, at Stanford University School of Medicine in Palo Alto, California.

However, the study found that “reducing alcohol use lowers risk of cirrhosis, emphasizing the importance of timely alcohol use assessment and early interventions to address high-risk alcohol use in steatotic liver disease,” Dr. Wong and associates wrote in Gastroenterology.

Although concurrent moderate to heavy alcohol intake would be expected to lead more rapidly to liver disease progression, the existing literature has been conflicting, the authors noted. Several studies have even found moderate alcohol use associated with a lower risk of advanced liver disease among MASLD patients, including that by Dunn et al.
 

The Study

MASLD patients were identified through the US Veterans Affairs Corporate Data Warehouse from January 1, 2010, through December 31, 2017, with follow-up through December 31, 2022.

Alcohol use was assessed by Alcohol Use Disorders Identification Test–Concise (AUDIT-C) scores and was categorized as follows: no alcohol use (AUDIT-C = 0), low-risk use (AUDIT-C 1-2 for women and 1-3 for men), and high-risk use (AUDIT-C ≥ 3 for women and ≥ 4 for men).
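In code form, those cut-offs translate to roughly the following sketch; the function name and example calls are illustrative, and only the numeric thresholds come from the study.

```python
def audit_c_category(score: int, sex: str) -> str:
    """Categorize an AUDIT-C score using the cut-offs described in the study.
    Illustrative only; the thresholds are from the article, the rest is not.
    """
    if score == 0:
        return "no alcohol use"
    high_risk_cutoff = 3 if sex == "female" else 4  # >= 3 for women, >= 4 for men
    if score >= high_risk_cutoff:
        return "high-risk use"
    return "low-risk use"


# A score of 3 is low-risk for a man but high-risk for a woman.
print(audit_c_category(3, "male"))    # low-risk use
print(audit_c_category(3, "female"))  # high-risk use
```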

Among the 1,156,189 veterans with MASLD, 54.2% reported no alcohol use, 34.6% reported low-risk use, and 11.2% reported high-risk use. Over a median follow-up of 9-10 years, incidence rates of cirrhosis were 0.53 per 100 person-years for no use, 0.42 for low-risk use, and 0.76 for high-risk use.

In contrast to patients with baseline high-risk alcohol intake who reported no change in use, those who decreased their alcohol intake during follow-up experienced a 39% reduction in the long-term risk of cirrhosis, for a hazard ratio of 0.61 (95% CI, 0.45-0.83; P < .01).



About 70% of patients were non-Hispanic Whites and more than 90% were male in all consumption categories. The no-alcohol group was older than the high-risk alcohol group: 64 years vs 59.9 years (P < .0001). Compared with the high-risk alcohol group, the no-alcohol group had a significantly greater proportion of comorbid diabetes (62.3% vs 42.5%), hypertension (77.9% vs 69.1%), or cardiovascular disease (40.2% vs 25.9%, P < .0001 for all comparisons).

Notably, fewer than 5% of patients with high-risk use received behavioral or pharmacologic therapy, and of those who did, most were referred for or received treatment at or near the time of cirrhosis diagnosis. “This highlights a major gap in linking patients with high-risk alcohol use to appropriate behavioral or pharmacologic therapy in a timely manner and may reflect missed opportunities to prevent further alcohol-related morbidity and mortality,” Dr. Wong and colleagues wrote.

They called for studies of novel interventions for timely assessment of alcohol use with linkage to addiction services. They cited the need to understand the interaction between levels of alcohol use and underlying MASLD, adding, “More research is also needed to understand whether this interaction varies across different populations.”

This study received no specific funding. Dr. Wong reported funding through his institution from Gilead Sciences, Exact Sciences, and Thera Technologies.

Critical Need for Early Tx of High-Risk Alcohol Use

Recent consensus in defining metabolic dysfunction-associated steatotic liver disease (MASLD) has raised awareness of the combined impact of cardiometabolic risk factors and alcohol consumption on liver disease progression. This study by Wong et al. highlights the undeniable influence of high-risk alcohol use on the development of advanced liver disease.

In a national cohort of over 1 million US veterans with steatotic liver disease (SLD), patients with high-risk alcohol use based on AUDIT-C assessment exhibited > 43% greater risk of cirrhosis compared to those with no alcohol use. The relationship between alcohol and liver severity in SLD was observed even after excluding patients meeting classification for predominant alcohol-associated liver disease. While increased alcohol use was associated with increased incidence of cirrhosis, decreased alcohol use led to a notable 39% reduction in cirrhosis risk over time.

Reducing alcohol consumption remains a best-practice recommendation for mitigating the risk of progression in steatotic liver disease. However, the results of this study emphasize the critical need for early identification and treatment of high-risk alcohol use in all patients with SLD. While a universal recommendation for alcohol abstinence provides pragmatic implementation, there is a significant need to better understand the interaction of specific metabolic risk factors and patterns of alcohol use across the spectrum of MetALD to guide personalized recommendations for patient education and management.



Further research using prospective clinical trial design is needed to evaluate the interplay of alcohol consumption and metabolic risk factors across variable age, sex, genetics, and environmental exposures that are increasingly being recognized as vital drivers of health and disease.

Tiffany Wu, MD, MS, is a fellow in Transplant Hepatology at Mayo Clinic in Rochester, Minnesota. She has no conflicts.


Genes May Govern Intestinal Sites of Pediatric Crohn’s


Genetic predisposition likely directs whether pediatric Crohn’s disease (CD) is colon-predominant (C-CD) or small-bowel-predominant (SB-CD), a small analysis in Cellular and Molecular Gastroenterology and Hepatology suggests.

Richard Kellermayer, MD, PhD, professor of pediatrics in the Section of Pediatric Gastroenterology, Hepatology and Nutrition at Baylor College of Medicine in Houston, Texas, and colleagues compared the genetic makeup of patients based on their Crohn’s disease location — predominantly in the small bowel (L4) or predominantly in the colon (L2 and/or L3). They then generated bipartite networks of susceptibility genes to study the polygenic background of the disease subtypes. They hypothesize that such networks may govern where a patient develops Crohn’s disease.


According to current understanding, as Dr. Kellermayer told GI & Hepatology News, most autoimmune disorders, CD included, develop in people with a genetic predisposition after serial environmental insults between conception and young adulthood. “As opposed to single-gene-associated genetic disorders, autoimmune diseases are linked to several hundred genes in which subtle anomalies can work in concert to predispose someone to a certain disorder,” he said. “We hope our findings will guide the development of personalized treatments based on the disease location at diagnosis to advance precision medicine.”
 

CD Cases

Eight cases of SB-CD and 11 of C-CD met the inclusion criteria. Mean age at CD diagnosis was about 11 years for both subtypes, while 36.3% of patients with C-CD were female vs 25% of those with SB-CD. Ethnicity was 72.2% White in the C-CD group and 87.5% in the SB-CD group.

As to the main ileocolonic locations according to the Paris Classification of pediatric inflammatory bowel disease, 54.5% in the C-CD group had involvement at L2 and 45.5% at L3. In SB-CD cases, 100% had disease at L4b, 37.5% at L4, and 50% at L1.

The researchers identified 115 single-nucleotide polymorphisms (SNPs), associated with 97 genes, with combined annotation-dependent depletion (CADD) scores greater than 10 on the PHRED (Phil’s Read Editor) scale. PHRED scoring was originally developed to grade the quality of nucleobase calls in automated DNA sequencing; CADD uses the same scale to rank the predicted deleteriousness of single-nucleotide variants. The identified genes showed significantly different allele variation between C-CD and SB-CD (P < .01).

Among the top 28 candidates was an SNP in the EFNA3 gene with a CADD score greater than 20 for differentiating between the two phenotypically distinct CD groups. This variant, EFNA3 rs17723260 (predicted to be deleterious), had a significantly lower allele frequency in C-CD (4.5%) than in SB-CD (37.5%; chi-square P = .0097).
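Those frequencies, together with the 11 C-CD and 8 SB-CD patients, imply variant-allele counts of roughly 1 of 22 alleles and 6 of 16 alleles, respectively. Under that inferred assumption (the counts themselves are not published in the article), a short sketch reproduces the reported chi-square result:

```python
from scipy.stats import chi2_contingency

# Variant vs reference allele counts inferred from the reported frequencies
# (4.5% of 2 x 11 alleles in C-CD; 37.5% of 2 x 8 alleles in SB-CD).
# These counts are an assumption, not data taken from the paper.
table = [
    [1, 21],  # C-CD: variant, reference
    [6, 10],  # SB-CD: variant, reference
]

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, P = {p:.4f}")  # about 6.70 and 0.0097
```

The uncorrected test matches the reported P value under these assumed counts; applying the default Yates continuity correction would give a larger P value.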

“This finding indicates that EFNA3 might play a role in modulating colonic inflammation, in which a deleterious genetic defect might provide protection against colitis (and direct autoimmunity against the proximal small bowel) in the polygenic background of CD,” the investigators wrote.

EFNA3 has been linked to both CD and ulcerative colitis. Another four genes associated with the top five SNP candidates had already been connected with IBD or mammalian intestinal inflammation. 

According to the authors, the biomedical literature and mouse model findings “implicate the translational relevance of our candidate gene compendium for directing colon- vs small-bowel–predominant CD development.” They hope the findings will be replicated in larger CD cohorts differentiated by disease location. “Our work may set the nidus for CD subtype–based precision medicine by guiding individualized treatment strategies,” they wrote.

This study was supported by the ProKIIDS Network of the Crohn’s and Colitis Foundation and the Public Health Service. It was also supported by the Wagner, Frugoni, and Klaasmeyer families’ Gutsy Kids Fund and by the DR and GL Laws Fund. The authors disclosed no conflicts of interest.


Composite Scale Better Gauges Mucosal Injury in Celiac Disease


A new two-measure metric seems to improve accuracy and statistical precision in assessing celiac disease (CeD) histology compared with either of its two components alone, according to a study in Clinical Gastroenterology and Hepatology.

The new morphometric duodenal biopsy mucosal scale, called VCIEL, combines the villous height-to-crypt depth ratio (Vh:Cd) and the intraepithelial lymphocyte (IEL) count — each a key histologic measure of small-intestinal injury in CeD — into a single composite.

The authors believe the VCIEL will enable a broader and more accurate measurement of mucosal health in CeD. It will be particularly useful for population analysis in clinical trials and could improve the powering of trial design. “Use of VCIEL may lead to better outcome measures for potential new therapeutic treatments benefiting patients,” wrote Jocelyn A. Silvester, MD, PhD, a pediatrician at Boston Children’s Hospital and an assistant professor at Harvard Medical School, and colleagues.



This chronic enteropathy affects about 1% of the world’s population and requires lifelong adherence to a gluten-free diet, the authors noted.

The authors pointed to weaknesses in the current quantitative and qualitative ways of measuring gluten-induced mucosal injury on biopsy for CeD. “Morphometry measures the injury continuum for architecture and inflammation, but these are used as separate outcomes,” they wrote. “The original Marsh-Oberhuber [M-O] classifications are rather contrived approaches to assess a biologic continuum, forcing the injury in categorical groups of unclear clinical relevance and where clinically significant changes may occur within one single category.”

Moreover, the quantitation of inflammation relies on binary assessment as normal or increased, which results in histology that is unscorable by M-O if villous atrophy persists without increased IELs, they added.
 

The Study

In the absence of a broadly accepted single measure of mucosal injury in CeD, the group assessed whether the composite metric could improve statistical precision for assessing histology.

Enter VCIEL, which combines the Vh:Cd and IEL for individual patients with equal weighting by converting each measure to a fraction of its standard deviation and summing the results.
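
Read literally, that description amounts to a standardized composite score. A minimal sketch on hypothetical values is shown below; the sign convention (subtracting the rescaled IEL count so that a higher VCIEL reflects healthier mucosa) and the function name are illustrative assumptions, not the authors’ published formula.

```python
import statistics

def vciel(vhcd_values, iel_values):
    """Equal-weighted composite: each measure is rescaled to a fraction of its
    standard deviation across the sample, then the two are combined.

    Subtracting the rescaled IEL term is an assumed sign convention so that
    higher VCIEL consistently indicates healthier mucosa (higher Vh:Cd, lower
    IEL); the published scale may handle this detail differently."""
    sd_vhcd = statistics.stdev(vhcd_values)
    sd_iel = statistics.stdev(iel_values)
    return [v / sd_vhcd - i / sd_iel for v, i in zip(vhcd_values, iel_values)]

# Hypothetical per-patient values (Vh:Cd ratio; IEL count per 100 enterocytes)
vhcd = [3.1, 2.4, 1.0, 0.6, 2.8]
iels = [18, 25, 62, 71, 22]
print([round(score, 2) for score in vciel(vhcd, iels)])
```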

The researchers applied the VCIEL formula in a reanalysis of four clinical gluten-challenge trials and compared the results for Vh:Cd and IEL separately with those for VCIEL for clinical significance (effect size) and statistical significance.

In a reanalysis of the ALV003-1021 trial, for example, the effect size and P value (analysis of covariance) for the change (delta) were 1.37 and .038 for Vh:Cd, 1.17 and .005 for IEL, and 1.86 and .004 for VCIEL.

For the similar gluten-challenge IMGX003-NCCIH-1721 trial, the corresponding delta results were .76 and .057 for Vh:Cd, .98 and .018 for IEL, and 1.14 and .007 for VCIEL. Comparable improvements with VCIEL over individual Vh:Cd and IEL were observed for other studies, including a nontherapeutic gluten challenge study.

In NCT03409796 trial data, VCIEL showed improved statistical significance relative to its component measures of Vh:Cd and IEL on within-group paired two-tailed t tests from baseline to day 15, particularly at the 10-g gluten challenge dose (P = .0050 for Vh:Cd, .0031 for IEL, and .0014 for VCIEL).

Little correlation emerged between baseline values and changes with intervention for Vh:Cd and IEL on an individual patient basis.

The greater accuracy and statistical precision of the VCIEL scale are presumably due to averaging over some of the measurement uncertainty in individual patient and timepoint Vh:Cd and IEL values and creating a composite of different histologic properties, the authors noted.

This study was funded by ImmunogenX, Inc. First author Jack A. Syage is a cofounder and shareholder in ImmunogenX Inc. Dr. Silvester has served on an advisory board for Takeda Pharmaceuticals and has received research funding from Biomedal S.L., Cour Pharmaceuticals, and Glutenostics LLC. Several coauthors disclosed various financial ties to multiple private-sector pharmaceutical and biomedical companies, including ImmunogenX.

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY


High-Quality Diet in Early Life May Ward Off Later IBD

Article Type
Changed
Tue, 05/07/2024 - 15:23

Children who ate a high-quality diet at 1 year of age were at a 25% reduced risk of developing inflammatory bowel disease (IBD) in later life, prospective pooled data from two Scandinavian birth cohorts suggested.

It appears important to feed children a quality diet at a very young age, in particular one rich in vegetables and fish, since by age three, only dietary fish intake had any impact on IBD risk.

Ms. Annie Guo, PhD candidate in the Department of Pediatrics, University of Gothenburg, Sweden

Although high intakes of these two food categories in very early life correlated with lower IBD risk, exposure to sugar-sweetened beverages (SSBs) was associated with an increased risk. “While non-causal explanations for our results cannot be ruled out, these novel findings are consistent with the hypothesis that early-life diet, possibly mediated through changes in the gut microbiome, may affect the risk of developing IBD,” wrote lead author Annie Guo, a PhD candidate in the Department of Pediatrics, University of Gothenburg, Sweden, and colleagues. The report was published in Gut.

“This is a population-based study investigating the risk for IBD, rather than the specific effect of diet,” Ms. Guo said in an interview. “Therefore, the results are not enough on their own to be translated into individual advice that can be applicable in the clinic. However, the study supports current dietary guidelines for small children, that is, the intake of sugar should be limited and a higher intake of fish and vegetables is beneficial for overall health.”
 

Two-Cohort Study

The investigators prospectively recorded food-group information on children (just under half were female) from the All Babies in Southeast Sweden cohort and The Norwegian Mother, Father and Child Cohort Study, assessing diet quality with a Healthy Eating Index and intake frequency. Parents answered questions about their offspring’s diet at ages 12-18 months and 30-36 months. Quality of diet was measured by intake of meat, fish, fruit, vegetables, dairy, sweets, snacks, and drinks.

The Swedish cohort included 21,700 children born between October 1997 and October 1999, while the Norwegian analysis included 114,500 children, 95,200 mothers, and 75,200 fathers recruited from across Norway from 1999 to 2008. In 1,304,433 person-years of follow-up, the researchers tracked 81,280 participants from birth to childhood and adolescence, with median follow-ups in the two cohorts ranging from 1 year of age to 21.3 years (Sweden) and to 15.2 years of age (Norway). Of these children, 307 were diagnosed with IBD: Crohn’s disease (CD; n = 131); ulcerative colitis (UC; n = 97); and IBD unclassified (n = 79).

Adjusting for parental IBD history, sex, origin, education, and maternal comorbidities, the study found:

  • Compared with a low-quality diet, both medium- and high-quality diets at 1 year were associated with a roughly 25% reduced risk for IBD (pooled adjusted hazard ratio [aHR], 0.75 [95% CI, 0.58-0.98] and 0.75 [0.56-1.0], respectively); a sketch of how cohort-specific estimates can be pooled appears after this list.
  • The pooled aHR per increase in diet-quality category was 0.86 (95% CI, 0.74-0.99). The pooled aHR for IBD in 1-year-olds with high vs low fish intake was 0.70 (95% CI, 0.49-1.0), and high fish intake was also associated with a reduced risk for UC (pooled aHR, 0.46; 95% CI, 0.21-0.99). Higher vegetable intake at 1 year was likewise associated with a reduced risk for IBD (HR, 0.72; 95% CI, 0.55-0.95). It has been hypothesized that intake of vegetables and vegetable fibers may have programming effects on the immune system.
  • With 72% of children reportedly consuming SSBs at age 1, pooled aHRs showed that some vs no intake of SSBs was associated with an increased risk for later IBD (pooled aHR, 1.42; 95% CI, 1.05-1.90).
  • There were no obvious associations between overall IBD or CD/UC risk and meat, dairy, fruit, grains, potatoes, and foods high in sugar and/or fat. Diet at age 3 years was not associated with incident IBD (pooled aHR, 1.02; 95% CI, 0.76-1.37), suggesting that the risk impact of diet is greatest on very young and vulnerable microbiomes.
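
On the pooling mentioned in the list above: the article reports one pooled estimate per exposure, combining the Swedish and Norwegian cohorts, but does not print the pooling formula. The sketch below shows a generic fixed-effect, inverse-variance combination of two cohort-specific hazard ratios on the log scale; the input numbers are hypothetical and the authors’ exact method may differ.

```python
import math

def pool_hazard_ratios(hrs, cis, z=1.96):
    """Fixed-effect inverse-variance pooling of hazard ratios on the log scale.

    hrs: cohort-specific hazard ratios
    cis: matching (lower, upper) 95% confidence intervals, used to back out each
         standard error as (ln(upper) - ln(lower)) / (2 * z)."""
    log_hrs = [math.log(hr) for hr in hrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    weights = [1 / se ** 2 for se in ses]
    pooled_log = sum(w * lh for w, lh in zip(weights, log_hrs)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    lower, upper = (math.exp(pooled_log - z * pooled_se),
                    math.exp(pooled_log + z * pooled_se))
    return math.exp(pooled_log), (lower, upper)

# Hypothetical cohort-specific estimates (not the study's per-cohort numbers)
hr, ci = pool_hazard_ratios([0.78, 0.72], [(0.55, 1.10), (0.52, 1.00)])
print(f"pooled aHR {hr:.2f} (95% CI, {ci[0]:.2f}-{ci[1]:.2f})")
```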
 

 

Ms. Guo noted that a Swedish national survey among 4-year-olds found a mean SSB consumption of 187 g/d with a mean frequency of once daily. The most desired changes in food habits are a lower intake of soft drinks, sweets, crisps, cakes, and biscuits and an increase in the intake of fruits and vegetables. A similar Norwegian survey among 2-year-olds showed that SSBs were consumed by 36% of all children with a mean intake of 40 g/d.

The exact mechanism by which sugar affects the intestinal microbiota is not established. “However, what we do know is that an excessive intake of sugar can disrupt the balance of the gut microbiome,” Ms. Guo said. “And if the child has a high intake of foods high in sugar, that also increases the chances that the child’s overall diet has a lower intake of other foods that contribute to a diverse microbiome, such as fruits and vegetables.”

An ‘Elegant’ Study

In an accompanying editorial, gastroenterologist Ashwin N. Ananthakrishnan, MBBS, MPH, AGAF, of Mass General Brigham and the Mass General Research Institute, Boston, cautioned that accurately measuring food intake in very young children is difficult, and dietary questionnaires in this study did not address food additives and emulsifiers common in commercial baby food, which may play a role in the pathogenesis of IBD.

Dr. Ashwin N. Ananthakrishnan, associate professor of medicine at Massachusetts General Hospital in Boston

Another study limitation is that the dietary questionnaire used has not been qualitatively or quantitatively validated against other more conventional methods, said Dr. Ananthakrishnan, who was not involved in the research.

Nevertheless, he called the study “elegant” and said it expands the data on the importance of this period in IBD development. “Although in the present study there was no association between diet at 3 years and development of IBD (in contrast to the association observed for dietary intake at 1 year), other prospective cohorts of adult-onset IBD have demonstrated an inverse association between vegetable or fish intake and reduced risk for CD while sugar-sweetened beverages have been linked to a higher risk for IBD.”

As to the question of recommending early preventive diet for IBD, “thus far, data on the impact of diet very early in childhood, outside of breastfeeding, on the risk for IBD has been lacking,” Dr. Ananthakrishnan said in an interview. “This important study highlights that diet as early as 1 year can modify subsequent risk for IBD. This raises the intriguing possibility of whether early changes in diet could be used, particularly in those at higher risk, to reduce or even prevent future development of IBD. Of course, more works needs to be done to define modifiability of diet as a risk factor, but this is an important supportive data.”

In his editorial, Dr. Ananthakrishnan stated that despite the absence of gold-standard interventional data demonstrating a benefit of dietary interventions, “in my opinion, it may still be reasonable to suggest such interventions to motivate individuals who incorporate several of the dietary patterns associated with lower risk for IBD from this and other studies. This includes ensuring adequate dietary fiber, particularly from fruits and vegetables, intake of fish, minimizing sugar-sweetened beverages and preferring fresh over processed and ultra-processed foods and snacks.” According to the study authors, their novel findings support further research on the role of childhood diet in the prevention of IBD.

The All Babies in Southeast Sweden Study is supported by Barndiabetesfonden (Swedish Child Diabetes Foundation), the Swedish Council for Working Life and Social Research, the Swedish Research Council, the Medical Research Council of Southeast Sweden, the JDRF Wallenberg Foundation, ALF and LFoU grants from Region Östergötland and Linköping University, and the Joanna Cocozza Foundation.

The Norwegian Mother, Father and Child Cohort Study is supported by the Norwegian Ministry of Health and Care Services and the Ministry of Education and Research.

Ms. Guo received grants from the Swedish Society for Medical Research and the Henning and Johan Throne-Holst Foundation to conduct this study. Co-author Karl Mårild has received funding from the Swedish Society for Medical Research, the Swedish Research Council, and ALF, Sweden’s medical research and education co-ordinating body. The authors declared no competing interests. Dr. Ananthakrishnan is supported by the National Institutes of Health, the Leona M. and Harry B. Helmsley Charitable Trust, and the Chleck Family Foundation. He has served on the scientific advisory board for Geneoscopy.

FROM GUT


Late-Stage Incidence Rates Support CRC Screening From Age 45

Article Type
Changed
Mon, 04/29/2024 - 10:34

In the setting of conflicting national screening guidelines, the incidence of distant- and regional-stage colorectal adenocarcinoma (CRC) has been increasing in individuals aged 46-49 years, a cross-sectional study of stage-stratified CRC found.

It is well known that CRC is becoming more prevalent generally in the population younger than 50 years, but stage-related analyses have not been done.

Staging analysis in this age group is important, however, because an increasing burden of advanced-stage disease would provide further evidence for earlier screening initiation, wrote Eric M. Montminy, MD, a gastroenterologist at John H. Stroger Hospital of Cook County, Chicago, Illinois, and colleagues in JAMA Network Open.

Dr. Eric M. Montminy


The United States Preventive Services Task Force (USPSTF) has recommended that average-risk screening begin at 45 years of age, as do the American Gastroenterological Association and other GI societies, although the American College of Physicians last year published clinical guidance recommending 50 years as the age to start screening for CRC for patients with average risk.

“Patients aged 46-49 may become confused on which guideline to follow, similar to confusion occurring with prior breast cancer screening changes,” Dr. Montminy said in an interview. “We wanted to demonstrate incidence rates with stage stratification to help clarify the incidence trends in this age group. Stage stratification is a key because it provides insight into the relationship between time and cancer incidence, ie, is screening finding early cancer or not?”

A 2020 study in JAMA Network Open demonstrated a 46.1% increase in CRC incidence rates (IRs) in persons aged 49-50 years. This steep increase is consistent with the presence of a large preexisting and undetected case burden.

“Our results demonstrate that adults aged 46-49 years, who are between now-conflicting guidelines on whether to start screening at age 45 or 50 years, have an increasing burden of more advanced-stage CRC and thus may be at an increased risk if screening is not initiated at age 45 years,” Dr. Montminy’s group wrote.

Using incidence data per 100,000 population from the National Cancer Institute’s Surveillance, Epidemiology, and End Results registry, the investigators observed the following IRs for early-onset CRC in the age group of 46-49 years:

  • Distant adenocarcinoma IRs increased faster than those of other stages: annual percentage change (APC), 2.2 (95% CI, 1.8-2.6); a sketch of how APC is calculated appears after this list.
  • Regional IRs also increased significantly: APC, 1.3 (95% CI, 0.8-1.7).
  • Absolute regional IRs of CRC in the 46-49 age bracket are similar to total pancreatic cancer IRs for all ages and all stages combined (13.2 of 100,000) over similar years; adding distant-stage CRC IRs to the regional IRs yields a combined rate roughly double that of pancreatic cancer of all stages combined.
  • The only decrease was seen in localized IRs: APC, -0.6 (95% CI, -1 to -0.2).
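
On the APC figures in the list above: annual percentage change is conventionally estimated by regressing the natural log of the incidence rate on calendar year and transforming the slope, typically with joinpoint regression in registry analyses. The sketch below shows the basic calculation on made-up rates; it illustrates the metric and is not a reproduction of the study’s analysis.

```python
import math

def annual_percentage_change(years, rates):
    """Estimate APC by ordinary least squares of ln(rate) on year:
    APC (%) = 100 * (exp(slope) - 1)."""
    n = len(years)
    log_rates = [math.log(rate) for rate in rates]
    mean_year = sum(years) / n
    mean_log = sum(log_rates) / n
    slope = (sum((y - mean_year) * (lr - mean_log)
                 for y, lr in zip(years, log_rates))
             / sum((y - mean_year) ** 2 for y in years))
    return 100 * (math.exp(slope) - 1)

# Hypothetical incidence rates per 100,000 rising about 2% per year
years = list(range(2000, 2021))
rates = [6.0 * 1.02 ** (year - 2000) for year in years]
print(f"APC = {annual_percentage_change(years, rates):.1f}%")  # about 2.0%
```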

“My best advice for clinicians is to provide the facts from the data to patients so they can make an informed health decision,” Dr. Montminy said. “This includes taking an appropriate personal and family history and having the patient factor this aspect into their decision on when and how they want to perform colon cancer screening.”

His institution adheres to the USPSTF recommendation of initiation of CRC screening at age 45 years.
 

 

 

Findings From 2000 to 2020

During the 2000-2020 period, 26,887 CRCs were diagnosed in adults aged 46-49 years (54.5% in men).

As of 2020, the localized adenocarcinoma IR decreased to 7.7 of 100,000, but regional adenocarcinoma IR increased to 13.4 of 100,000 and distant adenocarcinoma IR increased to 9.0 of 100,000.

Regional adenocarcinoma IR remained the highest of all stages in 2000-2020. From 2014 to 2020, distant IRs became similar to localized IRs, except in 2017 when distant IRs were significantly higher than localized.
 

Why the CRC Uptick?

“It remains an enigma at this time as to why we’re seeing this shift,” Dr. Montminy said, noting that etiologies from the colonic microbiome to cellphones have been postulated. “To date, no theory has substantially provided causality. But whatever the source is, it is affecting Western countries in unison with data demonstrating a birth cohort effect as well,” he added. “We additionally know, based on the current epidemiologic data, that current screening practices are failing, and a unified discussion must occur in order to prevent young patients from developing advanced colon cancer.”

Dr. Joshua Meyer

Offering his perspective on the findings, Joshua Meyer, MD, vice chair of translational research in the Department of Radiation Oncology at Fox Chase Cancer Center in Philadelphia, said the findings reinforce the practice of offering screening to average-risk individuals starting at age 45 years, the threshold at his institution. “There are previously published data demonstrating an increase in advanced stage at the time of screening initiation, and these data support that,” said Dr. Meyer, who was not involved in the present analysis.

More research needs to be done, he continued, not just on optimal age but also on the effect of multiple other factors impacting risk. “These may include family history and genetic risk as well as the role of blood- and stool-based screening assays in an integrated strategy to screen for colorectal cancer.”

There are multiple screening tests, and while colonoscopy, the gold standard, is very safe, it is not completely without risks, Dr. Meyer added. “And the question of the appropriate allocation of limited societal resources continues to be discussed on a broader level and largely explains the difference between the two guidelines.”

This study received no specific funding. Co-author Jordan J. Karlitz, MD, reported personal fees from GRAIL (senior medical director) and an equity position in Gastro Girl/GI On Demand outside of the submitted work. Dr. Meyer disclosed no conflicts of interest relevant to his comments.

Publications
Topics
Sections

In the setting of conflicting national screening guidelines, the incidence of distant- and regional-stage colorectal adenocarcinoma (CRC) has been increasing in individuals aged 46-49 years, a cross-sectional study of stage-stratified CRC found.

It is well known that CRC is becoming more prevalent generally in the under 50-year population, but stage-related analyses have not been done.

Staging analysis in this age group is important, however, as an increasing burden of advance-staged disease would provide further evidence for earlier screening initiation, wrote Eric M. Montminy, MD, a gastroenterologist at John H. Stroger Hospital of County Cook, Chicago, Illinois, and colleagues in JAMA Network Open.

Dr. Eric M. Montminy


The United States Preventive Services Task Force (USPSTF) has recommended that average-risk screening begin at 45 years of age, as do the American Gastroenterological Association and other GI societies, although the American College of Physicians last year published clinical guidance recommending 50 years as the age to start screening for CRC for patients with average risk.

“Patients aged 46-49 may become confused on which guideline to follow, similar to confusion occurring with prior breast cancer screening changes,” Dr. Montminy said in an interview. “We wanted to demonstrate incidence rates with stage stratification to help clarify the incidence trends in this age group. Stage stratification is a key because it provides insight into the relationship between time and cancer incidence, ie, is screening finding early cancer or not?”

A 2020 study in JAMA Network Open demonstrated a 46.1% increase in CRC incidence rates (IRs) in persons aged 49-50 years. This steep increase is consistent with the presence of a large preexisting and undetected case burden.

“Our results demonstrate that adults aged 46-49 years, who are between now-conflicting guidelines on whether to start screening at age 45 or 50 years, have an increasing burden of more advanced-stage CRC and thus may be at an increased risk if screening is not initiated at age 45 years,” Dr. Montminy’s group wrote.

Using incidence data per 100,000 population from the National Cancer Institute’s Surveillance, Epidemiology, and End Results registry, the investigators observed the following IRs for early-onset CRC in the age group of 46-49 years:

  • Distant adenocarcinoma IRs increased faster than other stages: annual percentage change (APC), 2.2 (95% CI, 1.8-2.6).
  • Regional IRs also significantly increased: APC, 1.3 (95% CI, 0.8-1.7).
  • Absolute regional IRs of CRC in the age bracket of 46-49 years are similar to total pancreatic cancer IRs in all ages and all stages combined (13.2 of 100,000) over similar years. When distant IRs for CRC are included with regional IRs, those for IRs for CRC are double those for pancreatic cancer of all stages combined.
  • The only decrease was seen in localized IRs: APC, -0.6 (95% CI, -1 to -0.2).

“My best advice for clinicians is to provide the facts from the data to patients so they can make an informed health decision,” Dr. Montminy said. “This includes taking an appropriate personal and family history and having the patient factor this aspect into their decision on when and how they want to perform colon cancer screening.”

His institution adheres to the USPSTF recommendation of initiation of CRC screening at age 45 years.
 

 

 

Findings From 2000 to 2020

During 2000-2020 period, 26,887 CRCs were diagnosed in adults aged 46-49 years (54.5% in men).

As of 2020, the localized adenocarcinoma IR decreased to 7.7 of 100,000, but regional adenocarcinoma IR increased to 13.4 of 100,000 and distant adenocarcinoma IR increased to 9.0 of 100,000.

Regional adenocarcinoma IR remained the highest of all stages in 2000-2020. From 2014 to 2020, distant IRs became similar to localized IRs, except in 2017 when distant IRs were significantly higher than localized.
 

Why the CRC Uptick?

“It remains an enigma at this time as to why we’re seeing this shift,” Dr. Montminy said, noting that etiologies from the colonic microbiome to cellphones have been postulated. “To date, no theory has substantially provided causality. But whatever the source is, it is affecting Western countries in unison with data demonstrating a birth cohort effect as well,” he added. “We additionally know, based on the current epidemiologic data, that current screening practices are failing, and a unified discussion must occur in order to prevent young patients from developing advanced colon cancer.”

Dr. Joshua Meyer

Offering his perspective on the findings, Joshua Meyer, MD, vice chair of translational research in the Department of Radiation Oncology at Fox Chase Cancer Center in Philadelphia, said the findings reinforce the practice of offering screening to average-risk individuals starting at age 45 years, the threshold at his institution. “There are previously published data demonstrating an increase in advanced stage at the time of screening initiation, and these data support that,” said Dr. Meyer, who was not involved in the present analysis.

More research needs to be done, he continued, not just on optimal age but also on the effect of multiple other factors impacting risk. “These may include family history and genetic risk as well as the role of blood- and stool-based screening assays in an integrated strategy to screen for colorectal cancer.”

There are multiple screening tests, and while colonoscopy, the gold standard, is very safe, it is not completely without risks, Dr. Meyer added. “And the question of the appropriate allocation of limited societal resources continues to be discussed on a broader level and largely explains the difference between the two guidelines.”

This study received no specific funding. Co-author Jordan J. Karlitz, MD, reported personal fees from GRAIL (senior medical director) and an equity position from Gastro Girl/GI On Demand outside f the submitted work. Dr. Meyer disclosed no conflicts of interest relevant to his comments.

In the setting of conflicting national screening guidelines, the incidence of distant- and regional-stage colorectal adenocarcinoma (CRC) has been increasing in individuals aged 46-49 years, a cross-sectional study of stage-stratified CRC found.

It is well known that CRC is becoming more prevalent generally in the under 50-year population, but stage-related analyses have not been done.

Staging analysis in this age group is important, however, as an increasing burden of advance-staged disease would provide further evidence for earlier screening initiation, wrote Eric M. Montminy, MD, a gastroenterologist at John H. Stroger Hospital of County Cook, Chicago, Illinois, and colleagues in JAMA Network Open.

Dr. Eric M. Montminy


The United States Preventive Services Task Force (USPSTF) has recommended that average-risk screening begin at 45 years of age, as do the American Gastroenterological Association and other GI societies, although the American College of Physicians last year published clinical guidance recommending 50 years as the age to start screening for CRC for patients with average risk.

“Patients aged 46-49 may become confused on which guideline to follow, similar to confusion occurring with prior breast cancer screening changes,” Dr. Montminy said in an interview. “We wanted to demonstrate incidence rates with stage stratification to help clarify the incidence trends in this age group. Stage stratification is a key because it provides insight into the relationship between time and cancer incidence, ie, is screening finding early cancer or not?”

A 2020 study in JAMA Network Open demonstrated a 46.1% increase in CRC incidence rates (IRs) in persons aged 49-50 years. This steep increase is consistent with the presence of a large preexisting and undetected case burden.

“Our results demonstrate that adults aged 46-49 years, who are between now-conflicting guidelines on whether to start screening at age 45 or 50 years, have an increasing burden of more advanced-stage CRC and thus may be at an increased risk if screening is not initiated at age 45 years,” Dr. Montminy’s group wrote.

Using incidence data per 100,000 population from the National Cancer Institute’s Surveillance, Epidemiology, and End Results registry, the investigators observed the following IRs for early-onset CRC in the age group of 46-49 years:

  • Distant adenocarcinoma IRs increased faster than other stages: annual percentage change (APC), 2.2 (95% CI, 1.8-2.6).
  • Regional IRs also significantly increased: APC, 1.3 (95% CI, 0.8-1.7).
  • Absolute regional IRs of CRC in the age bracket of 46-49 years are similar to total pancreatic cancer IRs in all ages and all stages combined (13.2 of 100,000) over similar years. When distant IRs for CRC are included with regional IRs, those for IRs for CRC are double those for pancreatic cancer of all stages combined.
  • The only decrease was seen in localized IRs: APC, -0.6 (95% CI, -1 to -0.2).

“My best advice for clinicians is to provide the facts from the data to patients so they can make an informed health decision,” Dr. Montminy said. “This includes taking an appropriate personal and family history and having the patient factor this aspect into their decision on when and how they want to perform colon cancer screening.”

His institution adheres to the USPSTF recommendation of initiation of CRC screening at age 45 years.
 

 

 

Findings From 2000 to 2020

During 2000-2020 period, 26,887 CRCs were diagnosed in adults aged 46-49 years (54.5% in men).

As of 2020, the localized adenocarcinoma IR decreased to 7.7 of 100,000, but regional adenocarcinoma IR increased to 13.4 of 100,000 and distant adenocarcinoma IR increased to 9.0 of 100,000.

Regional adenocarcinoma IR remained the highest of all stages in 2000-2020. From 2014 to 2020, distant IRs became similar to localized IRs, except in 2017 when distant IRs were significantly higher than localized.
 

Why the CRC Uptick?

“It remains an enigma at this time as to why we’re seeing this shift,” Dr. Montminy said, noting that etiologies from the colonic microbiome to cellphones have been postulated. “To date, no theory has substantially provided causality. But whatever the source is, it is affecting Western countries in unison with data demonstrating a birth cohort effect as well,” he added. “We additionally know, based on the current epidemiologic data, that current screening practices are failing, and a unified discussion must occur in order to prevent young patients from developing advanced colon cancer.”

Dr. Joshua Meyer

Offering his perspective on the findings, Joshua Meyer, MD, vice chair of translational research in the Department of Radiation Oncology at Fox Chase Cancer Center in Philadelphia, said the findings reinforce the practice of offering screening to average-risk individuals starting at age 45 years, the threshold at his institution. “There are previously published data demonstrating an increase in advanced stage at the time of screening initiation, and these data support that,” said Dr. Meyer, who was not involved in the present analysis.

More research needs to be done, he continued, not just on optimal age but also on the effect of multiple other factors impacting risk. “These may include family history and genetic risk as well as the role of blood- and stool-based screening assays in an integrated strategy to screen for colorectal cancer.”

There are multiple screening tests, and while colonoscopy, the gold standard, is very safe, it is not completely without risks, Dr. Meyer added. “And the question of the appropriate allocation of limited societal resources continues to be discussed on a broader level and largely explains the difference between the two guidelines.”

This study received no specific funding. Co-author Jordan J. Karlitz, MD, reported personal fees from GRAIL (senior medical director) and an equity position in Gastro Girl/GI On Demand outside of the submitted work. Dr. Meyer disclosed no conflicts of interest relevant to his comments.

FROM JAMA NETWORK OPEN

Heart Failure the Most Common Complication of Atrial Fibrillation, Not Stroke

Article Type
Changed
Tue, 04/23/2024 - 15:20

 

FROM BMJ

The lifetime risk of atrial fibrillation (AF) increased from 2000 to 2022 from one in four to one in three, a Danish population-based study of temporal trends found.

Heart failure was the most frequent complication linked to this arrhythmia, with a lifetime risk of two in five, twice that of stroke, according to investigators led by Nicklas Vinter, MD, PhD, a postdoctoral researcher at the Danish Center for Health Service Research in the Department of Clinical Medicine at Aalborg University, Denmark.

Published in BMJ, the study found the lifetime risks of post-AF stroke, ischemic stroke, and myocardial infarction improved only modestly over time and remained high, with virtually no improvement in the lifetime risk of heart failure.

Nicklas Vinter, MD, PhD, a postdoctoral researcher at the Danish Center for Health Service Research in the Department of Clinical Medicine at Aalborg University, Denmark
Agata Lenczewska-Madsen, Regional Hospital Central Jutland
Dr. Nicklas Vinter


“Our work provides novel lifetime risk estimates that are instrumental in facilitating effective risk communication between patients and their physicians,” Dr. Vinter said in an interview. “The knowledge of risks from a lifelong perspective may serve as a motivator for patients to commence or intensify preventive efforts.” AF patients could, for example, adopt healthier lifestyles or adhere to prescribed medications, Dr. Vinter explained.

“The substantial lifetime risk of heart failure following atrial fibrillation necessitates heightened attention to its prevention and early detection,” Dr. Vinter said. “Furthermore, the high lifetime risk of stroke remains a critical complication, which highlights the importance of continuous attention to the initiation and maintenance of oral anticoagulation therapy.”
 

The Study

The cohort consisted of 3.5 million individuals (51.7% women) who were free of AF at age 45 years or older. These individuals were followed until incident AF, migration, death, or end of follow-up, whichever came first.

All 362,721 individuals with incident AF (53.6% men) but no prevalent complication were further followed over two time periods (2000-2010 and 2011-2022) until incident heart failure, stroke, or myocardial infarction.

Among the findings:

  • Lifetime AF risk increased from 24.2% in 2000-2010 to 30.9% in 2011-2022, for a difference of 6.7% (95% confidence interval [CI], 6.5%-6.8%).
  • Lifetime AF risk rose across all subgroups over time, with a larger increase in men and individuals with heart failure, myocardial infarction, stroke, diabetes, and chronic kidney disease.
  • Lifetime risk of heart failure was 42.9% in 2000-2010 and 42.1% in 2011-2022, for a difference of −0.8% (95% CI, −3.8% to 2.2%).
  • The lifetime risks of post-AF stroke and of myocardial infarction decreased slightly between the two periods, from 22.4% to 19.9% for stroke (difference −2.5%, 95% CI, −4.2% to −0.7%) and from 13.7% to 9.8% for myocardial infarction (−3.9%, 95% CI, −5.3% to −2.4%). No differential decrease between men and women emerged.
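
For readers who want to see how a between-period difference and its confidence interval are formed, the brief sketch below applies a simple normal-approximation calculation to the reported lifetime AF risks. It is illustrative only: the study used competing-risk estimates with its own denominators, so the group sizes here are placeholders and the resulting interval will not match the published one.

```python
# Illustrative sketch only: risk difference with a Wald-style 95% CI.
# The study itself used competing-risk (cumulative incidence) estimates, and the
# group sizes below are placeholders, not the Danish cohort's actual denominators.
import math

def risk_difference(p_early, n_early, p_late, n_late):
    """Risk in the later period minus risk in the earlier period, with a 95% CI."""
    diff = p_late - p_early
    se = math.sqrt(p_early * (1 - p_early) / n_early + p_late * (1 - p_late) / n_late)
    return diff, diff - 1.96 * se, diff + 1.96 * se

# Reported lifetime AF risks: 24.2% (2000-2010) vs 30.9% (2011-2022)
diff, lo, hi = risk_difference(0.242, 150_000, 0.309, 150_000)
print(f"Difference: {diff:.1%} (95% CI, {lo:.1%} to {hi:.1%})")
```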

“Our novel quantification of the long-term downstream consequences of atrial fibrillation highlights the critical need for treatments to further decrease stroke risk as well as for heart failure prevention strategies among patients with atrial fibrillation,” the Danish researchers wrote.

Offering an outsider’s perspective, John P. Higgins, MD, MBA, MPhil, a sports cardiologist at McGovern Medical School at The University of Texas Health Science Center at Houston, said, “Think of atrial fibrillation as a barometer of underlying stress on the heart. When blood pressure is high, or a patient has underlying asymptomatic coronary artery disease or heart failure, they are more likely to have episodes of atrial fibrillation.”

Dr. John P. Higgins, a sports cardiologist at McGovern Medical School at The University of Texas Health Science Center at Houston,
University of Texas Health Science Center at Houston
Dr. John P. Higgins


According to Dr. Higgins, risk factors for AF are underappreciated in the United States and elsewhere, and primary care doctors need to be aware of them. “We should try to identify these risk factors and do primary prevention to improve risk factors to reduce the progression to heart failure and myocardial infarction and stroke. But lifelong prevention is even better,” he added. “Doing things to prevent actually getting risk factors in the first place. So a healthy lifestyle including exercise, diet, hydration, sleep, relaxation, social contact, and a little sunlight might be the long-term keys and starting them at a young age, too.”

In an accompanying editorial, Jianhua Wu, PhD, a professor of biostatistics and health data science with the Wolfson Institute of Population Health at Queen Mary University of London, and a colleague, cited the study’s robust observational research and called the analysis noteworthy for its quantification of the long-term risks of post-AF sequelae. They cautioned, however, that its grouping of the data into two periods (2000-2010 and 2011-2022) came at the cost of losing temporal resolution. They also called out the lack of reporting on the ethnic composition of the study population, a factor that influences lifetime AF risk, and the absence of subgroup analysis by socioeconomic status, which affects incidence and outcomes.

Dr. Jianhua Wu, professor of biostatistics and health data science with the Wolfson Institute of Population Health at Queen Mary University of London, UK
Dr. Wu
Dr. Jianhua Wu


The editorialists noted that while interventions to prevent stroke dominated AF research and guidelines during the study time period, no evidence suggests these interventions can prevent incident heart failure. “Alignment of both randomised clinical trials and guidelines to better reflect the needs of the real-world population with atrial fibrillation is necessary because further improvements to patient prognosis are likely to require a broader perspective on atrial fibrillation management beyond prevention of stroke,” they wrote.

In the meantime, this study “challenges research priorities and guideline design, and raises critical questions for the research and clinical communities about how the growing burden of atrial fibrillation can be stopped,” they wrote.

This work was supported by the Danish Cardiovascular Academy, which is funded by the Novo Nordisk Foundation, and The Danish Heart Foundation. Dr. Vinter has been an advisory board member and consultant for AstraZeneca and has an institutional research grant from BMS/Pfizer unrelated to the current study. He reported personal consulting fees from BMS and Pfizer. Other coauthors disclosed research support from and/or consulting work for private industry, as well as grants from not-for-profit research-funding organizations. Dr. Higgins had no competing interest to declare. The editorial writers had no relevant financial interests to declare. Dr. Wu is supported by Barts Charity.


Antibiotics of Little Benefit in Lower Respiratory Tract Infection

Article Type
Changed
Mon, 04/15/2024 - 17:23

 

Antibiotics had no measurable effect on the severity or duration of coughs due to acute lower respiratory tract infection (LRTI, or acute bronchitis), a large prospective study found.

In fact, those receiving an antibiotic in the primary- and urgent-care setting had a small but significant increase in overall length of illness (17.5 vs 15.9 days; P = .05) — largely because patients with longer illness before the index visit were more likely to receive these drugs. The study adds further support for reducing the prescription of antibiotics for LRTIs.

“Importantly, the pathogen data demonstrated that the length of time until illness resolution for those with bacterial infection was the same as for those not receiving an antibiotic versus those receiving one (17.3 vs 17.4 days),” researchers led by Daniel J. Merenstein, MD, a professor and director of research programs, family medicine, at Georgetown University Medical Center in Washington, wrote in the Journal of General Internal Medicine (doi: 10.1007/s11606-024-08758-y).

Dr. Daniel J. Merenstein is professor and director of research programs, Family Medicine, at Georgetown University Medical Center in Washington, DC
Dr. Merenstein
Dr. Daniel J. Merenstein


Patients believed an antibiotic would shorten their illness by an average of about 4 days, from 13.4 days to 9.7 days, whereas the average duration of all coughs was more than 2 weeks regardless of pathogen type or receipt of an antibiotic.

“Patients had unrealistic expectations regarding the duration of LRTI and the effect of antibiotics, which should be the target of antibiotic stewardship efforts,” the group wrote.

LRTIs can, however, be dangerous, with 3%-5% progressing to pneumonia, “but not everyone has easy access at an initial visit to an x-ray, which may be the reason clinicians still give antibiotics without any other evidence of a bacterial infection,” Dr. Merenstein said in a news release. “Patients have come to expect antibiotics for a cough, even if it doesn’t help. Basic symptom-relieving medications plus time bring a resolution to most people’s infections.”

The authors noted that cough is the most common reason for an ambulatory care visit, accounting for 2.7 million outpatient visits and more than 4 million emergency department visits annually.
 

Risks

Overuse of antibiotics can result in dizziness, nausea, diarrhea, and rash, along with a roughly 4% chance of serious adverse effects including anaphylaxis; Stevens-Johnson syndrome, a serious skin and mucous membrane disorder; and Clostridioides difficile-associated diarrhea.

An estimated half of all antibiotic prescriptions for acute respiratory conditions are unnecessary. Before the COVID-19 pandemic, antibiotics were prescribed about 70% of the time for a diagnosis of uncomplicated cough and LRTI. The viral pandemic did not change this practice according to a meta-analysis of 130 studies showing that 78% of COVID-19 patients were prescribed an antibiotic.
 

The study

The study looked at a cohort of 718 patients, with a mean age of 38.9 years, 65.3% female, of whom 207 received an antibiotic and 511 did not. Of those with baseline data, 29% had an antibiotic prescribed at baseline, the most common (in 85%) being amoxicillin-clavulanate, azithromycin, doxycycline, and amoxicillin. Antibiotics had no effect on the duration or overall severity of cough in viral, bacterial, or mixed infections. Receipt of an antibiotic did, however, reduce the likelihood of a follow-up visit: 14.1% vs 8.2% (adjusted odds ratio, 0.47; 95% confidence interval, 0.26-0.84) — perhaps because it removed the motivation for seeking another consultation. Antibiotic recipients were more likely to receive a systemic corticosteroid (31.9% vs 4.5%, P <.001) and were also more likely to receive an albuterol inhaler (22.7% vs 7.6%, P <.001).
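
For context on the adjusted odds ratio above, the short sketch below works out a crude (unadjusted) odds ratio from the two reported follow-up visit proportions. Because the published 0.47 estimate is adjusted for other factors, the crude value differs slightly; it is offered only to make the arithmetic concrete.

```python
# Illustrative sketch only: crude odds ratio from the reported proportions.
# The study's 0.47 figure is covariate-adjusted, so it will not match exactly.
p_no_abx = 0.141  # follow-up visits among patients not prescribed an antibiotic
p_abx = 0.082     # follow-up visits among patients prescribed an antibiotic

odds_no_abx = p_no_abx / (1 - p_no_abx)
odds_abx = p_abx / (1 - p_abx)
crude_or = odds_abx / odds_no_abx
print(f"Crude odds ratio: {crude_or:.2f}")  # about 0.54, versus the adjusted 0.47
```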

 

 

Jeffrey A. Linder, MD, MPH, a primary care physician and chief of internal medicine and geriatrics at Northwestern University Feinberg School of Medicine in Chicago, agrees that in the vast majority of LRTIs — usually acute bronchitis — antibiotics do not speed the healing process. “Forty years of research show that antibiotics do not make acute bronchitis go away any faster,” Dr. Linder, who was not involved in the current study, said in an interview. “There’s even growing evidence that a lot of pneumonia is viral as well, and 10 or 20 years from now we may often not be giving antibiotics for pneumonia because we’ll be able to see better if it’s caused by a virus.”

Dr. Linder is a primary care physician and chief of internal medicine and geriatrics at Northwestern University Feinberg School of Medicine in Chicago
Northwestern Medicine
Dr. Jeffrey A. Linder


A large 2018 review by Dr. Linder and associates reported that 46% of antibiotics were prescribed without any infection-related diagnosis code and 20% without an office visit.

Dr. Linder routinely informs patients requesting an antibiotic about the risks of putting an ineffective chemical into their body. “I stress that it can cause rash and other allergic reactions, and even promote C diff infection,” he said. “And I also say it messes with the good bacteria in the microbiome, and they usually come around.”

Patients need to know, Dr. Linder added, that the normal course of healing the respiratory tract after acute bronchitis takes weeks. While a wet cough with sputum or phlegm will last a few days, it’s replaced with a dry annoying cough that persists for up to 3 weeks. “As long as they’re feeling generally better, that cough is normal,” he said. “A virus has run roughshod over their airways and they need a long time to heal and the cough is part of the healing process. Think how long it takes to heal a cut on a finger.”

In an era of escalating antimicrobial resistance fueled by antibiotic overuse, it’s become increasingly important to reserve antibiotics for necessary cases. According to a recent World Health Organization call to action, “Uncontrolled antimicrobial resistance is expected to lower life expectancy and lead to unprecedented health expenditure and economic losses.”

That said, there is important clinical work to be done to determine if there is a limited role for antibiotics in patients with cough, perhaps based on age and baseline severity. “Serious cough symptoms and how to treat them properly needs to be studied more, perhaps in a randomized clinical trial as this study was observational and there haven’t been any randomized trials looking at this issue since about 2012,” Dr. Merenstein said.

This research was funded by the Agency for Healthcare Research and Quality. The authors have no conflicts of interest to declare. Dr. Linder reported stock ownership in pharmaceutical companies but none that make antibiotics or other infectious disease drugs.

FROM JOURNAL OF GENERAL INTERNAL MEDICINE

Salt Substitutes May Cut All-Cause And Cardiovascular Mortality

Article Type
Changed
Fri, 04/19/2024 - 11:17

Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.

The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.

Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval, –3 to –7) and 3 in 1000 in CVD mortality (CI, –1 to –5).

Led by Hannah Greenwood, BPsychSc, a cardiovascular researcher at the Institute for Evidence-Based Healthcare at Bond University in Gold Coast, Queensland, the investigators also found very low certainty evidence of an absolute reduction of 8 in 1000 in major adverse cardiovascular events (CI, 0 to –15), with a 1 in 1000 decrease in more serious adverse events (CI, 4 to –2) in the same population.
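
One way to put the per-1000 absolute reductions into more familiar clinical terms is the number needed to treat (NNT), shown in the brief sketch below. This is standard arithmetic applied to the reported figures, not an analysis from the review itself, and it carries the same low-certainty caveats as the underlying estimates.

```python
# Illustrative sketch: convert an absolute risk reduction reported per 1000 people
# into a number needed to treat (NNT = 1 / absolute risk reduction).
def nnt_from_arr_per_1000(arr_per_1000: float) -> float:
    return 1000 / arr_per_1000

print(round(nnt_from_arr_per_1000(5)))  # all-cause mortality: about 200 people
print(round(nnt_from_arr_per_1000(3)))  # CVD mortality: about 333 people
```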

Seven of the 16 studies were conducted in China and Taiwan and seven were conducted in populations of older age (mean age 62 years) and/or at higher cardiovascular risk.

With most of the data deriving from populations of older age at higher-than-average CV risk and/or eating an Asian diet, the findings’ generalizability to populations following a Western diet and/or at average CVD risk is limited, the researchers acknowledged.

“We are less certain about the effects in Western, younger, and healthy population groups,” corresponding author Loai Albarqouni, MD, MSc, PhD, assistant professor at the Institute for Evidence-Based Healthcare, said in an interview. “While we saw small, clinically meaningful reductions in cardiovascular deaths and events, effectiveness should be better established before salt substitutes are recommended more broadly, though they are promising.”

In addition, he said, since the longest follow-up of substitute use was 10 years, “we can’t speak to benefits or harms beyond this time frame.”

Dr. Albarqouni, an assistant professor at the Institute for Evidence-Based Healthcare, Bond University.
Bond University
Dr Loai Albarqouni


Still, recommending salt substitutes may be an effective way for physicians to help patients reduce CVD risk, especially those hesitant to start medication, he said. “But physicians should take into account individual circumstances and other factors like kidney disease before recommending salt substitutes. Other non-drug methods of reducing cardiovascular risk, such as diet or exercise, may also be considered.”

Dr. Albarqouni stressed that sodium intake is not the only driver of CVD and reducing intake is just one piece of the puzzle. He cautioned that substitutes themselves can contain high levels of sodium, “so if people are using them in large volumes, they may still present similar risks to the sodium in regular salt.”

While the substitutes appear safe as evidenced by low incidence of hyperkalemia or renal dysfunction, the evidence is scarce, heterogeneous, and weak, the authors stressed.

“They can pose a health risk among people who have kidney disease, diabetes, and heart failure or who take certain medications, including ACE inhibitors and potassium-sparing diuretics,” said Emma Laing, PhD, RDN, director of dietetics at the University of Georgia in Athens. And while their salty flavor makes them a reasonable alternative to sodium chloride, “the downsides include a higher cost and bitter or metallic taste in high amounts. These salt substitutes tend to be better accepted by patients if they contain less than 30% potassium chloride.”

Dr. Laing is director of dietetics at the University of Georgia in Athens
University of Georgia
Dr. Emma Laing


She noted that flavorful salt-free spices, herbs, lemon and lime juices, and vinegars can be effective in lowering dietary sodium when used in lieu of cooking salt.

In similar findings, a recent Chinese study of elderly normotensive people in residential care facilities observed a decrease in the incidence of hypertension with salt substitution.

Approximately one-third of otherwise healthy individuals are salt-sensitive, rising to more than 50% of those with hypertension, and excessive salt intake is estimated to be responsible for nearly 5 million deaths per year globally.

How much impact could household food preparation with salt substitutes really have in North America where sodium consumption is largely driven by processed and takeout food? “While someone may make the switch to a salt substitute for home cooking, their sodium intake might still be very high if a lot of processed or takeaway foods are eaten,” Dr. Albarqouni said. “To see large population impacts, we will likely need policy and institutional-level change as to how sodium is used in food processing, alongside individuals’ switching from regular salt to salt substitutes.”

In agreement, an accompanying editorial by researchers from the universities of Sydney, New South Wales, and California, San Diego, noted the failure of governments and industry to address the World Health Organization’s call for a 30% reduction in global sodium consumption by 2025. With hypertension a major global health burden, the editorialists, led by J. Jaime Miranda, MD, MSc, PhD, of the Sydney School of Public Health at the University of Sydney, believe salt substitutes could be an accessible path toward that goal for food production companies.

Dr. J. Jaime Miranda is of the Sydney School of Public Health at the University of Sydney, Australia,
University of Sydney
Dr. J. Jaime Miranda


“Although the benefits of reducing salt intake have been known for decades, little progress has been made in the quest to lower salt intake on the industry and commercial fronts with existing regulatory tools,” they wrote. “Consequently, we must turn our attention to effective evidence-based alternatives, such as the use of potassium-enriched salts.”

Given the high rates of nonadherence to antihypertensive medication, nonpharmacologic measures to improve blood pressure control are required, they added. “Expanding the routine use of potassium-enriched salts across households and the food industry would benefit not only persons with existing hypertension but all members of the household and communities. An entire shift of the population’s blood pressure curve is possible.”

The study authors called for research to determine the cost-effectiveness of salt substitution in older Asian populations and its efficacy in groups at average cardiovascular risk or following a Western diet.

This research was supported by the National Health and Medical Research Council of Australia and an Australian Government Research Training Program Scholarship. Coauthor Dr. Lauren Ball disclosed support from the National Health and Medical Research Council of Australia. Ms. Hannah Greenwood received support from the Australian government and Bond University. Dr. Miranda disclosed numerous consulting, advisory, and research-funding relationships with government, academic, philanthropic, and nonprofit organizations. Editorial commentator Dr. Kathy Trieu reported research support from multiple government and non-profit research-funding organizations. Dr. Cheryl Anderson disclosed ties to Weight Watchers and the McCormick Science Institute, as well as support from numerous government, academic, and nonprofit research-funding agencies.

Publications
Topics
Sections

Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.

The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.

Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval, –3 to –7) and 3 in 1000 in CVD mortality (CI, –1 to –5).

Led by Hannah Greenwood, BPsychSc, a cardiovascular researcher at the Institute for Evidence-Based Healthcare at Bond University in Gold Coast, Queensland, the investigators also found very low certainty evidence of an absolute reduction of 8 in 1000 in major adverse cardiovascular events (CI, 0 to –15), with a 1 in 1000 decrease in more serious adverse events (CI, 4 to –2) in the same population.

Seven of the 16 studies were conducted in China and Taiwan and seven were conducted in populations of older age (mean age 62 years) and/or at higher cardiovascular risk.

With most of the data deriving from populations of older age at higher-than-average CV risk and/or eating an Asian diet, the findings’ generalizability to populations following a Western diet and/or at average CVD risk is limited, the researchers acknowledged.

“We are less certain about the effects in Western, younger, and healthy population groups,” corresponding author Loai Albarqouni, MD, MSc, PhD, assistant professor at the Institute for Evidence-Based Healthcare, said in an interview. “While we saw small, clinically meaningful reductions in cardiovascular deaths and events, effectiveness should be better established before salt substitutes are recommended more broadly, though they are promising.”

In addition, he said, since the longest follow-up of substitute use was 10 years, “we can’t speak to benefits or harms beyond this time frame.”

Dr. Albarqouni an assistant professor at the Institute for Evidence-Based Healthcare, Bond University.
Bond University
Dr Loai Albarqouni


Still, recommending salt substitutes may be an effective way for physicians to help patients reduce CVD risk, especially those hesitant to start medication, he said. “But physicians should take into account individual circumstances and other factors like kidney disease before recommending salt substitutes. Other non-drug methods of reducing cardiovascular risk, such as diet or exercise, may also be considered.”

Dr. Albarqouni stressed that sodium intake is not the only driver of CVD and reducing intake is just one piece of the puzzle. He cautioned that substitutes themselves can contain high levels of sodium, “so if people are using them in large volumes, they may still present similar risks to the sodium in regular salt.”

While the substitutes appear safe as evidenced by low incidence of hyperkalemia or renal dysfunction, the evidence is scarce, heterogeneous, and weak, the authors stressed.

“They can pose a health risk among people who have kidney disease, diabetes, and heart failure or who take certain medications, including ACE inhibitors and potassium-sparing diuretics,” said Emma Laing, PhD, RDN, director of dietetics at the University of Georgia in Athens. And while their salty flavor makes these a reasonable alternate to sodium chloride, “the downsides include a higher cost and bitter or metallic taste in high amounts. These salt substitutes tend to be better accepted by patients if they contain less than 30% potassium chloride.”

Dr. Laing is director of dietetics at the University of Georgia in Athens
University of Georgia
Dr. Emma Laing


She noted that flavorful salt-free spices, herbs, lemon and lime juices, and vinegars can be effective in lowering dietary sodium when used in lieu of cooking salt.

In similar findings, a recent Chinese study of elderly normotensive people in residential care facilities observed a decrease in the incidence of hypertension with salt substitution.

Approximately one-third of otherwise health individuals are salt-sensitive, rising to more than 50% those with hypertension, and excessive salt intake is estimated to be responsible for nearly 5 million deaths per year globally.

How much impact could household food preparation with salt substitutes really have in North America where sodium consumption is largely driven by processed and takeout food? “While someone may make the switch to a salt substitute for home cooking, their sodium intake might still be very high if a lot of processed or takeaway foods are eaten,” Dr. Albarqouni said. “To see large population impacts, we will likely need policy and institutional-level change as to how sodium is used in food processing, alongside individuals’ switching from regular salt to salt substitutes.”

In agreement, an accompanying editorial by researchers from the universities of Sydney, New South Wales, and California, San Diego, noted the failure of governments and industry to address the World Health Organization’s call for a 30% reduction in global sodium consumption by 2025. With hypertension a major global health burden, the editorialists, led by J. Jaime Miranda, MD, MSc, PhD, of the Sydney School of Public Health at the University of Sydney, believe salt substitutes could offer food producers an accessible path toward that goal.

Dr. J. Jaime Miranda, Sydney School of Public Health, University of Sydney, Australia


“Although the benefits of reducing salt intake have been known for decades, little progress has been made in the quest to lower salt intake on the industry and commercial fronts with existing regulatory tools,” they wrote. “Consequently, we must turn our attention to effective evidence-based alternatives, such as the use of potassium-enriched salts.”

Given the high rates of nonadherence to antihypertensive medication, nonpharmacologic measures to improve blood pressure control are required, they added. “Expanding the routine use of potassium-enriched salts across households and the food industry would benefit not only persons with existing hypertension but all members of the household and communities. An entire shift of the population’s blood pressure curve is possible.”

The study authors called for research to determine the cost-effectiveness of salt substitution in older Asian populations and its efficacy in groups at average cardiovascular risk or following a Western diet.

This research was supported by the National Health and Medical Research Council of Australia and an Australian Government Research Training Program Scholarship. Coauthor Dr. Lauren Ball disclosed support from the National Health and Medical Research Council of Australia. Ms. Hannah Greenwood received support from the Australian government and Bond University. Dr. Miranda disclosed numerous consulting, advisory, and research-funding relationships with government, academic, philanthropic, and nonprofit organizations. Editorial commentator Dr. Kathy Trieu reported research support from multiple government and nonprofit research-funding organizations. Dr. Cheryl Anderson disclosed ties to Weight Watchers and the McCormick Science Institute, as well as support from numerous government, academic, and nonprofit research-funding agencies.


FROM ANNALS OF INTERNAL MEDICINE

Premenstrual Disorders and Perinatal Depression: A Two-Way Street

Article Type
Changed
Tue, 04/09/2024 - 09:52

Premenstrual disorders (PMDs) and perinatal depression (PND) appear to have a bidirectional association, a Swedish national registry-based analysis found.

In women with PND, 2.9% had PMDs before pregnancy vs 0.6% in a matched cohort of unaffected women, according to an international team led by Qian Yang, MD, PhD, of the Institute of Environmental Medicine at the Karolinska Institutet in Stockholm, Sweden. Their study appears in PLoS Medicine.

“Preconception and maternity care providers should be aware of the risk of developing perinatal depression among women with a history of PMDs,” Dr. Yang said in an interview. “Healthcare providers may inform women with perinatal depression about the potential risk of PMDs when menstruation returns after childbirth.” She recommended screening as part of routine perinatal care to identify and treat the condition at an early stage. Counseling and medication may help prevent adverse consequences.

In other findings, the correlation with PMDs held for both prenatal and postnatal depression, regardless of any history of psychiatric disorders and also in full-sister comparisons, the authors noted, with a stronger correlation in the absence of psychiatric disorders (P for interaction <.001).

“Interestingly, we noted a stronger association between PMDs and subsequent PND than the association in the other direction,” Dr. Yang said. And although many women experience PMD symptom onset in adolescence, symptom worsening has been reported with increasing age and parity. “It is possible that women with milder premenstrual symptoms experienced worse symptoms after pregnancy and are therefore first diagnosed with PMD after pregnancy,” the authors hypothesized.

Both PMDs and PND share depressive symptomatology and onset coinciding with hormonal fluctuations, particularly estrogen and progesterone, suggesting a shared etiology, Dr. Yang explained. “It’s plausible that an abnormal response to natural hormone fluctuations predisposes women to both PMDs and PND. However, the underlying mechanism is complex, and future research is needed to reveal the underlying etiology.”

Affecting a majority of women of reproductive age to some degree, PMDs in certain women can cause significant functional impairment and, when severe, have been linked to increased risks of accidents and suicidal behavior. The psychological symptoms of the more serious form, premenstrual dysphoric disorder, for example, are associated with a 50%-78% lifetime risk for psychiatric disorders, including major depressive, dysthymic, seasonal affective, and generalized anxiety disorders, as well as suicidality.

Mood disorders are common in pregnancy and the postpartum period.

The Swedish Study

In 1.8 million singleton pregnancies in Sweden during 2001-2018, the investigators identified 84,949 women with PND and 849,482 unaffected women and individually matched them 10:1 by age and calendar year. Incident PND and PMDs were identified through clinical diagnoses or prescribed medications, and adjustment was made for such demographics as country of birth, educational level, region of residency, and cohabitation status.

In an initial matched-cohort case-control study with a mean follow-up of 6.9 years, PMDs were associated with a nearly five times higher risk of subsequent PND (odds ratio, 4.76; 95% CI, 4.52-5.01; P <.001).

In another matched cohort with a mean follow-up of 7.0 years, there were 4227 newly diagnosed PMDs in women with PND (incidence rate [IR], 7.6/1000 person-years) and 21,326 among controls (IR, 3.8/1000). Compared with matched controls, women with PND were at almost twice the risk of subsequent PMDs (hazard ratio, 1.81; 95% CI, 1.74-1.88; P <.001).
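For readers who want to check the arithmetic behind these figures, the short Python sketch below back-calculates the approximate follow-up time and the crude ratios implied by the numbers quoted in this article. It is purely illustrative and assumes only those quoted numbers; the investigators used matched conditional and survival models, so their adjusted odds ratio (4.76) and hazard ratio (1.81) are not the same quantities as the crude values printed here.

# Illustrative arithmetic check using only the figures reported above;
# not the study's actual analysis, which used matched regression models.

def crude_odds_ratio(p_exposed_cases, p_exposed_controls):
    """Crude odds ratio from exposure prevalence in cases vs controls."""
    odds_cases = p_exposed_cases / (1 - p_exposed_cases)
    odds_controls = p_exposed_controls / (1 - p_exposed_controls)
    return odds_cases / odds_controls

def person_years(events, rate_per_1000_py):
    """Approximate person-years implied by an event count and a rate per 1000 person-years."""
    return events / (rate_per_1000_py / 1000)

# 2.9% of women with PND had prior PMDs vs 0.6% of matched controls
print(f"Crude OR: {crude_odds_ratio(0.029, 0.006):.2f} (reported adjusted OR: 4.76)")

# 4227 new PMD diagnoses at 7.6/1000 person-years vs 21,326 at 3.8/1000 person-years
print(f"PND cohort follow-up: ~{person_years(4227, 7.6):,.0f} person-years")
print(f"Control follow-up: ~{person_years(21326, 3.8):,.0f} person-years")
print(f"Crude incidence rate ratio: {7.6 / 3.8:.1f} (reported adjusted HR: 1.81)")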

Dr. Bernard Harlow, Boston University School of Public Health

Commenting on the study but not involved in it, Bernard L. Harlow, PhD, a professor of epidemiology at Boston University School of Public Health in Massachusetts who specializes in epidemiologic studies of female reproductive disorders, said he was not surprised at these findings, which clearly support the need for PMD screening in mothers-to-be. “Anything that is easy to measure and noninvasive that will minimize the risk of postpartum depression should be part of the standard of care during the prenatal period.” As to safety, if treatment is indicated, he added, “studies have shown that the risk to the mother and child is much greater if the mother’s mood disorder is not controlled than any risk to the baby due to depression treatment.” But though PMDs may be predictive of PND, there are still barriers to actual PND care. A 2023 analysis reported that 65% of mothers-to-be who screened positive for mental health comorbidities were not referred for treatment.

Dr. Yang and colleagues acknowledged that their findings may not be generalizable to mild forms of these disorders since the data were based on clinical diagnoses and prescriptions.

The study was supported by the Chinese Scholarship Council, the Swedish Research Council for Health, Working Life and Welfare, the Karolinska Institutet, and the Icelandic Research Fund. The authors and Dr. Harlow had no relevant competing interests to disclose.


FROM PLOS MEDICINE