More comprehensive testing needed to characterize esophageal dysphagia

The current approach to esophageal function testing is insufficient to characterize esophageal motility disorders, as many patients with esophageal dysphagia have abnormalities that are undetectable with routine tests, according to investigators.

More nuanced assessments of esophageal motility disorders could lead to more accurate diagnoses and more effective treatments, reported Ravinder K. Mittal, MD, and Ali Zifan, PhD, of the University of California San Diego.

Ravinder K. Mittal, MD, is with the division of digestive diseases at the University of California, San Diego.

Esophageal motility disorders are currently divided into major and minor variants based on the contraction phase of peristalsis, Dr. Mittal and Dr. Zifan wrote in their report in Gastro Hep Advances. Yet the reason for dysphagia in many of these patients remains a puzzle, particularly in patients with supernormal contraction during peristalsis, such as those with nutcracker esophagus. What’s more, up to half of patients with dysphagia have normal findings on high-resolution manometry with impedance (HRMZ), the typical diagnostic modality, leaving many with the broad label of functional dysphagia.

This lack of clarity “suggests that the etiology in many patients remains unknown,” according to the investigators, which prompted them to publish the present review article.

After describing the shortcomings of current test methods, the investigators provided an overview of the physiology of esophageal peristalsis, then dove deeper into available data concerning luminal cross-section measurements, esophageal distension during peristalsis, bolus flow, and distension-contraction patterns in normal patients versus those with various kinds of dysphagia.

They highlighted two key findings.

First, in patients with functional dysphagia, esophagogastric junction outflow obstruction (EGJOO), and high-amplitude esophageal peristaltic contractions (HAEC), the bolus must travel through a narrow esophageal lumen. Second, in patients with nonobstructive dysphagia and type 3 achalasia, the bolus moves against distal luminal occlusion.

“These findings indicate a relative dynamic obstruction to bolus flow and reduced distensibility of the esophageal wall in patients with several primary esophageal motility disorders,” the investigators wrote. “We speculate that the dysphagia sensation experienced by many patients may result from a normal or supernormal contraction wave pushing the bolus against resistance.”

Yet routine esophageal function testing fails to capture these abnormalities, Dr. Mittal and Dr. Zifan noted.

“[C]urrent techniques used to measure esophageal distension during peristalsis are not adequate,” they wrote. “The high-resolution manometry and current scheme of classifying esophageal motor disorders in the current format emphasize only half of the story of peristalsis, probably the less important of the two halves, i.e., the contraction phase of peristalsis.”

More focus is needed on esophageal distension, they suggested, noting that the esophagus must first relax to accommodate a bolus before any contraction, no matter how powerful, can push it down the esophagus.

“A simple analogy is that of a car — it cannot get through a roadway that is smaller than its own width, irrespective of the horsepower of its engine,” they wrote.

The solution may lie in a more comprehensive approach to esophageal function testing.

“Integrating representations of distension and contraction, along with objective assessments of flow timing and distensibility, complements the current classification of esophageal motility disorders that are based on the contraction characteristics,” the investigators wrote, predicting that these efforts could improve diagnostic accuracy.

What to do about those diagnoses is another mystery.

“The question though remains regarding the optimal treatment for the impaired distension function of the esophagus, and whether improvement in the distension function will lead to improvement in dysphagia symptoms,” the investigators concluded.

The review was supported by the National Institutes of Health. The investigators reported copyright/patent protection for the computer software (Dplots) used to evaluate the distension-contraction plots.

Commentary: New approach to an old disorder

Medicine is strewn with diseases first labeled as functional or psychologically induced that have since been recategorized as clear non-sensory disorders, and functional dysphagia may be one of them.

In this review article, Dr. Mittal and Dr. Zifan summarize and propose a new paradigm for esophageal motility disorders and the origin of functional dysphagia (FD). As with other functional disorders, the preponderance of research has suggested that functional dysphagia is in large part a sensory disorder in which patients sense normally subthreshold events of normal bolus transit and interpret them as dysphagia.

Dr. David A. Katzka, Columbia University, New York
In this review, largely a summary of Dr. Mittal’s work, the role of more subtle characteristics of esophageal motility is examined. Several novel findings are described, including the role of increased esophageal wall tension and failure of relaxation with luminal narrowing as a cause of dysphagia. This may be due to impaired inhibition or relaxation, or to dyscoordination of the circular and longitudinal muscle layers during peristalsis. These novel findings are reinforced by a multidisciplinary approach blending the pressure findings on high-resolution manometry, the motor and distensibility data from impedance planimetry (EndoFLIP), the anatomic findings of endoscopic ultrasound, and the bolus and anatomic information from barium esophagography, providing as complete a picture as possible for understanding dysphagia.

Will this lead to recategorization of all functional dysphagia as a perturbation in motor function and the discovery of new therapies? Certainly, to some degree, though sensory dysfunction will likely remain a prominent mechanism in some patients. Nevertheless, it is always exciting when a new approach to an old disorder emerges. With the work from Dr. Mittal’s laboratory and many others, functional dysphagia may soon drop the functional!
 

David A. Katzka, MD, is a gastroenterologist at New York–Presbyterian/Columbia University Irving Medical Center, New York, where he leads the Esophagology and Swallowing Center. He has performed research for Medtronic, but has no other relevant disclosures.

FROM GASTRO HEP ADVANCES


New KDIGO guideline encourages use of HCV-positive kidneys for HCV-negative recipients

The Kidney Disease: Improving Global Outcomes (KDIGO) Work Group has updated its guideline concerning the prevention, diagnosis, evaluation, and treatment of hepatitis C virus (HCV) infection in patients with chronic kidney disease (CKD).

Of note, KDIGO now supports transplant of HCV-positive kidneys to HCV-negative recipients.

The guidance document, authored by Ahmed Arslan Yousuf Awan, MD, of Baylor College of Medicine, Houston, and colleagues, was written in light of new evidence that has emerged since the 2018 guideline was published.

“The focused update was triggered by new data on antiviral treatment in patients with advanced stages of CKD (G4, G5, or G5D), transplant of HCV-infected kidneys into uninfected recipients, and evolution of the viewpoint on the role of kidney biopsy in managing kidney disease caused by HCV,” the guideline panelists wrote in Annals of Internal Medicine. “This update is intended to assist clinicians in the care of patients with HCV infection and CKD, including patients receiving dialysis (CKD G5D) and patients with a kidney transplant (CKD G1T-G5T).”

Anjay Rastogi, MD, PhD, professor and clinical chief of nephrology at the David Geffen School of Medicine at UCLA, said the update is both “timely and relevant” and “will really have an impact on the organ shortage that we have for kidney transplant.”

The updates are outlined below.
 

Expanded Access to HCV-Positive Kidneys

While the 2018 guideline recommended that HCV-positive kidneys be directed to HCV-positive recipients, the new guideline suggests that these kidneys are appropriate for all patients regardless of HCV status.

In support, the panelists cited a follow-up of the THINKER-1 trial, which showed that eGFR and quality of life were not negatively affected when HCV-negative patients received an HCV-positive kidney, compared with an HCV-negative kidney. Data from 525 unmatched recipients in 16 other studies support this conclusion, the panelists noted.

Jose Debes, MD, PhD, associate professor at the University of Minnesota, Minneapolis, suggested that this is the most important update to the KDIGO guidelines.

“That [change] would be the main impact of these recommendations,” Dr. Debes said in an interview. “Several centers were already doing this, since some data [were] out there, but I think the fact that they’re making this into a guideline is quite important.”

Dr. Rastogi agreed that this recommendation is the most impactful update.

“That’s a big move,” Dr. Rastogi said in an interview. He predicted that the change will “definitely increase the donor pool, which is very, very important.”

For this new recommendation to have the greatest positive effect, however, Dr. Rastogi suggested that health care providers and treatment centers need to prepare an effective implementation strategy. He emphasized the importance of early communication with patients concerning the safety of HCV-positive kidneys, which depends on early initiation of direct-acting antiviral (DAA) therapy.

In the guideline, Dr. Awan and colleagues reported that three documented cases of fibrosing cholestatic hepatitis occurred in patients who did not begin DAA therapy until 30 days after transplant.

“[Patients] should start [DAA treatment] right away,” Dr. Rastogi said, “and sometimes even before the transplant.”

This will require institutional support, he noted, as centers need to ensure that patients are covered for DAA therapy and medication is readily available.
 

 

 

Sofosbuvir Given the Green Light

Compared with the 2018 guideline, which recommended against sofosbuvir in patients with CKD G4 and G5, including those on dialysis, because of concerns about its renal clearance, the new guideline suggests that sofosbuvir-based DAA regimens are appropriate in patients with a glomerular filtration rate (GFR) less than 30 mL/min per 1.73 m², including those receiving dialysis.

This recommendation was based on a systematic review of 106 studies including both sofosbuvir-based and non-sofosbuvir-based DAA regimens that showed high safety and efficacy for all DAA regimen types across a broad variety of patient types.

“DAAs are highly effective and well tolerated treatments for hepatitis C in patients across all stages of CKD, including those undergoing dialysis and kidney transplant recipients, with no need for dose adjustment,” Dr. Awan and colleagues wrote.
 

Loosened Biopsy Requirements

Unlike the 2018 guideline, which advised kidney biopsy in HCV-positive patients with clinical evidence of glomerular disease prior to initiating DAA treatment, the new guideline suggests that HCV-infected patients with a typical presentation of immune-complex proliferative glomerulonephritis do not require confirmatory kidney biopsy.

“Because almost all patients with chronic hepatitis C (with or without glomerulonephritis) should be treated with DAAs, a kidney biopsy is unlikely to change management in most patients with hepatitis C and clinical glomerulonephritis,” the panelists wrote.

If kidney disease does not stabilize or improve with achievement of sustained virologic response, or if there is evidence of rapidly progressive glomerulonephritis, then a kidney biopsy should be considered before beginning immunosuppressive therapy, according to the guideline, which includes a flow chart to guide clinicians through this decision-making process.
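
For readers who want that decision pathway in compact form, the sketch below restates it as plain Python. It is a paraphrase of the guideline text quoted above, not the KDIGO flow chart itself, and the function and argument names are hypothetical.

```python
def consider_kidney_biopsy(improved_after_svr: bool,
                           rapidly_progressive_gn: bool) -> bool:
    """Paraphrase of the decision described in the text: consider a kidney
    biopsy before starting immunosuppressive therapy if kidney disease does
    not stabilize or improve after sustained virologic response (SVR), or if
    there is evidence of rapidly progressive glomerulonephritis (GN)."""
    return (not improved_after_svr) or rapidly_progressive_gn


# Hypothetical example: disease improved after SVR, no rapidly progressive GN.
print(consider_kidney_biopsy(improved_after_svr=True,
                             rapidly_progressive_gn=False))  # False -> biopsy not indicated
```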
 

Individualizing Immunosuppressive Therapy

Consistent with the old guideline, the new guideline recommends DAA treatment with concurrent immunosuppressive therapy for patients with cryoglobulinemic flare or rapidly progressive kidney failure. But in contrast, the new guideline calls for an individualized approach to immunosuppression in patients with nephrotic syndrome.

Dr. Awan and colleagues suggested that “nephrotic-range proteinuria (greater than 3.5 g/d) alone does not warrant use of immunosuppressive treatment because such patients can achieve remission of proteinuria after treatment with DAAs.” Still, if other associated complications — such as anasarca, thromboembolic disease, or severe hypoalbuminemia — are present, then immunosuppressive therapy may be warranted, with rituximab remaining the preferred first-line agent.
 

More Work Is Needed

Dr. Awan and colleagues concluded the guideline by highlighting areas of unmet need, and how filling these knowledge gaps could lead to additional guideline updates.

“Future studies of kidney donations from HCV-positive donors to HCV-negative recipients are needed to refine and clarify the timing of initiation and duration of DAA therapy and to assess long-term outcomes associated with this practice,” they wrote. “Also, randomized controlled trials are needed to determine which patients with HCV-associated kidney disease can be treated with DAA therapy alone versus in combination with immunosuppression and plasma exchange. KDIGO will assess the currency of its recommendations and the need to update them in the next 3 years.”

The guideline was funded by KDIGO. The investigators disclosed relationships with GSK, Gilead, Intercept, Novo Nordisk, and others. Dr. Rastogi and Dr. Debes had no conflicts of interest.

Publications
Topics
Sections

The Kidney Disease: Improving Global Outcomes (KDIGO) Work Group has updated its guideline concerning the prevention, diagnosis, evaluation, and treatment of hepatitis C virus (HCV) infection in patients with chronic kidney disease (CKD).

Of note, KDIGO now supports transplant of HCV-positive kidneys to HCV-negative recipients.

The guidance document, authored by Ahmed Arslan Yousuf Awan, MD, of Baylor College of Medicine, Houston, and colleagues, was written in light of new evidence that has emerged since the 2018 guideline was published.

“The focused update was triggered by new data on antiviral treatment in patients with advanced stages of CKD (G4, G5, or G5D), transplant of HCV-infected kidneys into uninfected recipients, and evolution of the viewpoint on the role of kidney biopsy in managing kidney disease caused by HCV,” the guideline panelists wrote in Annals of Internal Medicine. “This update is intended to assist clinicians in the care of patients with HCV infection and CKD, including patients receiving dialysis (CKD G5D) and patients with a kidney transplant (CKD G1T-G5T).”

Anjay Rastogi, MD, PhD, professor and clinical chief of nephrology at the David Geffen School of Medicine at UCLA, said the update is both “timely and relevant,” and “will really have an impact on the organ shortage that we have for kidney transplant”

The updates are outlined below.
 

Expanded Access to HCV-Positive Kidneys

While the 2018 guideline recommended that HCV-positive kidneys be directed to HCV-positive recipients, the new guideline suggests that these kidneys are appropriate for all patients regardless of HCV status.

In support, the panelists cited a follow-up of THINKER-1 trial, which showed that eGFR and quality of life were not negatively affected when HCV-negative patients received an HCV-positive kidney, compared with an HCV-negative kidney. Data from 525 unmatched recipients in 16 other studies support this conclusion, the panelists noted.

Jose Debes, MD, PhD, associate professor at the University of Minnesota, Minneapolis, suggested that this is the most important update to the KDIGO guidelines.

“That [change] would be the main impact of these recommendations,” Dr. Debes said in an interview. “Several centers were already doing this, since some data [were] out there, but I think the fact that they’re making this into a guideline is quite important.”

Dr. Rastogi agreed that this recommendation is the most impactful update.

“That’s a big move,” Dr. Rastogi said in an interview. He predicted that the change will “definitely increase the donor pool, which is very, very important.”

For this new recommendation to have the greatest positive effect, however, Dr. Rastogi suggested that health care providers and treatment centers need to prepare an effective implementation strategy. He emphasized the importance of early communication with patients concerning the safety of HCV-positive kidneys, which depends on early initiation of direct-acting antiviral (DAA) therapy.

In the guideline, Dr. Awan and colleagues reported three documented cases of fibrosing cholestatic hepatitis occurred in patients who did not begin DAA therapy until 30 days after transplant.

“[Patients] should start [DAA treatment] right away,” Dr. Rastogi said, “and sometimes even before the transplant.”

This will require institutional support, he noted, as centers need to ensure that patients are covered for DAA therapy and medication is readily available.
 

 

 

Sofosbuvir Given the Green Light

Compared with the 2018 guideline, which recommended against sofosbuvir in patients with CKD G4 and G5, including those on dialysis, because of concerns about metabolization via the kidneys, the new guideline suggests that sofosbuvir-based DAA regimens are appropriate in patients with glomerular filtration rate (GFR) less than 30 mL/min per 1.73 m2, including those receiving dialysis.

This recommendation was based on a systematic review of 106 studies including both sofosbuvir-based and non-sofosbuvir-based DAA regimens that showed high safety and efficacy for all DAA regimen types across a broad variety of patient types.

“DAAs are highly effective and well tolerated treatments for hepatitis C in patients across all stages of CKD, including those undergoing dialysis and kidney transplant recipients, with no need for dose adjustment,” Dr. Awan and colleagues wrote.
 

Loosened Biopsy Requirements

Unlike the 2018 guideline, which advised kidney biopsy in HCV-positive patients with clinical evidence of glomerular disease prior to initiating DAA treatment, the new guideline suggests that HCV-infected patients with a typical presentation of immune-complex proliferative glomerulonephritis do not require confirmatory kidney biopsy.

“Because almost all patients with chronic hepatitis C (with or without glomerulonephritis) should be treated with DAAs, a kidney biopsy is unlikely to change management in most patients with hepatitis C and clinical glomerulonephritis,” the panelists wrote.

If kidney disease does not stabilize or improve with achievement of sustained virologic response, or if there is evidence of rapidly progressive glomerulonephritis, then a kidney biopsy should be considered before beginning immunosuppressive therapy, according to the guideline, which includes a flow chart to guide clinicians through this decision-making process.
 

Individualizing Immunosuppressive Therapy

Consistent with the old guideline, the new guideline recommends DAA treatment with concurrent immunosuppressive therapy for patients with cryoglobulinemic flare or rapidly progressive kidney failure. But in contrast, the new guideline calls for an individualized approach to immunosuppression in patients with nephrotic syndrome.

Dr. Awan and colleagues suggested that “nephrotic-range proteinuria (greater than 3.5 g/d) alone does not warrant use of immunosuppressive treatment because such patients can achieve remission of proteinuria after treatment with DAAs.” Still, if other associated complications — such as anasarca, thromboembolic disease, or severe hypoalbuminemia — are present, then immunosuppressive therapy may be warranted, with rituximab remaining the preferred first-line agent.
 

More Work Is Needed

Dr. Awan and colleagues concluded the guideline by highlighting areas of unmet need, and how filling these knowledge gaps could lead to additional guideline updates.

“Future studies of kidney donations from HCV-positive donors to HCV-negative recipients are needed to refine and clarify the timing of initiation and duration of DAA therapy and to assess long-term outcomes associated with this practice,” they wrote. “Also, randomized controlled trials are needed to determine which patients with HCV-associated kidney disease can be treated with DAA therapy alone versus in combination with immunosuppression and plasma exchange. KDIGO will assess the currency of its recommendations and the need to update them in the next 3 years.”

The guideline was funded by KDIGO. The investigators disclosed relationships with GSK, Gilead, Intercept, Novo Nordisk, and others. Dr. Rastogi and Dr. Debes had no conflicts of interest.

The Kidney Disease: Improving Global Outcomes (KDIGO) Work Group has updated its guideline concerning the prevention, diagnosis, evaluation, and treatment of hepatitis C virus (HCV) infection in patients with chronic kidney disease (CKD).

Of note, KDIGO now supports transplant of HCV-positive kidneys to HCV-negative recipients.

The guidance document, authored by Ahmed Arslan Yousuf Awan, MD, of Baylor College of Medicine, Houston, and colleagues, was written in light of new evidence that has emerged since the 2018 guideline was published.

“The focused update was triggered by new data on antiviral treatment in patients with advanced stages of CKD (G4, G5, or G5D), transplant of HCV-infected kidneys into uninfected recipients, and evolution of the viewpoint on the role of kidney biopsy in managing kidney disease caused by HCV,” the guideline panelists wrote in Annals of Internal Medicine. “This update is intended to assist clinicians in the care of patients with HCV infection and CKD, including patients receiving dialysis (CKD G5D) and patients with a kidney transplant (CKD G1T-G5T).”

Anjay Rastogi, MD, PhD, professor and clinical chief of nephrology at the David Geffen School of Medicine at UCLA, said the update is both “timely and relevant,” and “will really have an impact on the organ shortage that we have for kidney transplant”

The updates are outlined below.
 

Expanded Access to HCV-Positive Kidneys

While the 2018 guideline recommended that HCV-positive kidneys be directed to HCV-positive recipients, the new guideline suggests that these kidneys are appropriate for all patients regardless of HCV status.

In support, the panelists cited a follow-up of THINKER-1 trial, which showed that eGFR and quality of life were not negatively affected when HCV-negative patients received an HCV-positive kidney, compared with an HCV-negative kidney. Data from 525 unmatched recipients in 16 other studies support this conclusion, the panelists noted.

Jose Debes, MD, PhD, associate professor at the University of Minnesota, Minneapolis, suggested that this is the most important update to the KDIGO guidelines.

“That [change] would be the main impact of these recommendations,” Dr. Debes said in an interview. “Several centers were already doing this, since some data [were] out there, but I think the fact that they’re making this into a guideline is quite important.”

Dr. Rastogi agreed that this recommendation is the most impactful update.

“That’s a big move,” Dr. Rastogi said in an interview. He predicted that the change will “definitely increase the donor pool, which is very, very important.”

For this new recommendation to have the greatest positive effect, however, Dr. Rastogi suggested that health care providers and treatment centers need to prepare an effective implementation strategy. He emphasized the importance of early communication with patients concerning the safety of HCV-positive kidneys, which depends on early initiation of direct-acting antiviral (DAA) therapy.

In the guideline, Dr. Awan and colleagues reported three documented cases of fibrosing cholestatic hepatitis occurred in patients who did not begin DAA therapy until 30 days after transplant.

“[Patients] should start [DAA treatment] right away,” Dr. Rastogi said, “and sometimes even before the transplant.”

This will require institutional support, he noted, as centers need to ensure that patients are covered for DAA therapy and medication is readily available.
 

 

 

Sofosbuvir Given the Green Light

Compared with the 2018 guideline, which recommended against sofosbuvir in patients with CKD G4 and G5, including those on dialysis, because of concerns about metabolization via the kidneys, the new guideline suggests that sofosbuvir-based DAA regimens are appropriate in patients with glomerular filtration rate (GFR) less than 30 mL/min per 1.73 m2, including those receiving dialysis.

This recommendation was based on a systematic review of 106 studies including both sofosbuvir-based and non-sofosbuvir-based DAA regimens that showed high safety and efficacy for all DAA regimen types across a broad variety of patient types.

“DAAs are highly effective and well tolerated treatments for hepatitis C in patients across all stages of CKD, including those undergoing dialysis and kidney transplant recipients, with no need for dose adjustment,” Dr. Awan and colleagues wrote.
 

Loosened Biopsy Requirements

Unlike the 2018 guideline, which advised kidney biopsy in HCV-positive patients with clinical evidence of glomerular disease prior to initiating DAA treatment, the new guideline suggests that HCV-infected patients with a typical presentation of immune-complex proliferative glomerulonephritis do not require confirmatory kidney biopsy.

“Because almost all patients with chronic hepatitis C (with or without glomerulonephritis) should be treated with DAAs, a kidney biopsy is unlikely to change management in most patients with hepatitis C and clinical glomerulonephritis,” the panelists wrote.

If kidney disease does not stabilize or improve with achievement of sustained virologic response, or if there is evidence of rapidly progressive glomerulonephritis, then a kidney biopsy should be considered before beginning immunosuppressive therapy, according to the guideline, which includes a flow chart to guide clinicians through this decision-making process.
 

Individualizing Immunosuppressive Therapy

Consistent with the old guideline, the new guideline recommends DAA treatment with concurrent immunosuppressive therapy for patients with cryoglobulinemic flare or rapidly progressive kidney failure. But in contrast, the new guideline calls for an individualized approach to immunosuppression in patients with nephrotic syndrome.

Dr. Awan and colleagues suggested that “nephrotic-range proteinuria (greater than 3.5 g/d) alone does not warrant use of immunosuppressive treatment because such patients can achieve remission of proteinuria after treatment with DAAs.” Still, if other associated complications — such as anasarca, thromboembolic disease, or severe hypoalbuminemia — are present, then immunosuppressive therapy may be warranted, with rituximab remaining the preferred first-line agent.
 

More Work Is Needed

Dr. Awan and colleagues concluded the guideline by highlighting areas of unmet need, and how filling these knowledge gaps could lead to additional guideline updates.

“Future studies of kidney donations from HCV-positive donors to HCV-negative recipients are needed to refine and clarify the timing of initiation and duration of DAA therapy and to assess long-term outcomes associated with this practice,” they wrote. “Also, randomized controlled trials are needed to determine which patients with HCV-associated kidney disease can be treated with DAA therapy alone versus in combination with immunosuppression and plasma exchange. KDIGO will assess the currency of its recommendations and the need to update them in the next 3 years.”

The guideline was funded by KDIGO. The investigators disclosed relationships with GSK, Gilead, Intercept, Novo Nordisk, and others. Dr. Rastogi and Dr. Debes had no conflicts of interest.

Publications
Publications
Topics
Article Type
Sections
Article Source

FROM ANNALS OF INTERNAL MEDICINE

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Real-world evidence: Early ileocecal resection outperforms anti-TNF therapy for Crohn’s disease

Early ileocecal resection is associated with better long-term outcomes compared with anti–tumor necrosis factor (TNF) therapy for patients with Crohn’s disease (CD), based on new real-world evidence.

These findings add weight to previously reported data from the LIR!C trial, suggesting that ileocecal resection should be considered a first-line treatment option for CD, reported principal investigator Kristine H. Allin, MD, PhD, of Aalborg University, Copenhagen.

“The LIR!C randomized clinical trial has demonstrated comparable quality of life with ileocecal resection and infliximab as a first-line treatment for limited, nonstricturing ileocecal CD at 1 year of follow-up, and improved outcomes with ileocecal resection on retrospective analysis of long-term follow-up data,” the investigators wrote in Gastroenterology. “However, in the real world, the long-term impact of early ileocecal resection for CD, compared with medical therapy, remains largely unexplored.”

To gather these real-world data, the investigators turned to the Danish National Patient Registry and the Danish National Prescription Registry, which included 1,279 individuals diagnosed with CD between 2003 and 2018 who received anti-TNF therapy or underwent ileocecal resection within 1 year of diagnosis. Within this group, slightly less than half underwent ileocecal resection (45.4%) while the remainder (54.6%) received anti-TNF therapy.

The primary outcome was a composite of one or more events: perianal CD, CD-related surgery, systemic corticosteroid exposure, and CD-related hospitalization. Secondary analyses evaluated the relative risks of these same four events as independent entities.

Multifactor-adjusted Cox proportional hazards regression analysis revealed that patients who underwent ileocecal resection had a 33% lower risk of the composite outcome compared with those who received anti-TNF therapy (adjusted hazard ratio [aHR], 0.67; 95% CI, 0.54-0.83).
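
As a rough illustration of the statistics behind a figure like “33% lower risk (aHR, 0.67),” the sketch below fits a covariate-adjusted Cox proportional hazards model with the lifelines library on a small synthetic dataset. The data, column names, and covariates are invented for illustration; this is not the Danish registry analysis, only a minimal example of how an adjusted hazard ratio is obtained and read.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic cohort: 1 = ileocecal resection, 0 = anti-TNF (hypothetical labels).
rng = np.random.default_rng(0)
n = 500
surgery = rng.integers(0, 2, n)
age = rng.normal(35, 10, n)

# Simulate event times with a lower hazard in the surgery group (true HR ~ exp(-0.4) ~ 0.67).
hazard = 0.10 * np.exp(-0.4 * surgery + 0.01 * (age - 35))
time = rng.exponential(1.0 / hazard)
event = (time < 10).astype(int)          # administrative censoring at 10 years
time = np.minimum(time, 10)

df = pd.DataFrame({"time": time, "event": event, "surgery": surgery, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")   # adjusts for surgery and age jointly

hr = cph.hazard_ratios_["surgery"]
print(f"adjusted HR for surgery: {hr:.2f} -> {100 * (1 - hr):.0f}% lower hazard")
```

The “33% lower risk” reported above is simply 1 minus the hazard ratio, expressed as a percentage.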

In the secondary analyses, which examined risks for each component of the composite outcome, the surgery group had a significantly lower risk of CD-related surgery (aHR, 0.56; 95% CI, 0.39-0.80) and corticosteroid exposure (aHR, 0.71; 95% CI, 0.54-0.92), but not perianal CD or CD-related hospitalization.

After 5 years, half of the patients (49.7%) who underwent ileocecal resection were not receiving any treatment for CD. At the same timepoint, a slightly lower percentage of this group (46.3%) had started immunomodulator therapy, while 16.8% started anti-TNF therapy. Just 1.8% of these patients required a second intestinal resection.

Manasi Agrawal, MD, Icahn School of Medicine at Mount Sinai, New York

“To our knowledge, these are the first real-world data in a population-based cohort with long-term follow-up of early ileocecal resection compared with anti-TNF therapy for newly diagnosed ileal and ileocecal CD,” the investigators wrote. “These data suggest that ileocecal resection may have a role as first-line therapy in Crohn’s disease management and challenge the current paradigm of reserving surgery for complicated Crohn’s disease refractory or intolerant to medications.”

Corresponding author Manasi Agrawal, MD, of Icahn School of Medicine at Mount Sinai, New York, suggested that “validation of our findings in external cohorts [is needed], and understanding of factors associated with improved outcomes following ileocecal resection.”

For clinicians and patients choosing between first-line anti-TNF therapy and ileocecal resection using currently available evidence, Dr. Agrawal suggested that a variety of factors need to be considered, including disease location, the extent of terminal ileum involved, the presence of complications such as stricture or fistula, comorbid conditions, access to biologics, financial considerations, and patient preferences.

Dr. Benjamin Cohen, Cleveland Clinic

Benjamin Cohen, MD, staff physician and co-section head and clinical director for inflammatory bowel diseases in the department of gastroenterology, hepatology, and nutrition at Cleveland Clinic, called this “an important study” because it offers the first real-world evidence to support the findings from the LIR!C trial.

Dr. Cohen agreed with Dr. Agrawal that more work is needed to determine which patients benefit most from early ileocecal resection, although he suggested that known risk factors for worse outcomes, such as early age at diagnosis, penetrating features of disease, or perianal disease, may strengthen a patient’s candidacy for surgery.

Still, based on the “fairly strong” body of data now available, he suggested that all patients should be educated about first-line ileocecal resection, as it is a “reasonable” approach.

“It’s always important to present surgery as a treatment option,” Dr. Cohen said in an interview. “We don’t want to think of surgery as a last resort, or a failure, because that really colors it in a negative light, and then that ultimately impacts patients’ quality of life, and their perception of outcomes.”

The study was supported by the Danish National Research Foundation. The investigators disclosed no conflicts of interest. Dr. Cohen disclosed consulting and speaking honoraria from AbbVie.

Publications
Topics
Sections

Early ileocecal resection is associated with better long-term outcomes compared with anti–tumor necrosis factor (TNF) therapy for patients with Crohn’s disease (CD), based on new real-world evidence.

These findings add weight to previously reported data from the LIR!C trial, suggesting that ileocecal resection should be considered a first-line treatment option for CD, reported principal investigator Kristine H. Allin, MD, PhD, of Aalborg University, Copenhagen.

“The LIR!C randomized clinical trial has demonstrated comparable quality of life with ileocecal resection and infliximab as a first-line treatment for limited, nonstricturing ileocecal CD at 1 year of follow-up, and improved outcomes with ileocecal resection on retrospective analysis of long-term follow-up data,” the investigators wrote in Gastroenterology. “However, in the real world, the long-term impact of early ileocecal resection for CD, compared with medical therapy, remains largely unexplored.”

To gather these real-world data, the investigators turned to the Danish National Patient Registry and the Danish National Prescription Registry, which included 1,279 individuals diagnosed with CD between 2003 and 2018 who received anti-TNF therapy or underwent ileocecal resection within 1 year of diagnosis. Within this group, slightly less than half underwent ileocecal resection (45.4%) while the remainder (54.6%) received anti-TNF therapy.

The primary outcome was a composite of one or more events: perianal CD, CD-related surgery, systemic corticosteroid exposure, and CD-related hospitalization. Secondary analyses evaluated the relative risks of these same four events as independent entities.

Multifactor-adjusted Cox proportional hazards regression analysis revealed that patients who underwent ileocecal resection had a 33% lower risk of the composite outcome compared with those who received anti-TNF therapy (adjusted hazard ratio [aHR], 0.67; 95% CI, 0.54-0.83).

In the secondary analyses, which examined risks for each component of the composite outcome, the surgery group had a significantly lower risk of CD-related surgery (aHR, 0.56; 95% CI, 0.39-0.80) and corticosteroid exposure (aHR, 0.71; 95% CI, 0.54-0.92), but not perianal CD or CD-related hospitalization.

After 5 years, half of the patients (49.7%) who underwent ileocecal resection were not receiving any treatment for CD. At the same timepoint, a slightly lower percentage of this group (46.3%) had started immunomodulator therapy, while 16.8% started anti-TNF therapy. Just 1.8% of these patients required a second intestinal resection.

Manasi Agrawal, MD, of the Icahn School of Medicine at Mount Sinai, New York,
Icahn School of Medicine at Mount Sinai
Dr. Manasi Agrawal

“To our knowledge, these are the first real-world data in a population-based cohort with long-term follow-up of early ileocecal resection compared with anti-TNF therapy for newly diagnosed ileal and ileocecal CD,” the investigators wrote. “These data suggest that ileocecal resection may have a role as first-line therapy in Crohn’s disease management and challenge the current paradigm of reserving surgery for complicated Crohn’s disease refractory or intolerant to medications.”

Corresponding author Manasi Agrawal, MD, of Icahn School of Medicine at Mount Sinai, New York, suggested that “validation of our findings in external cohorts [is needed], and understanding of factors associated with improved outcomes following ileocecal resection.”

For clinicians and patients choosing between first-line anti-TNF therapy versus ileocecal resection using currently available evidence, Dr. Agrawal suggested that a variety of factors need to be considered, including disease location, extent of terminal ileum involved, presence of complications such as stricture, fistula, comorbid conditions, access to biologics, financial considerations, and patient preferences.

Dr. Benjamin Cohen, staff physician and co-section head and clinical director for inflammatory bowel diseases in the department of gastroenterology, hepatology, and nutrition at Cleveland Clinic
Cleveland Clinic
Dr. Benjamin Cohen

Benjamin Cohen, MD, staff physician and co-section head and clinical director for inflammatory bowel diseases in the department of gastroenterology, hepatology, and nutrition at Cleveland Clinic, called this “an important study” because it offers the first real-world evidence to support the findings from the LIR!C trial.

Dr. Cohen agreed with Dr. Agrawal that more work is needed to determine which patients benefit most from early ileocecal resection, although he suggested that known risk factors for worse outcomes — such as early age at diagnosis, penetrating features of disease, or perianal disease — may increase strength of surgical candidacy.

Still, based on the “fairly strong” body of data now available, he suggested that all patients should be educated about first-line ileocecal resection, as it is “reasonable” approach.

“It’s always important to present surgery as a treatment option,” Dr. Cohen said in an interview. “We don’t want to think of surgery as a last resort, or a failure, because that really colors it in a negative light, and then that ultimately impacts patients’ quality of life, and their perception of outcomes.”

The study was supported by the Danish National Research Foundation. The investigators disclosed no conflicts of interest. Dr. Cohen disclosed consulting and speaking honoraria from AbbVie.

Early ileocecal resection is associated with better long-term outcomes compared with anti–tumor necrosis factor (TNF) therapy for patients with Crohn’s disease (CD), based on new real-world evidence.

These findings add weight to previously reported data from the LIR!C trial, suggesting that ileocecal resection should be considered a first-line treatment option for CD, reported principal investigator Kristine H. Allin, MD, PhD, of Aalborg University, Copenhagen.

“The LIR!C randomized clinical trial has demonstrated comparable quality of life with ileocecal resection and infliximab as a first-line treatment for limited, nonstricturing ileocecal CD at 1 year of follow-up, and improved outcomes with ileocecal resection on retrospective analysis of long-term follow-up data,” the investigators wrote in Gastroenterology. “However, in the real world, the long-term impact of early ileocecal resection for CD, compared with medical therapy, remains largely unexplored.”

To gather these real-world data, the investigators turned to the Danish National Patient Registry and the Danish National Prescription Registry, which included 1,279 individuals diagnosed with CD between 2003 and 2018 who received anti-TNF therapy or underwent ileocecal resection within 1 year of diagnosis. Within this group, slightly less than half underwent ileocecal resection (45.4%) while the remainder (54.6%) received anti-TNF therapy.

The primary outcome was a composite of one or more events: perianal CD, CD-related surgery, systemic corticosteroid exposure, and CD-related hospitalization. Secondary analyses evaluated the relative risks of these same four events as independent entities.

Multifactor-adjusted Cox proportional hazards regression analysis revealed that patients who underwent ileocecal resection had a 33% lower risk of the composite outcome compared with those who received anti-TNF therapy (adjusted hazard ratio [aHR], 0.67; 95% CI, 0.54-0.83).

In the secondary analyses, which examined risks for each component of the composite outcome, the surgery group had a significantly lower risk of CD-related surgery (aHR, 0.56; 95% CI, 0.39-0.80) and corticosteroid exposure (aHR, 0.71; 95% CI, 0.54-0.92), but not perianal CD or CD-related hospitalization.

After 5 years, half of the patients (49.7%) who underwent ileocecal resection were not receiving any treatment for CD. At the same timepoint, a slightly lower percentage of this group (46.3%) had started immunomodulator therapy, while 16.8% started anti-TNF therapy. Just 1.8% of these patients required a second intestinal resection.

Manasi Agrawal, MD, of the Icahn School of Medicine at Mount Sinai, New York,
Icahn School of Medicine at Mount Sinai
Dr. Manasi Agrawal

“To our knowledge, these are the first real-world data in a population-based cohort with long-term follow-up of early ileocecal resection compared with anti-TNF therapy for newly diagnosed ileal and ileocecal CD,” the investigators wrote. “These data suggest that ileocecal resection may have a role as first-line therapy in Crohn’s disease management and challenge the current paradigm of reserving surgery for complicated Crohn’s disease refractory or intolerant to medications.”

Corresponding author Manasi Agrawal, MD, of Icahn School of Medicine at Mount Sinai, New York, suggested that “validation of our findings in external cohorts [is needed], and understanding of factors associated with improved outcomes following ileocecal resection.”

For clinicians and patients choosing between first-line anti-TNF therapy and ileocecal resection using currently available evidence, Dr. Agrawal suggested that a variety of factors need to be considered, including disease location, extent of terminal ileum involved, presence of complications such as stricture or fistula, comorbid conditions, access to biologics, financial considerations, and patient preferences.

Benjamin Cohen, MD, staff physician and co-section head and clinical director for inflammatory bowel diseases in the department of gastroenterology, hepatology, and nutrition at Cleveland Clinic, called this “an important study” because it offers the first real-world evidence to support the findings from the LIR!C trial.

Dr. Cohen agreed with Dr. Agrawal that more work is needed to determine which patients benefit most from early ileocecal resection, although he suggested that known risk factors for worse outcomes — such as early age at diagnosis, penetrating features of disease, or perianal disease — may increase strength of surgical candidacy.

Still, based on the “fairly strong” body of data now available, he suggested that all patients should be educated about first-line ileocecal resection, as it is a “reasonable” approach.

“It’s always important to present surgery as a treatment option,” Dr. Cohen said in an interview. “We don’t want to think of surgery as a last resort, or a failure, because that really colors it in a negative light, and then that ultimately impacts patients’ quality of life, and their perception of outcomes.”

The study was supported by the Danish National Research Foundation. The investigators disclosed no conflicts of interest. Dr. Cohen disclosed consulting and speaking honoraria from AbbVie.

NAFLD familial risk score outperforms FIB-4 index for identifying advanced fibrosis

A new risk model for nonalcoholic fatty liver disease (NAFLD) could offer a simpler and more accurate way of predicting advanced fibrosis in first-degree relatives, according to investigators.

By leveraging basic clinical factors instead of more advanced diagnostic findings, the NAFLD Familial Risk Score is more scalable than existing strategies for identifying advanced fibrosis, reported lead author Rohit Loomba, MD, of the University of California San Diego, La Jolla, and colleagues.

“[G]iven the enormous global burden of NAFLD, it is not possible to perform an imaging-based fibrosis assessment on all individuals with NAFLD,” the investigators wrote in Clinical Gastroenterology and Hepatology. “The ability to identify individuals at risk for advanced fibrosis using routine clinical history taking is a major unmet need in clinical practice.”

To this end, the investigators conducted a prospective, cross-sectional, familial study that comprised 242 consecutive probands and 396 first-degree relatives. All participants underwent liver fibrosis evaluation, most with magnetic resonance elastography.

Dr. Loomba and colleagues first developed the risk model by analyzing data from a derivation cohort of 220 individuals in San Diego, among whom 92 were first-degree relatives of probands without advanced fibrosis and 128 were first-degree relatives of probands with NAFLD and advanced fibrosis.

Their analysis identified the following four risk factors for advanced fibrosis: age of 50 years or more, presence of type 2 diabetes mellitus, obesity, and family history of NAFLD with advanced fibrosis. These variables were used to construct the NAFLD Familial Risk Score, with age and diabetes each accounting for one point, and obesity and family history contributing two points each, for a possible total of six points.

Within the derivation cohort, this scoring system demonstrated an area under the receiver operating characteristic curve (AUROC) of 0.85 (95% CI, 0.76-0.92), suggesting high accuracy for identifying advanced fibrosis.

When applied to a validation cohort of 176 individuals in Finland, the AUROC was higher still, at 0.94 (95% CI, 0.89-0.99). For comparison, in the same group, the FIB-4 index had a significantly lower AUROC of 0.70 (P = .02).

“The NAFLD Familial Risk Score potentially can be used by family members who are aware of the diagnosis of advanced fibrosis in the proband,” the investigators wrote. “Information on how to calculate and interpret the score can be conveyed to first-degree relatives by the proband, or by medical staff to first-degree relatives who accompany the proband to medical appointments. First-degree relatives with a score of four points or more (corresponding to 13% risk of NAFLD with advanced fibrosis) may consider undergoing an imaging-based fibrosis assessment.”
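
Because the score uses only four yes/no or threshold items, it can be computed in a few lines. The sketch below encodes the published point assignments and the four-point referral threshold; the function and argument names are illustrative, not taken from the paper.

    # NAFLD Familial Risk Score: published point assignments (maximum 6 points).
    # Function and argument names are illustrative.
    def nafld_familial_risk_score(age_years: int,
                                  has_type2_diabetes: bool,
                                  is_obese: bool,
                                  fdr_with_nafld_advanced_fibrosis: bool) -> int:
        score = 0
        if age_years >= 50:
            score += 1   # age of 50 years or more: 1 point
        if has_type2_diabetes:
            score += 1   # type 2 diabetes: 1 point
        if is_obese:
            score += 2   # obesity: 2 points
        if fdr_with_nafld_advanced_fibrosis:
            score += 2   # family history of NAFLD with advanced fibrosis: 2 points
        return score

    score = nafld_familial_risk_score(age_years=56, has_type2_diabetes=False,
                                      is_obese=True,
                                      fdr_with_nafld_advanced_fibrosis=True)
    if score >= 4:  # authors' threshold, corresponding to ~13% risk of advanced fibrosis
        print(f"Score {score}: consider an imaging-based fibrosis assessment")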

Dr. Loomba and colleagues highlighted the simplicity of their scoring system, which does not require a calculator or any information more complex than a basic clinical history.

“It may be a helpful alternative to FIB-4 for identifying NAFLD with advanced fibrosis among first-degree relatives in clinical practice because it does not require laboratory tests,” they wrote, noting that this, along with the other comparative advantages of the new risk score, “may have implications for surveillance in NAFLD.”

The study was supported by the National Center for Advancing Translational Sciences, the National Institute of Diabetes and Digestive and Kidney Diseases, the National Heart, Lung, and Blood Institute, and others. The investigators disclosed relationships with Aardvark Therapeutics, Altimmune, Alnylam/Regeneron, and others.

My patients with metabolic dysfunction-associated steatotic liver disease (MASLD) and advanced fibrosis and cirrhosis often worry about the risk of MASLD and advanced fibrosis among their relatives, especially their children and siblings. Based on my clinical experience, I tell them that their first-degree relatives should get checked for MASLD with liver enzymes and a liver ultrasound, and I advise that if either of these tests is abnormal, they should see a gastroenterologist for further evaluation. In this paper, Huang and colleagues developed and validated a NAFLD Familial Risk Score to identify advanced fibrosis in the first-degree relatives of patients with NAFLD and advanced fibrosis. This score consists of age greater than 50 years (one point), BMI greater than 30 kg/m2 (two points), type 2 diabetes (one point), and a first-degree relative with NAFLD and advanced fibrosis (two points).

A score of ≥ 4 denotes heightened risk for NAFLD with advanced fibrosis in a first-degree relative, and such individuals should be directed to a health care provider for further evaluation. This important observation, while it needs confirmation by other research groups, is practice changing for me. The next time I see a patient with MASLD and advanced fibrosis, I will not only ask for the family history of liver disease but will also attempt to estimate the risk for MASLD and advanced fibrosis among the first-degree relatives using this scoring system. If you are caring for patients with NAFLD, this scoring system is worth considering for incorporation into your clinical practice.

Naga Chalasani, MD, AGAF, is a practicing hepatologist and David W. Crabb Professor of Gastroenterology and vice president for academic affairs at Indiana University School of Medicine and Indiana University Health in Indianapolis. He declared no conflicts of interest for this commentary.

Acylcarnitines could drive IBD via dysbiosis

Increased levels of carnitine and acylcarnitines are associated with increased dysbiosis and disease activity in pediatric inflammatory bowel disease (IBD), according to investigators.

These findings improve our understanding of IBD pathogenesis and disease course, and could prove valuable in biomarker research, reported lead author Gary D. Wu, MD, of the University of Pennsylvania, Philadelphia, and colleagues.

In health, carnitine and acylcarnitines aid in fatty acid transport, the investigators wrote in September in Cellular and Molecular Gastroenterology and Hepatology. Acylcarnitines are also involved in metabolic signaling, and in the absence of sufficient short-chain fatty acids may serve as an alternative energy source for the intestinal epithelium.

“Recently, we and others have shown that fecal acylcarnitines are increased in patients with IBD, especially during dysbiosis,” they noted. “However, the mechanism(s) responsible for the increase of fecal acylcarnitines in IBD and their biological function have not been elucidated.”

The present study aimed to address this knowledge gap by characterizing both carnitine and acylcarnitines in pediatric IBD.

First, the investigators confirmed that both carnitine and acylcarnitines were elevated in fecal samples from pediatric patients with IBD.

Next, they analyzed fecal samples from subjects in the Food and Resulting Microbiota and Metabolome (FARMM) study, which compared microbiota recovery after gut purge and antibiotics among participants eating an omnivorous diet, a vegan diet, or an exclusive enteral nutrition (EEN) diet lacking in fiber. After the antibiotics, levels of fecal carnitine and acylcarnitines increased significantly in all groups, suggesting that microbiota were consuming these molecules.

To clarify the relationship between inflammation and levels of carnitine and acylcarnitines in the absence of microbiota, Dr. Wu and colleagues employed a germ-free mouse model with dextran sodium sulfate (DSS)–induced colitis. Levels of both molecule types were significantly increased in bile and plasma of mice with colitis versus those that were not exposed to DSS.

“Because the gut microbiota consumes both carnitine and acylcarnitines, these results are consistent with the notion that the increase of these metabolites in the feces of patients with IBD is driven by increased biliary delivery of acylcarnitines to the lumen combined with the reduced number and function of mitochondria in the colonic epithelium as previously reported,” the investigators wrote.

Further experiments with plated cultures and mice revealed that various bacterial species consumed carnitine and acylcarnitines in distinct patterns. Enterobacteriaceae demonstrated a notable proclivity for consumption in vitro and within the murine gut.

“As a high-dimensional analytic feature, the pattern of fecal acylcarnitines, perhaps together with bacterial taxonomy, may have utility as a biomarker for the presence or prognosis of IBD,” Dr. Wu and colleagues concluded. “In addition, based on currently available information about the impact of carnitine on the biology of Enterobacteriaceae, acylcarnitines also may have an important functional effect on the biology of the gut microbiota that is relevant to the pathogenesis or course of disease in patients with IBD.”

The study was supported by the Crohn’s and Colitis Foundation, the PennCHOP Microbiome Program, the Penn Center for Nutritional Science and Medicine, and others. The investigators disclosed no conflicts of interest.

The identification of noninvasive biomarkers for inflammatory bowel disease (IBD) is key to better characterizing disease pathogenesis. In this new publication, Lemons et al. describe deleterious effects of gut luminal carnitine and acylcarnitines in pediatric IBD patients, showing that these metabolites can serve as energy substrates for the microbiota, especially Enterobacteriaceae, promoting the growth of pathobionts and contributing to the persistence of dysbiosis, which, in turn, may drive the course of IBD. In fact, acylcarnitines had been highlighted as a potential new target for IBD during dysbiosis by a previous multi-omics study of the gut microbiome. Moreover, Dr. Gary Wu’s team has shown that the intestinal epithelium can take up and use acylcarnitines as an alternative source for energy production. However, epithelial mitochondrial dysfunction triggered by inflammation reduces the capacity of colonocytes to consume long-chain fatty acids, thus increasing fecal levels of acylcarnitines, as described in IBD patients.

Distinct host- and microbiota-derived factors jointly contribute to the elevation of luminal acylcarnitines, which the authors suggest may be both a symptom and a cause of IBD. Further studies will be needed to elucidate the fine balance of this relationship, which may have potential as a clinical biomarker for the diagnosis and prognosis of IBD.

Renan Oliveira Corrêa, PhD, is a postdoctoral researcher at the Imagine Institute of Genetic Diseases in Paris. Nadine Cerf-Bensussan, MD, PhD, is a research director at the French National Institute of Health and Medical Research (INSERM), and head of the Laboratory of Intestinal Immunity at Imagine Institute in Paris and Paris University. They have no conflicts of interest.

AGA clinical practice update addresses use of vasoactive drugs, IV albumin in cirrhosis

The American Gastroenterological Association (AGA) has released a new Clinical Practice Update (CPU) guiding the use of vasoactive drugs and intravenous albumin in patients with cirrhosis.

The publication, authored by Vincent Wai-Sun Wong, MBChB, MD, and colleagues, includes 12 best-practice-advice statements concerning 3 common clinical scenarios: variceal hemorrhage, ascites and spontaneous bacterial peritonitis, and acute kidney injury and hepatorenal syndrome.

These complications of liver decompensation “are manifestations of portal hypertension with a [consequent] vasodilatory–hyperdynamic circulatory state, resulting in progressive decreases in effective arterial blood volume and renal perfusion,” the update authors wrote in November in Gastroenterology. “Because a potent vasoconstrictor, terlipressin, was recently approved by the United States Food and Drug Administration and because recent trials have explored use of intravenous albumin in other settings, it was considered that a best practice update would be relevant regarding the use of vasoactive drugs and intravenous albumin in these 3 specific scenarios.”
 

Variceal Hemorrhage

Variceal hemorrhage accounts for 70% of all upper GI hemorrhage in patients with cirrhosis and carries a 6-week mortality rate as high as 43%. Dr. Wong and colleagues advise immediate initiation of vasoactive drugs upon suspicion of variceal hemorrhage, ideally before therapeutic and/or diagnostic endoscopy.

“The goals of management of acute variceal hemorrhage include initial hemostasis, preventing early rebleeding, and reducing in-hospital and 6-week mortality,” they wrote, noting that vasoactive drugs are effective at stopping bleeding in up to 8 out of 10 cases.

In patients with acute variceal hemorrhage undergoing endoscopic hemostasis, vasoactive agents should be continued for 2-5 days to prevent early rebleeding, according to the second best-practice-advice statement.

The third statement suggests octreotide as the drug of choice for variceal hemorrhage due to its favorable safety profile.

“Nowadays, vasopressin is no longer advised in patients with acute variceal hemorrhage because of a high risk of cardiovascular adverse events,” the update authors noted.
 

Ascites and Spontaneous Bacterial Peritonitis

In cases requiring large-volume (greater than 5 L) paracentesis, intravenous albumin should be administered at time of fluid removal, according to the update. In these patients, albumin reduces the risk of post-paracentesis circulatory dysfunction (defined as an increase in plasma renin activity), thereby reducing the risk of acute kidney injury.

Intravenous albumin should also be considered in patients with spontaneous bacterial peritonitis as this can overcome associated vasodilatation and decreased effective arterial blood volume, which may lead to acute kidney injury if untreated. In contrast, because of a demonstrated lack of efficacy, albumin is not advised in infections other than spontaneous bacterial peritonitis, unless associated with acute kidney injury.

Long-term albumin administration should be avoided in patients with cirrhosis and uncomplicated ascites, whether they are hospitalized or not, as evidence is lacking to support a consistent beneficial effect.

The update also advises against vasoconstrictors in patients with uncomplicated ascites or spontaneous bacterial peritonitis, and after large-volume paracentesis, again due to a lack of supporting evidence.
 

Acute Kidney Injury and Hepatorenal Syndrome

Dr. Wong and colleagues called albumin “the volume expander of choice in hospitalized patients with cirrhosis and ascites presenting with acute kidney injury”; however, they cautioned that the dose of albumin “should be tailored to the volume status of the patient.”

The update authors wrote that terlipressin and norepinephrine are both suitable options for patients with cirrhosis and hepatorenal syndrome, but they favored terlipressin based on the available evidence and suggested concomitant albumin administration, which may further improve renal blood flow by filling the central circulation.

Terlipressin also has the advantage over norepinephrine of being administrable via a peripheral line without the need for intensive care unit monitoring, the update authors wrote. The agent is contraindicated in patients with hypoxia or with coronary, peripheral, or mesenteric ischemia, and it should be used with caution in patients with acute-on-chronic liver failure (ACLF) grade 3, according to the publication. Risks of terlipressin may also outweigh benefits in patients with a serum creatinine greater than 5 mg/dL and in those listed for transplant with a MELD score of 35 or higher.
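
For illustration only, the contraindications and cautions named above can be collected into a simple checklist helper. This is a sketch under stated assumptions — the field names and structure are hypothetical, and it is not a validated clinical decision tool.

    # Sketch of the terlipressin cautions listed in the update; not a clinical decision tool.
    # Field names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class HrsCandidate:
        hypoxia: bool
        coronary_peripheral_or_mesenteric_ischemia: bool
        aclf_grade: int               # acute-on-chronic liver failure grade
        serum_creatinine_mg_dl: float
        listed_for_transplant: bool
        meld_score: int

    def terlipressin_flags(p: HrsCandidate) -> list[str]:
        flags = []
        if p.hypoxia or p.coronary_peripheral_or_mesenteric_ischemia:
            flags.append("contraindicated: hypoxia or coronary/peripheral/mesenteric ischemia")
        if p.aclf_grade >= 3:
            flags.append("use with caution: ACLF grade 3")
        if p.serum_creatinine_mg_dl > 5:
            flags.append("risks may outweigh benefits: serum creatinine > 5 mg/dL")
        if p.listed_for_transplant and p.meld_score >= 35:
            flags.append("risks may outweigh benefits: listed for transplant with MELD >= 35")
        return flags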

The Clinical Practice Update was commissioned and supported by AGA. The authors disclosed relationships with Advanz, Boehringer Ingelheim, 89bio, and others.

Antireflux surgery may not reduce cancer risk in Barrett’s esophagus

Antireflux surgery may be no more effective than antireflux medication for reducing risk of esophageal adenocarcinoma (EAC) among patients with Barrett’s esophagus, according to a Nordic retrospective study.

Risk of EAC was higher among patients who underwent surgery, and risk appeared to increase over time, suggesting that postoperative patients should continue to participate in surveillance programs, reported lead author Jesper Lagergren, MD, PhD, of the Karolinska Institutet, Stockholm, and colleagues.

“Antireflux surgery with fundoplication increases the ability of the gastroesophageal anatomic and physiological barrier to prevent reflux, and can thus prevent any carcinogenic gastric content from reaching the esophagus, including both acid and bile,” the investigators wrote in Gastroenterology, noting that surgery reduces esophageal acid exposure to a greater degree than medication. “Antireflux surgery may thus prevent esophageal adenocarcinoma better than antireflux medication.”

Three meta-analyses to date, however, have failed to provide consistent support for this hypothesis.

“Most of the studies included in these meta-analyses came from single centers, were of small sample size, examined only one treatment arm, and had a short or incomplete follow-up, and ... were hampered by heterogeneity among the included studies,” they noted.

For the present study, Dr. Lagergren and colleagues analyzed national registry data from 33,939 patients with Barrett’s esophagus in Denmark, Finland, Norway, and Sweden. Out of this group, 542 patients (1.6%) had undergone antireflux surgery, while the remainder were managed with antireflux medication.

In both groups, approximately two-thirds of the patients were men. The median age at enrollment was about a decade higher in the medication group (66 vs. 54 years), and this group also tended to have more comorbidities.

After a follow-up period as long as 32 years, the absolute rates of EAC were 1.3% and 2.6% in the medication and surgery groups, respectively. Multivariate analysis, with adjustments for sex, age, year, and comorbidities, revealed that postsurgical patients had a 90% increased risk of EAC (adjusted hazard ratio [aHR], 1.9; 95% CI, 1.1-3.5), versus patients treated with antireflux medication alone.

The relatively higher risk of EAC appeared to increase over time, based on a nonsignificant hazard ratio of 1.8 during the 1- to 4-year follow-up period (HR, 1.8; 95% CI, 0.6-5.0), versus a significant, fourfold risk elevation during the 10- to 32-year follow-up period (HR, 4.4; 95% CI, 1.4-13.5).
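
One way the period-specific estimates above could be produced is by restricting each analysis to patients still under observation at the start of the window, treating that start as a delayed-entry time and censoring events beyond the window’s end. The sketch below illustrates the idea with the lifelines package; the data frame columns are assumptions, and this is not the investigators’ actual code.

    # Illustrative window-restricted hazard ratio via delayed entry (left truncation).
    # Column names are hypothetical.
    import pandas as pd
    from lifelines import CoxPHFitter

    def window_hazard_ratio(df: pd.DataFrame, start: float, stop: float) -> pd.DataFrame:
        at_risk = df[df["followup_years"] > start].copy()   # still under observation at `start`
        at_risk["entry"] = start                             # delayed entry into the risk set
        at_risk["stop_time"] = at_risk["followup_years"].clip(upper=stop)
        at_risk["event_in_window"] = (
            at_risk["eac_event"].astype(bool) & (at_risk["followup_years"] <= stop)
        ).astype(int)                                        # events after `stop` are censored

        cph = CoxPHFitter()
        cph.fit(
            at_risk[["stop_time", "event_in_window", "entry",
                     "antireflux_surgery", "age", "sex_male", "calendar_year",
                     "comorbidity_index"]],
            duration_col="stop_time", event_col="event_in_window", entry_col="entry",
        )
        return cph.summary.loc[["antireflux_surgery"],
                               ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]]

    # e.g., window_hazard_ratio(df, start=10, stop=32) for the 10- to 32-year window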

“In this cohort of patients with Barrett’s esophagus, the risk of esophageal adenocarcinoma did not decrease after antireflux surgery compared with antireflux medication,” the investigators wrote. “Instead, the risk was increased throughout the follow-up among patients having undergone antireflux surgery.”

Dr. Lagergren and colleagues suggested that the reason for relatively higher cancer risk in the group that underwent surgery likely stems from early and prolonged acid exposure.

“[P]erforming antireflux surgery after years of GERD may be too late to enable a cancer-preventative effect, and most of the patients first diagnosed with Barrett’s esophagus reported a history of many years of GERD symptoms,” they wrote, suggesting that carcinogenic processes had already been set in motion by the time surgery was performed.

“[P]atients with Barrett’s esophagus who undergo antireflux surgery remain at an increased risk of esophageal adenocarcinoma and should continue taking part in surveillance programs,” the investigators concluded.

The study was funded by the Swedish Cancer Society, Swedish Research Council, and Stockholm County Council. The investigators disclosed no conflicts of interest.

Esophageal adenocarcinoma (EAC) has been increasing in frequency for decades. EAC’s only known precursor is Barrett’s esophagus (BE), a complication of GERD with chronic esophageal inflammation (reflux esophagitis). Chronic inflammation can predispose to cancer and refluxed acid itself can cause potentially carcinogenic double-strand DNA breaks in Barrett’s metaplasia. PPIs, which block secretion of the gastric acid that causes reflux esophagitis and DNA damage, are recommended to BE patients for cancer prevention. Logical as that practice may seem, meta-analyses have reached contradictory conclusions regarding the cancer-preventive benefits of PPIs. PPIs do not stop the reflux of other potential carcinogens such as bile salts, and thus it has been argued that fundoplication, which blocks the reflux of all gastric material, should be superior to PPIs for cancer prevention. Plausible as that argument sounds, meta-analyses of the generally small and heterogeneous studies on this issue have not found consistently that antireflux surgery is superior to medical therapy for cancer prevention in BE.

Dr. Stuart J. Spechler

Now, a large, population-based cohort study by Åkerström et al. of Nordic BE patients followed for up to 32 years has found that the overall risk of EAC was higher for patients treated with fundoplication than for those treated with medication (adjusted HR 1.9, 95%CI 1.1-3.5). Furthermore, the EAC risk increased over time in the surgical patients. Well done as this study was, it has important limitations. The overall BE population was large (n=33,939), but only 1.6% (542 patients) had antireflux surgery, and only 14 of those developed EAC during follow-up. Those small numbers limit statistical power. Moreover, important residual confounding cannot be excluded. The surgical patients might have had more severe GERD than medical patients, and it is difficult to make a plausible argument for why fundoplication should increase EAC risk. Nevertheless, this study provides a good lesson on why a plausible argument needs supportive evidence before acting on it in clinical practice. While there may be some excellent reasons for recommending antireflux surgery over medication for patients with severe GERD, better esophageal cancer prevention does not appear to be one of them.
 

Stuart Jon Spechler, MD, is chief of the division of gastroenterology and codirector of the Center for Esophageal Diseases at Baylor University Medical Center, and codirector of the Center for Esophageal Research at Baylor Scott & White Research Institute, Dallas, Texas. Dr. Spechler is a consultant for Phathom Pharmaceuticals and ISOThrive, LLC.


Fewer than 1 out of 4 patients with HCV-related liver cancer receive antivirals

Article Type
Changed
Thu, 12/07/2023 - 18:12

Fewer than one out of four patients with hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC) receive oral interferon-free direct-acting antiviral agents (DAAs), and rates aren’t much better for patients seen by specialists, based on a retrospective analysis of private insurance claims.

The study also showed that patients receiving DAAs lived significantly longer, emphasizing the importance of prescribing these medications to all eligible patients, reported principal investigator Mindie H. Nguyen, MD, AGAF, of Stanford University Medical Center, Palo Alto, California, and colleagues.

“Prior studies have shown evidence of improved survival among HCV-related HCC patients who received DAA treatment, but not much is known about the current DAA utilization among these patients in the general US population,” said lead author Leslie Y. Kam, MD, a postdoctoral scholar in gastroenterology at Stanford Medicine, who presented the findings in November at the annual meeting of the American Association for the Study of Liver Diseases.

To generate real-world data, the investigators analyzed medical records from 3922 patients in Optum’s Clinformatics Data Mart Database. All patients had private medical insurance and received care for HCV-related HCC between 2015 and 2021.

“Instead of using institutional databases which tend to bias toward highly specialized tertiary care center patients, our study uses a large, national sample of HCV-HCC patients that represents real-world DAA treatment rates and survival outcomes,” Dr. Kam said in a written comment.

Within this cohort, fewer than one out of four patients (23.5%) received DAA, a rate that Dr. Kam called “dismally low.”

Patients with either compensated or decompensated cirrhosis had higher treatment rates than those without cirrhosis (24.2% and 24.5%, respectively, vs. 16.2%; P = .001). The investigators noted that more than half of the patients had decompensated cirrhosis, suggesting that HCV-related HCC was diagnosed late in the disease course.

Receiving care from a gastroenterologist or infectious disease physician also was associated with a higher treatment rate. Patients managed by a gastroenterologist alone had a treatment rate of 27.0%, while those who received care from a gastroenterologist or infectious disease doctor alongside an oncologist had a treatment rate of 25.6%, versus just 9.4% for those who received care from an oncologist alone, and 12.4% among those who did not see a specialist of any kind (P = .005).

These findings highlight “the need for a multidisciplinary approach to care in this population,” Dr. Kam suggested.

Echoing previous research, DAAs were associated with extended survival. A significantly greater percentage of patients who received DAAs were alive after 5 years, compared with patients who did not (47.2% vs. 35.2%; P < .001). After adjustment for comorbidities, HCC treatment, race/ethnicity, sex, and age, DAAs were associated with a 39% reduction in risk of death (adjusted hazard ratio, 0.61; 95% CI, 0.53-0.69; P < .001).

“There were also racial ethnic disparities in patient survival whether patients received DAA or not, with Black patients having worse survival,” Dr. Kam said. “As such, our study highlights that awareness of HCV remains low as does the use of DAA treatment. Therefore, culturally appropriate efforts to improve awareness of HCV must continue among the general public and health care workers as well as efforts to provide point of care accurate and rapid screening tests for HCV so that DAA treatment can be initiated in a timely manner for eligible patients. Continual education on the use of DAA treatment is also needed.”

Robert John Fontana, MD, AGAF, professor of medicine and transplant hepatologist at the University of Michigan, Ann Arbor, described the findings as “frustrating,” and “not the kind of stuff I like to hear about.

“Treatment rates are so low,” Dr. Fontana said, noting that even among gastroenterologists and infectious disease doctors, who should be well-versed in DAAs, antivirals were prescribed less than 30% of the time.

In an interview, Dr. Fontana highlighted the benefits of DAAs, including their ease-of-use and effectiveness.

“Hepatitis C was the leading reason that we had to do liver transplants in the United States for years,” he said. “Then once these really amazing drugs called direct-acting antivirals came out, they changed the landscape very quickly. It really was a game changer for my whole practice, and, nationally, the practice of transplant.”

Yet, this study and others suggest that these practice-altering agents are being underutilized, Dr. Fontana said. A variety of reasons could explain suboptimal usage, he suggested, including lack of awareness among medical professionals and the public, the recency of DAA approvals, low HCV testing rates, lack of symptoms in HCV-positive patients, and medication costs.

This latter barrier, at least, is dissolving, Dr. Fontana said. Some payers initially restricted which providers could prescribe DAAs, but now the economic consensus has swung in their favor, since curing patients of HCV brings significant health care savings down the line. This financial advantage—theoretically multiplied across 4-5 million Americans living with HCV—has bolstered a multi-institutional effort toward universal HCV screening, with testing recommended at least once in every person’s lifetime.

“It’s highly cost effective,” Dr. Fontana said. “Even though the drugs are super expensive, you will reduce cost by preventing the people streaming towards liver cancer or streaming towards liver transplant. That’s why all the professional societies—the USPSTF, the CDC—they all say, ‘OK, screen everyone.’ ”

Screening may be getting easier soon, Dr. Fontana predicted, as at-home HCV-testing kits are on the horizon, with development and adoption likely accelerated by the success of at-home viral testing during the COVID-19 pandemic.

Beyond broader screening, Dr. Fontana suggested that greater awareness of DAAs is needed both within and beyond the medical community.

He advised health care providers who don’t yet feel comfortable diagnosing or treating HCV to refer to their local specialist.

“That’s the main message,” Dr. Fontana said. “I’m always eternally hopeful that every little message helps.”

The investigators and Dr. Fontana disclosed no conflicts of interest.


Taste and smell changes linked with worse QOL and cognition in cirrhosis, renal failure

Article Type
Changed
Mon, 12/04/2023 - 12:41

Patients with cirrhosis or renal failure who experience changes in taste and smell may have worse quality of life (QOL) and may be more likely to exhibit cognitive impairment than those without these sensory changes, according to investigators.

Clinicians should screen for changes in taste and smell among patients at risk of cognitive changes, and offer nutritional interventions to support body weight and QOL, reported principal investigator Jasmohan S. Bajaj, MD, AGAF, of Virginia Commonwealth University, Richmond, and colleagues.

Dr. Jasmohan S. Bajaj, Virginia Commonwealth University, Richmond
Dr. Jasmohan S. Bajaj

“Cirrhosis is linked with poor nutrition, which could partly be due to anorexia in hepatic encephalopathy (HE) and coexistent renal failure,” the investigators wrote in their abstract, which Dr. Bajaj presented in November at the annual meeting of the American Association for the Study of Liver Diseases.

“We wanted to measure how changes in the brain in cirrhosis affect patients’ abilities to smell and taste, and study how that affects their quality of life,” Dr. Bajaj said in a written comment.

To this end, the investigators conducted an observational study involving 59 participants, among whom 22 were healthy, 21 had cirrhosis, and 16 had renal failure requiring dialysis.

“Prior studies individually have shown changes in taste and smell for these two organ failures,” Dr. Bajaj said. “We studied them together as well and linked these to quality of life and individual cognitive tests.”

Of note, individuals with past or current COVID-19, or with current or recent alcohol or tobacco use, were excluded.

Compared with healthy individuals, participants with cirrhosis or renal failure had significantly worse performance on a taste discrimination test, with perceptions of sweet and sour most affected.

Cognitive measurement with the Psychometric Hepatic Encephalopathy Score (PHES) and Stroop tests showed that scores were worse for patients with disease than for healthy participants. Taste discrimination significantly correlated with both cognitive test scores, regardless of HE or dialysis, whereas smell discrimination correlated only with the Stroop test.

Multivariable analysis revealed that better PHES scores and smell discrimination were linked with better taste discrimination. Similarly, better PHES scores and taste discrimination contributed to better smell discrimination. Eating impairment was associated with worse Stroop scores and worse olfactory-related QOL, suggesting that sensory changes, cognitive changes, and eating behaviors were all correlated.

“Health care providers ought to be alert to changes in patients’ eating habits, diet and weight as their liver and kidney disease worsen and as their brain function changes,” Dr. Bajaj said. “Nutritionists and others may be able to assist patients with a healthy diet and suggest ways to improve patients’ reports of their quality of life. Taste and smell are just a few aspects of the complicated assessment of health-related quality of life, brain dysfunction, and nutritional compromise in cirrhosis. We need to be mindful to not just focus on these aspects but to individualize care.”

Adrian M. Di Bisceglie, MD, hepatologist and emeritus professor of internal medicine at Saint Louis University, said the study was “well done,” and called the findings “an interesting little tidbit” that would probably not change his practice as a physician, but could be valuable for designing nutritional interventions.

Dr. Adrian M. Di Bisceglie, Saint Louis University
Saint Louis University
Dr. Adrian M. Di Bisceglie

In an interview, Dr. Di Bisceglie explained that a well-balanced diet with adequate caloric intake can help slow the muscle wasting that occurs with the condition, but creating a tasty menu can be challenging when patients are asked to restrict their sodium intake as a means of reducing fluid retention.

“Salt contributes substantially to the enjoyment of food,” Dr. Di Bisceglie said.

Although the study did not specifically report the salt level in patients’ diets, Dr. Di Bisceglie said the findings highlight the need for low-salt strategies to improve palatability. For example, he suggested increasing umami, or savory flavor, as this can be accomplished without adding a significant amount of salt.

When asked if changes in taste or smell might be used as simple screening tools to detect cognitive impairment in patients with cirrhosis, Dr. Di Bisceglie said that this might be “possible,” but is probably unnecessary.

“There is an easy bedside test that we’ve been using for decades [to predict hepatic encephalopathy], which is reading,” Dr. Di Bisceglie said, noting that patients with cognitive deficits often describe reading paragraphs repeatedly without comprehending what they have read.

The investigators and Dr. Di Bisceglie disclosed no conflicts of interest.


COVID livers are safe for transplant

Article Type
Changed
Mon, 12/04/2023 - 12:22

Transplanting livers from deceased donors who tested positive for SARS-CoV-2 is safe and has no significant impact on short-term outcomes of allografts or recipients, based on a national study with the longest follow-up to date.

Using livers from deceased patients with COVID-19 could be an opportunity to expand organ availability, reported principal investigator Nadim Mahmud, MD, of the University of Pennsylvania, Philadelphia, and colleagues.

Findings were presented in November at the annual meeting of the American Association for the Study of Liver Diseases.

“During the COVID-19 pandemic, a few centers trialed transplanting solid organs from COVID-19 positive donors with promising initial results,” presenting author Roy X. Wang, MD, of the University of Pennsylvania, said in a written comment. “However, these were smaller experiences with short follow-up that were not exclusively focused on liver transplantation. We wanted to explore the safety of liver transplantation from COVID-19 positive donors using a large national dataset with the longest follow up time to date.”

The dataset included 13,096 COVID-negative donors and 299 COVID-positive donors who died between July 2020 and July 2022, with cases and controls matched via propensity scoring. COVID-positive donors were significantly more likely to be younger and to have died of brain death; otherwise, no significant demographic differences were detected.
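
As a purely illustrative aside, the sketch below shows what 1:1 nearest-neighbor propensity-score matching generally looks like in Python. It is not the investigators' code; the column names (covid_positive, the covariate list), the 1:1 ratio, and matching with replacement are assumptions made for the example.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_donors(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    # Estimate each donor's probability of being COVID-positive (the
    # propensity score) from baseline covariates such as age and sex.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df["covid_positive"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])
    cases = df[df["covid_positive"] == 1]
    controls = df[df["covid_positive"] == 0]
    # For each COVID-positive donor, pick the COVID-negative donor with the
    # closest propensity score (1:1 matching, with replacement).
    nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
    _, idx = nn.kneighbors(cases[["pscore"]])
    matched_controls = controls.iloc[idx.ravel()]
    # Survival comparisons (e.g., Cox models) are then run on the matched cohort.
    return pd.concat([cases, matched_controls])

In practice, matching is often done without replacement and covariate balance is checked with standardized mean differences, but the core idea is the same.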

After 1 year of follow-up, no statistically significant differences in patient survival (subhazard ratio, 1.11; log-rank P = .70) or allograft survival (hazard ratio, 1.44; log-rank P = .14) were detected when comparing livers transplanted from positive versus negative donors.

“Our findings support and expand upon the results from earlier studies,” Dr. Wang concluded. “Liver transplant from COVID-19-positive donors has acceptable short-term outcomes and may represent an opportunity to expand organ access.”

Still, more work is needed to assess other clinical metrics and long-term outcomes, he added.

“While we were able to show similar patient and graft survival post-transplant between COVID-19-positive and negative donors, rates of other complications were not investigated such as episodes of rejection, liver injury, and hospitalizations,” Dr. Wang said. “Due to data limitations, we are only able to report on outcomes up to 1 year post transplant. Additional investigation will be needed to continue monitoring future outcomes and identifying any differences between recipients of COVID-19-positive and negative donors.”

Timucin Taner, MD, PhD, division chair of transplant surgery at Mayo Clinic, Rochester, Minnesota, said the study is important because it reaffirms the majority opinion among transplant physicians: These livers are safe.

In an interview, Dr. Taner suggested that Dr. Wang’s call for longer term data is “mostly science speak,” since 1 year of follow-up should be sufficient to determine liver viability.

Dr. Timucin Taner, Mayo Clinic, Rochester, Minn.
Mayo Clinic
Dr. Timucin Taner

“If a liver from a COVID-19 donor behaved well for a year, then chances are it’s not going to behave badly [later on] because of the virus at the time of donation,” Dr. Taner said.

He said the reported trends in usage of COVID-positive livers reflect early hesitancy that waned with rising vaccination rates, and recognition that the virus could not be spread via liver donation.

“To date, the only transmission [of SARS-CoV-2] from a transplant has been from a lung transplant,” Dr. Taner said, “and that was back in the days that we didn’t know about this. Other organs don’t transmit the disease, so they are easily usable.”

These new data should further increase confidence among both health care providers and patients, he added.

“[This study is] reassuring to the patients on the waitlist that these organs are very safe to use,” Dr. Taner said. “We as the transplant society are comfortable using them without any hesitation.”

The investigators and Dr. Taner disclosed no conflicts of interest.
