Bifidobacteria supplementation regulates newborn immune system

Supplementing breastfed infants with bifidobacteria promotes development of a well-regulated immune system, theoretically reducing risk of immune-mediated conditions like allergies and asthma, according to investigators.

These findings support the importance of early gut colonization with beneficial microbes, an event that may affect the immune system throughout life, reported lead author Bethany M. Henrick, PhD, director of immunology and diagnostics at Evolve Biosystems, Davis, Calif., and adjunct assistant professor at the University of Nebraska, Lincoln, and colleagues.

“Dysbiosis of the infant gut microbiome is common in modern societies and a likely contributing factor to the increased incidences of immune-mediated disorders,” the investigators wrote in Cell. “Therefore, there is great interest in identifying microbial factors that can support healthier immune system imprinting and hopefully prevent cases of allergy, autoimmunity, and possibly other conditions involving the immune system.”

Prevailing theory suggests that the rising incidence of neonatal intestinal dysbiosis – which is typical in developed countries – may be caused by a variety of factors, including cesarean sections; modern hygiene practices; antibiotics, antiseptics, and other medications; diets high in fat and sugar; and infant formula.

According to Dr. Henrick and colleagues, a healthy gut microbiome plays the greatest role in immunological development during the first 3 months post partum; specifically, a lack of bifidobacteria during this time has been linked with increased risks of autoimmunity and enteric inflammation, although underlying immune mechanisms remain unclear.

Bifidobacteria also exemplify the symbiotic relationship between mothers, babies, and beneficial microbes. The investigators pointed out that breast milk contains human milk oligosaccharides (HMOs), which humans cannot digest, but are an excellent source of energy for bifidobacteria and other beneficial microbes, giving them a “selective nutritional advantage.”

Bifidobacteria should therefore be common residents within the infant gut, but this is now often not the case, leading Dr. Henrick and colleagues to zero in on these microbes in hopes of determining exactly how beneficial bacteria shape immune development.

It is only recently that the necessary knowledge and techniques to perform studies like this one have become available, the investigators wrote, noting a better understanding of cell-regulatory relationships, advances in immune profiling at the systems level, and new technology that allows for profiling small-volume samples from infants.

The present study involved a series of observational experiments and a small interventional trial.

First, the investigators conducted a wide array of longitudinal analyses of blood and fecal samples from 208 infants in Sweden to characterize immune cell expansion and microbiome colonization of the gut, with a focus on bifidobacteria.

Their results showed that infants lacking bifidobacteria and HMO-utilization genes (which are expressed by bifidobacteria and other beneficial microbes) had higher levels of systemic inflammation, including increased T helper 2 (Th2) and Th17 responses.

“Infants not colonized by Bifidobacteriaceae or in cases where these microbes fail to expand during the first months of life there is evidence of systemic and intestinal inflammation, increased frequencies of activated immune cells, and reduced levels of regulatory cells indicative of systemic immune dysregulation,” the investigators wrote.

The interventional part of the study involved 60 breastfed infants in California. Twenty-nine of the newborns were given 1.8 x 10¹⁰ colony-forming units (CFUs) of B. longum subsp. infantis EVC001 daily from postnatal day 7 to day 28, while the remaining 31 infants were given no supplementation.

Fecal samples were collected on day 6 and day 60. At day 60, supplemented infants had high levels of HMO-utilization genes, plus significantly greater alpha diversity (P = .0001; Wilcoxon), compared with controls. Infants receiving EVC001 also had lower levels of inflammatory fecal cytokines, suggesting that microbes expressing HMO-utilization genes drive a shift away from proinflammatory Th2 and Th17 responses and toward Th1.
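
For readers who want to see what a comparison like this looks like in practice, a minimal sketch follows: compute a Shannon alpha diversity index for each fecal sample, then compare the two groups with a Wilcoxon rank-sum test (scipy's mannwhitneyu is the equivalent form). The abundance data below are invented for illustration and are not the study's.

import numpy as np
from scipy.stats import mannwhitneyu

def shannon_diversity(counts):
    """Shannon index H = -sum(p * ln p) over the nonzero taxon proportions."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
# Hypothetical taxon count tables: one row per infant, one column per taxon.
evc001 = rng.poisson(lam=rng.uniform(1, 20, size=30), size=(29, 30))
control = rng.poisson(lam=rng.uniform(1, 20, size=30), size=(31, 30))

h_evc001 = [shannon_diversity(row) for row in evc001]
h_control = [shannon_diversity(row) for row in control]

# Two-sided rank-sum comparison of alpha diversity between the groups.
stat, p = mannwhitneyu(h_evc001, h_control, alternative="two-sided")
print(f"U = {stat:.1f}, P = {p:.4f}")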

“It is not the simple presence of bifidobacteria that is responsible for the immune effects but the metabolic partnership between the bacteria and HMOs,” the investigators noted.

According to principal investigator Petter Brodin, MD, PhD, professor of pediatric immunology at Karolinska Institutet, Solna, Sweden, the findings deserve further investigation.

“Our data indicate that substitution with beneficial microbes efficiently metabolizing HMOs could open a way to prevent cases of immune-mediated diseases, but larger, randomized trials aimed at this will be required to determine this potential,” Dr. Brodin said in an interview.

Carolynn Dude, MD, PhD, assistant professor in the division of maternal-fetal medicine at Emory University, Atlanta, agreed that more work is needed.

“While this study provides some insight into the mechanisms that may set up a newborn for poor health outcomes later in life, the data is still very limited, and more long-term follow-up on these infants is needed before recommending any sort of bacterial supplementation to a newborn,” Dr. Dude said in an interview.

Dr. Brodin and colleagues are planning an array of related studies, including larger clinical trials; further investigations into mechanisms of action; comparisons between the present cohort and infants in Kenya, where immune-mediated diseases are rare; and evaluations of vaccine responses and infectious disease susceptibility.

The study was supported by the European Research Council, the Swedish Research Council, the Marianne & Marcus Wallenberg Foundation, and others. The investigators disclosed relationships with Cytodelics, Scailyte, Kancera, and others. Dr. Dude reported no relevant conflicts of interest.

Memory benefit seen with antihypertensives crossing blood-brain barrier

Antihypertensive medications that cross the blood-brain barrier (BBB) may be linked with less memory decline, compared with other drugs for high blood pressure, suggest the findings of a meta-analysis.

Over a 3-year period, cognitively normal older adults taking BBB-crossing antihypertensives demonstrated superior verbal memory, compared with similar individuals receiving non–BBB-crossing antihypertensives, reported lead author Jean K. Ho, PhD, of the Institute for Memory Impairments and Neurological Disorders at the University of California, Irvine, and colleagues.

According to the investigators, the findings add color to a known link between hypertension and neurologic degeneration, and may aid the search for new therapeutic targets.

“Hypertension is a well-established risk factor for cognitive decline and dementia, possibly through its effects on both cerebrovascular disease and Alzheimer’s disease,” Dr. Ho and colleagues wrote in Hypertension. “Studies of antihypertensive treatments have reported possible salutary effects on cognition and cerebrovascular disease, as well as Alzheimer’s disease neuropathology.”

In a previous study, individuals younger than 75 years exposed to antihypertensives had an 8% decreased risk of dementia per year of use, while another trial showed that intensive blood pressure–lowering therapy reduced mild cognitive impairment by 19%.

“Despite these encouraging findings ... larger meta-analytic studies have been hampered by the fact that pharmacokinetic properties are typically not considered in existing studies or routine clinical practice,” wrote Dr. Ho and colleagues. “The present study sought to fill this gap [in that it was] a large and longitudinal meta-analytic study of existing data recoded to assess the effects of BBB-crossing potential in renin-angiotensin system [RAS] treatments among hypertensive adults.”
 

Methods and results

The meta-analysis included randomized clinical trials, prospective cohort studies, and retrospective observational studies. The researchers assessed data on 12,849 individuals from 14 cohorts that received either BBB-crossing or non–BBB-crossing antihypertensives.

The BBB-crossing properties of RAS treatments were identified by a literature review. Among ACE inhibitors, captopril, fosinopril, lisinopril, perindopril, ramipril, and trandolapril were classified as BBB-crossing, while benazepril, enalapril, moexipril, and quinapril were classified as non–BBB-crossing. Among ARBs, telmisartan and candesartan were considered BBB-crossing, and olmesartan, eprosartan, irbesartan, and losartan were tagged as non–BBB-crossing.
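
Because the lists above amount to a lookup table, they translate directly into code. The sketch below encodes the article's classification in a Python dictionary for tagging a medication list; it covers only the RAS agents named here and is an illustration, not a clinical reference.

# BBB-crossing status of RAS antihypertensives, as classified in the study.
BBB_CROSSING = {
    # ACE inhibitors
    "captopril": True, "fosinopril": True, "lisinopril": True,
    "perindopril": True, "ramipril": True, "trandolapril": True,
    "benazepril": False, "enalapril": False, "moexipril": False,
    "quinapril": False,
    # Angiotensin receptor blockers (ARBs)
    "telmisartan": True, "candesartan": True,
    "olmesartan": False, "eprosartan": False, "irbesartan": False,
    "losartan": False,
}

def classify(med: str) -> str:
    crosses = BBB_CROSSING.get(med.lower())
    if crosses is None:
        return f"{med}: not among the RAS agents classified in this study"
    return f"{med}: {'BBB-crossing' if crosses else 'non-BBB-crossing'}"

for med in ["Lisinopril", "Losartan", "Amlodipine"]:
    print(classify(med))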

Cognition was assessed via the following seven domains: executive function, attention, verbal memory learning, language, mental status, recall, and processing speed.

Compared with individuals taking non–BBB-crossing antihypertensives, those taking BBB-crossing agents had significantly superior verbal memory (recall), with a maximum effect size of 0.07 (P = .03).
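
Cohort-level estimates in an analysis like this one are typically pooled with an inverse-variance random-effects model. The sketch below is a minimal DerSimonian-Laird implementation; the 14 cohort effects and standard errors are invented (loosely centered on the 0.07 figure above) rather than taken from the study, and the authors' actual model may differ.

import numpy as np

def dersimonian_laird(effects, ses):
    """Pool per-cohort effect sizes under a random-effects model."""
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(ses, dtype=float)
    w = 1.0 / ses**2                                # inverse-variance weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-cohort variance
    w_re = 1.0 / (ses**2 + tau2)                    # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

rng = np.random.default_rng(1)
effects = rng.normal(0.07, 0.05, size=14)   # hypothetical per-cohort effects
ses = rng.uniform(0.03, 0.10, size=14)      # hypothetical standard errors
pooled, se, tau2 = dersimonian_laird(effects, ses)
print(f"pooled effect = {pooled:.3f} (95% CI +/- {1.96 * se:.3f}), tau2 = {tau2:.4f}")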

According to the investigators, this finding was particularly noteworthy, as the BBB-crossing group had relatively higher vascular risk burden and lower mean education level.

“These differences make it all the more remarkable that the BBB-crossing group displayed better memory ability over time despite these cognitive disadvantages,” the investigators wrote.

Still, not all the findings favored BBB-crossing agents. Individuals in the BBB-crossing group had relatively inferior attention ability, with a minimum effect size of –0.17 (P = .02).

The other cognitive measures were not significantly different between groups.
 

Clinicians may consider findings after accounting for other factors

Principal investigator Daniel A. Nation, PhD, associate professor of psychological science and a faculty member of the Institute for Memory Impairments and Neurological Disorders at the University of California, Irvine, suggested that the small difference in verbal memory between groups could be clinically significant over a longer period of time.

“Although the overall effect size was pretty small, if you look at how long it would take for someone [with dementia] to progress over many years of decline, it would actually end up being a pretty big effect,” Dr. Nation said in an interview. “Small effect sizes could actually end up preventing a lot of cases of dementia,” he added.

The conflicting results in the BBB-crossing group – better verbal memory but worse attention ability – were “surprising,” he noted.

“I sort of didn’t believe it at first,” Dr. Nation said, “because the memory finding is sort of replication – we’d observed the same exact effect on memory in a smaller sample in another study. ... The attention [finding], going another way, was a new thing.”

Dr. Nation suggested that the intergroup differences in attention ability may stem from idiosyncrasies of the tests used to measure that domain, which can be impacted by cardiovascular or brain vascular disease. Or it could be caused by something else entirely, he said, noting that further investigation is needed.

He added that the improvements in verbal memory within the BBB-crossing group could be caused by direct effects on the brain. He pointed out that certain ACE polymorphisms have been linked with Alzheimer’s disease risk, and those same polymorphisms, in animal models, lead to neurodegeneration, with reversal possible through administration of ACE inhibitors.

“It could be that what we’re observing has nothing really to do with blood pressure,” Dr. Nation explained. “This could be a neuronal effect on learning memory systems.”

He went on to suggest that clinicians may consider these findings when selecting antihypertensive agents for their patients, with the caveat that all other prescribing factors have already been taken into account.

“In the event that you’re going to give an ACE inhibitor or an angiotensin receptor blocker anyway, and it ends up being a somewhat arbitrary decision in terms of which specific drug you’re going to give, then perhaps this is a piece of information you would take into account – that one gets in the brain and one doesn’t – in somebody at risk for cognitive decline,” Dr. Nation said.

Exact mechanisms of action unknown

Hélène Girouard, PhD, assistant professor of pharmacology and physiology at the University of Montreal, said in an interview that the findings are “of considerable importance, knowing that brain alterations could begin as much as 30 years before manifestation of dementia.”

Since 2003, Dr. Girouard has been studying the cognitive effects of antihypertensive medications. She noted that previous studies involving rodents “have shown beneficial effects [of BBB-crossing antihypertensive drugs] on cognition independent of their effects on blood pressure.”

The drugs’ exact mechanisms of action, however, remain elusive, according to Dr. Girouard, who offered several possible explanations, including amelioration of BBB disruption, brain inflammation, cerebral blood flow dysregulation, cholinergic dysfunction, and neurologic deficits. “Whether these mechanisms may explain Ho and colleagues’ observations remains to be established,” she added.

Andrea L. Schneider, MD, PhD, assistant professor of neurology at the University of Pennsylvania, Philadelphia, applauded the study, but ultimately suggested that more research is needed to impact clinical decision-making.

“The results of this important and well-done study suggest that further investigation into targeted mechanism-based approaches to selecting hypertension treatment agents, with a specific focus on cognitive outcomes, is warranted,” Dr. Schneider said in an interview. “Before changing clinical practice, further work is necessary to disentangle contributions of medication mechanism, comorbid vascular risk factors, and achieved blood pressure reduction, among others.”

The investigators disclosed support from the National Institutes of Health, the Alzheimer’s Association, the Waksman Foundation of Japan, and others. The interviewees reported no relevant conflicts of interest.

Sporebiotics improve functional dyspepsia symptoms

Compared with placebo, sporebiotics significantly reduced postprandial distress, epigastric pain, and several other symptoms of functional dyspepsia, reported lead author Lucas Wauters, MD, PhD, of University Hospitals Leuven (Belgium), and colleagues.

“Acid suppressive or first-line therapy with PPIs [proton pump inhibitors] for functional dyspepsia has limited efficacy and potential long-term side effects,” the investigators reported at the annual Digestive Disease Week® (DDW). “Spore-forming bacteria or sporebiotics may be effective for postprandial distress and epigastric pain or burning symptoms, offering benefits which may differ in relation to PPI intake.”
 

Sporebiotics improve variety of symptoms

To test this hypothesis, the investigators recruited 68 patients with functional dyspepsia who had similar characteristics at baseline. Half of the participants (n = 34) were taking PPIs.

Patients were randomized in a 1:1 ratio to receive 2.5 x 10⁹ CFU of Bacillus coagulans MY01 and B. subtilis MY02 twice daily for 8 weeks, or matching placebo. Following this period, an additional 8-week open-label regimen was instituted, during which time all patients received sporebiotics. Throughout the study, a daily diary was used to self-report symptoms.

The primary outcome, measured at 8 weeks, was clinical response, defined as a decrease in the weekly postprandial distress score of greater than 0.7 among patients with a baseline score greater than 1.0. Secondary outcomes included a decrease in postprandial distress score greater than 0.5 (minimal clinical response), as well as changes in cardinal epigastric pain, cardinal postprandial distress, and other symptoms. At baseline and 8 weeks, patients taking PPIs underwent a ¹⁴C-glycocholic acid breath test to detect changes in small intestinal bacterial overgrowth.
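
The outcome definitions reduce to a threshold rule on the change in weekly score, sketched below with invented scores. (In the trial's reporting, clinical responders also count toward the minimal-response rate; this toy classifier instead returns a single label per patient.)

def classify_response(baseline: float, week8: float) -> str:
    """Apply the stated responder rule to a pair of weekly symptom scores."""
    if baseline <= 1.0:
        return "not evaluable (baseline score <= 1.0)"
    drop = baseline - week8
    if drop > 0.7:
        return "clinical response"
    if drop > 0.5:
        return "minimal clinical response"
    return "non-responder"

# Hypothetical (baseline, week 8) postprandial distress scores.
for baseline, week8 in [(1.8, 0.9), (1.6, 1.0), (1.2, 1.1), (0.9, 0.2)]:
    print(f"baseline {baseline}, week 8 {week8}: {classify_response(baseline, week8)}")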

At 8 weeks, a clinical response was observed in 48% of patients taking sporebiotics, compared with 20% of those in the placebo group (P = .03). At the same time point, 56% of patients in the treatment group had a minimal clinical response versus 27% in the control group (P = .03).
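
A difference in response rates like this is typically tested on the 2x2 table of responders and non-responders. The sketch below uses Fisher's exact test with counts back-calculated approximately from the reported percentages (roughly 48% of 34 and 20% of 34 evaluable patients); these are not the trial's exact numbers, so the printed P value will differ somewhat from the reported .03.

from scipy.stats import fisher_exact

table = [[16, 18],   # sporebiotics: responders, non-responders
         [7, 27]]    # placebo: responders, non-responders
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.3f}")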

Spore-forming probiotics were also associated with significantly greater improvements in cardinal postprandial distress, cardinal epigastric pain, postprandial fullness, and upper abdominal pain. A trend toward improvement in upper abdominal bloating was also seen (P = .07).

Among patients taking PPIs, baseline rates of positivity for bile acid breath testing were similar between those in the sporebiotic and placebo group, at 18% and 25%, respectively (P = .29). After 8 weeks, however, patients taking spore-forming probiotics had a significantly lower rate of bile acid breath test positivity (7% vs. 36%; P = .04), suggesting improvements in small intestinal bacterial overgrowth.

In the open-label portion of the trial, patients in the treatment group maintained improvements in postprandial distress. Patients who switched from placebo to sporebiotics had a significant reduction in postprandial distress symptoms.

At 8 weeks, sporebiotics were associated with a trend toward fewer side effects of any kind (16% vs. 33%; P = .09), while rates of GI-specific side effects were comparable between groups, at 3% and 15% for sporebiotics and placebo, respectively (P = .2).

“Spore-forming probiotics are effective and safe in patients with functional dyspepsia, decreasing both postprandial distress and epigastric pain symptoms,” the investigators concluded. “In patients [taking PPIs], sporebiotics decrease the percentage of positive bile acid breath tests, suggesting a reduction of small intestinal bacterial overgrowth.”

Results are promising, but big questions remain

Pankaj Jay Pasricha, MBBS, MD, vice chair of medicine innovation and commercialization at Johns Hopkins and director of the Johns Hopkins Center for Neurogastroenterology, Baltimore, called the results “very encouraging.”

“This [study] is the first of its kind for this condition,” Dr. Pasricha said in an interview. “It will be very interesting to see whether others can reproduce these findings, and whether [these improvements] are sustained beyond the first few weeks or months.”

He noted that determining associated mechanisms of action could potentially open up new lines of therapy, and provide greater understanding of pathophysiology, which is currently lacking.

“We don’t fully understand the pathophysiology [of functional dyspepsia],” Dr. Pasricha said. “If you don’t understand the pathophysiology, then it’s difficult to identify the right molecular target to address the root cause. Instead, we use a variety of symptomatic treatments that aren’t actually addressing the root cause, but studies like this may help us gain some insight into the cause of the problem, and if it is in fact a fundamental imbalance in the intestinal microbiota, then this would be a rational approach.”

It’s unclear how sporebiotics may improve functional dyspepsia, Dr. Pasricha noted. He proposed three possible mechanisms: the bacteria could be colonizing the intestine, they could be releasing products as they pass through the intestine that have a therapeutic effect, or they may be altering bile acid metabolism in the colon or having some other effect there.

“It’s speculative on my part to say how it works,” Dr. Pasricha said. “All the dots remain to be connected. But it’s a good start, and an outstanding group of investigators.”

Dr. Wauters and colleagues reported no conflicts of interest. Dr. Pasricha disclosed a relationship with Pendulum Therapeutics.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

Compared with placebo, sporebiotics significantly reduced postprandial distress, epigastric pain, and several other symptoms of functional dyspepsia, reported lead author Lucas Wauters, MD, PhD, of University Hospitals Leuven (Belgium), and colleagues.

“Acid suppressive or first-line therapy with PPIs [proton pump inhibitors] for functional dyspepsia has limited efficacy and potential long-term side effects,” the investigators reported at the annual Digestive Disease Week® (DDW). “Spore-forming bacteria or sporebiotics may be effective for postprandial distress and epigastric pain or burning symptoms, offering benefits which may differ in relation to PPI intake.”
 

Sporebiotics improve variety of symptoms

To test this hypothesis, the investigators recruited 68 patients with functional dyspepsia who had similar characteristics at baseline. Half of the participants (n = 34) were taking PPIs.

Patients were randomized in a 1:1 ratio to receive 2.5 x 109 CFU of Bacillus coagulans MY01 and B. subtilis MY02 twice daily for 8 weeks, or matching placebo. Following this period, an additional 8-week open-label regimen was instituted, during which time all patients received sporebiotics. Throughout the study, a daily diary was used to self-report symptoms.

The primary outcome, measured at 8 weeks, was clinical response, defined by a decrease in weekly postprandial distress symptoms greater than 0.7 among patients who had a baseline score greater than 1.0. Secondary outcomes included change in postprandial distress symptoms greater than 0.5 (minimal clinical response), as well as changes in cardinal epigastric pain, cardinal postprandial distress, and other symptoms. At baseline and 8 weeks, patients taking PPIs underwent a 14C-glycocolic acid breath test to detect changes in small intestinal bacterial overgrowth.

At 8 weeks, a clinical response was observed in 48% of patients taking sporebiotics, compared with 20% of those in the placebo group (P = .03). At the same time point, 56% of patients in the treatment group had a minimal clinical response versus 27% in the control group (P = .03).

Spore-forming probiotics were also associated with significantly greater improvements in cardinal postprandial distress, cardinal epigastric pain, postprandial fullness, and upper abdominal pain. A trend toward improvement in upper abdominal bloating was also seen (P = .07).

Among patients taking PPIs, baseline rates of positivity for bile acid breath testing were similar between those in the sporebiotic and placebo group, at 18% and 25%, respectively (P = .29). After 8 weeks, however, patients taking spore-forming probiotics had a significantly lower rate of bile acid breath test positivity (7% vs. 36%; P = .04), suggesting improvements in small intestinal bacterial overgrowth.

In the open-label portion of the trial, patients in the treatment group maintained improvements in postprandial distress. Patients who switched from placebo to sporebiotics had a significant reduction in postprandial distress symptoms.

At 8 weeks, sporebiotics were associated with a trend toward fewer side effects of any kind (16% vs. 33%; P = .09), while rates of GI-specific side effects were comparable between groups, at 3% and 15% for sporebiotics and placebo, respectively (P = .2).“Spore-forming probiotics are effective and safe in patients with functional dyspepsia, decreasing both postprandial distress and epigastric pain symptoms,” the investigators concluded. “In patients [taking PPIs], sporebiotics decrease the percentage of positive bile acid breath tests, suggesting a reduction of small intestinal bacterial overgrowth.”

 

 

Results are promising, but big questions remain

Pankaj Jay Pasricha, MBBS, MD, vice chair of medicine innovation and commercialization at Johns Hopkins and director of the Johns Hopkins Center for Neurogastroenterology, Baltimore, called the results “very encouraging.”

“This [study] is the first of its kind for this condition,” Dr. Pasricha said in an interview. “It will be very interesting to see whether others can reproduce these findings, and whether [these improvements] are sustained beyond the first few weeks or months.”

He noted that determining associated mechanisms of action could potentially open up new lines of therapy, and provide greater understanding of pathophysiology, which is currently lacking.

“We don’t fully understand the pathophysiology [of functional dyspepsia],” Dr. Pasricha said. “If you don’t understand the pathophysiology, then it’s difficult to identify the right molecular target to address the root cause. Instead, we use a variety of symptomatic treatments that aren’t actually addressing the root cause, but studies like this may help us gain some insight into the cause of the problem, and if it is in fact a fundamental imbalance in the intestinal microbiota, then this would be a rational approach.”

It’s unclear how sporebiotics may improve functional dyspepsia, Dr. Pasricha noted. He proposed three possible mechanisms: the bacteria could be colonizing the intestine, they could be releasing products as they pass through the intestine that have a therapeutic effect, or they may be altering bile acid metabolism in the colon or having some other effect there.

“It’s speculative on my part to say how it works,” Dr. Pasricha said. “All the dots remain to be connected. But it’s a good start, and an outstanding group of investigators.”Dr. Wauters and colleagues reported no conflicts of interest. Dr. Pasricha disclosed a relationship with Pendulum Therapeutics.

 



Some nasogastric intubation procedures lead to less aerosolization than feared


 

Nasogastric intubation for esophageal manometry or impedance monitoring does not generate significant aerosol particles and is associated with minimal droplet spread, according to a Belgian study presented at the annual Digestive Disease Week® (DDW). These findings suggest that standard personal protective equipment and appropriate patient positioning are likely sufficient to protect health care workers from increased risk of coronavirus transmission during tube placement and removal, reported lead author Wout Verbeure, PhD, of Leuven University Hospital, Belgium, and colleagues.

“Subsequent to the COVID-19 peak, [nasogastric tube insertion and extraction] were scaled back based on the assumption that they generate respiratory aerosol particles and droplet spread,” the investigators reported. “However, there is no scientific evidence for this theory.”

To address this knowledge gap, the investigators conducted an observational study involving SARS-CoV-2-negative patients and including 21 insertions and removals for high-resolution manometry (HRM), plus 12 insertions and 10 removals for 24-hour multichannel intraluminal impedance-pH monitoring (MII-pH). During the study, a Camfil City M Air Purifier was added to the examination room. This was present during 13 of the 21 HRM insertions and removals, allowing for comparison of aerosol particle measurements before and after introduction of the device.
 

The mechanics of the study

Aerosol particles (0.3-10 mcm) were measured with a Particle Measuring Systems LASAIR II Particle Counter positioned 1 cm away from the patient’s face. For both procedures, measurements were taken before, during, and up to 5 minutes after each nasogastric tube placement and removal. Additional measurements were taken while the HRM examination was being conducted.

To measure droplet spread, 1% medical fluorescein in saline was applied to each patient’s nasal cavity; droplets were visualized on a white sheet covering the patient and a white apron worn by the health care worker. The patients’ masks were kept below their noses but were covering their mouths.

“During the placement and removal of the catheter, the health care worker was always standing sideways or even behind the patient, and they always stood higher relative to the patient to ensure that when there was aerosol or droplet spread, it was not in their direction,” Dr. Verbeure said during his virtual presentation.

During placement for HRM and removal for MII-pH, aerosol particles (excluding those that were 0.3 mcm) decreased significantly. Otherwise, particle counts remained stable. “This shows that these investigations do not generate additional aerosol [particles], which is good news,” Dr. Verbeure said.

When the air purifier was present, placement and examination for HRM were associated with significant reductions in aerosol particles (excluding those that were 0.3 mcm or 0.5 mcm), whereas removal caused a slight uptick in aerosol particles (excluding those that were 0.3 mcm or 0.5 mcm) that did not decline after 5 minutes. “This was actually a surprise to us,” Dr. Verbeure said. “Because we now had an air purifier present, and we expected an even lower number of particles.”

He suggested that the purifier may have been reducing particle counts during HRM examination, thereby lowering baseline values before removal and making small changes more noticeable; or the purifier may have been causing turbulence that spread particles during removal. Whichever hypothesis is true, Dr. Verbeure noted that particle counts were never higher than at the start of the examination. Fluorescein visualization showed “surprisingly little droplet spread,” Dr. Verbeure said, apart from some contamination around the patient’s neck.

“Esophageal investigations do not seem to generate additional [aerosol] particles,” Dr. Verbeure concluded. “So wearing the recommended protective gear and also considering the right positioning of the health care worker relative to the patient is important to keep performing this daily clinical routine.” To avoid droplet spread, health care workers should “be aware of the [patient’s] neck region and the direction of the catheter,” Dr. Verbeure added.
 

 

 

SORTing the results

According to Mahdi Najafi, MD, associate professor in the department of anesthesiology at Tehran University of Medical Sciences, Iran, and adjunct professor at Schulich School of Medicine & Dentistry, Western University, London, Ontario, the findings offer valuable insights. “[This study] is very important for at least two reasons: The extent of using this procedure in patient care, especially in the critical care setting, and the paucity of information for COVID-19 transmission and route of transmission as well,” Dr. Najafi said in an interview.

Yet he cautioned against generalizing the results. “We cannot extend the results to all nasogastric tube intubations,” Dr. Najafi said. “There are reasons for that. The tube for manometry is delicate and flexible, while the nasogastric tube used for drainage and GI pressure release – which is used commonly in intensive care and the operating room – is larger and rather rigid. Moreover, the patient is awake and conscious for manometry while the other procedures are done in sedated or unconscious patients.”

He noted that nasogastric intubation is more challenging in unconscious patients, and often requires a laryngoscope and/or Magill forceps. “The result [of using these instruments] is coughing, which is undoubtedly the most important cause of aerosol generation,” Dr. Najafi said. “It can be regarded as a drawback to this study as well. The authors would be better to report the number and/or severity of the airway reactions during the procedures, which are the main source of droplets and aerosols.”

To reduce risk of coronavirus transmission during nasogastric intubation of unconscious patients, Dr. Najafi recommended the SORT (Sniffing position, nasogastric tube Orientation, contralateral Rotation, and Twisting movement) maneuver, which he introduced in 2016 for use in critical care and operating room settings.

“The employment of anatomical approach and avoiding equipment for intubation were devised to increase the level of safety and decrease hazards and adverse effects,” Dr. Najafi said of the SORT maneuver. “The procedure needs to be done step-by-step and as smooth as possible.”

In a recent study, the SORT maneuver was compared with nasogastric intubation using neck flexion lateral pressure in critically ill patients. The investigators concluded that the SORT maneuver is “a promising method” notable for its simple technique, and suggested that more trials are needed.

The investigators and Dr. Najafi reported no conflicts of interest.



Head-to-head trial compares ustekinumab with adalimumab in Crohn’s


For biologic-naive adults with moderate to severe Crohn’s disease, treatment with adalimumab or ustekinumab leads to similar outcomes, according to results of the head-to-head SEAVUE trial.


When lead author Bruce E. Sands, MD, of Icahn School of Medicine at Mount Sinai, New York, compared treatment arms, patients had similar rates of clinical remission at 1 year. All major secondary endpoints, such as endoscopic remission, were comparable, as were safety profiles, Dr. Sands reported at the annual Digestive Disease Week® (DDW).

“From my perspective, this is an important study,” Dr. Sands wrote in a virtual chat following his presentation. “We need more head-to-head studies!”

Results from the SEAVUE trial come almost 2 years after Dr. Sands reported findings of another head-to-head IBD trial: VARSITY, which demonstrated the superiority of vedolizumab over adalimumab among patients with moderate to severe ulcerative colitis.

The multicenter, double-blinded SEAVUE trial involved 386 biologic-naive patients with Crohn’s disease who had failed corticosteroids or immunomodulators. All patients had Crohn’s Disease Activity Index (CDAI) scores ranging from 220 to 450 and had at least one ulcer detected at baseline ileocolonoscopy.

Participants were randomized in a 1:1 ratio to receive monotherapy with either subcutaneous adalimumab (citrate-free; 160 mg at baseline, 80 mg at week 2, then 40 mg every 2 weeks) or ustekinumab, which was given first intravenously at a dose of 6 mg/kg then subcutaneously at 90 mg every 8 weeks.

The primary endpoint was clinical remission at week 52, defined by a CDAI score less than 150. Major secondary endpoints included clinical response, corticosteroid-free remission, endoscopic remission, remission in patient-reported CDAI components, and clinical remission at week 16.

Results were statistically similar across all endpoints, with clinical remission at 1 year occurring in 64.9% and 61.0% of patients receiving ustekinumab and adalimumab, respectively (P = .417).
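
As a rough plausibility check on that primary comparison, a two-proportion test can be run from the reported percentages. The short sketch below is illustrative only: it assumes roughly 193 evaluable patients per arm (386 randomized 1:1) and remission counts rounded from the published percentages, neither of which comes from the report itself.

```python
# Hedged sanity check of the week-52 remission comparison.
# Assumed (not reported): ~193 evaluable patients per arm; counts
# are back-calculated by rounding the published percentages.
from statsmodels.stats.proportion import proportions_ztest

n_per_arm = 193
remissions = [round(0.649 * n_per_arm), round(0.610 * n_per_arm)]  # [125, 118]
z, p = proportions_ztest(remissions, [n_per_arm, n_per_arm])
print(f"z = {z:.2f}, p = {p:.3f}")  # p ~ 0.46, in line with the reported P = .417
```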

“Both treatments demonstrated rapid onset of action and robust endoscopy results,” Dr. Sands noted during his presentation; he reported comparable rates of endoscopic remission, at 28.5% and 30.7% for ustekinumab and adalimumab, respectively (P = .631).

Among secondary endpoints, ustekinumab demonstrated some superiority, with greater maintenance of clinical response at week 52 among patients with response at week 16 (88.6% vs. 78.0%; P = .016), greater reduction in liquid/soft stools in prior 7 days from baseline to week 52 (–19.9 vs. –16.2; P = .004), and greater reduction in sum number of liquid/soft stools and abdominal pain scores in prior 7 days from baseline to week 52 (–29.6 vs. –25.1; P = .013).

Safety metrics were similar between groups, and consistent with previous experience. Although the adalimumab group had a higher rate of discontinuation due to adverse events, this trend was not statistically significant (11.3% vs. 6.3%; P value not provided).
 

Don’t ignore discontinuation rates

Jordan E. Axelrad, MD, assistant professor of medicine at NYU and a clinician at the Inflammatory Bowel Disease Center at NYU Langone Health, New York, commended the SEAVUE trial for its head-to-head design, which is a first for biologics in Crohn’s disease.


“With newer drugs, there’s a critical need for head-to-head studies for us to understand where to position a lot of these agents,” he said in an interview. “[T]his was a good undifferentiated group to understand what’s the first biologic you should use in a patient with moderate-to-severe Crohn’s disease. The primary, major take-home is that [ustekinumab and adalimumab] are similarly effective.”

When asked about the slight superiority in minor secondary endpoints associated with ustekinumab, Dr. Axelrad suggested that rates of discontinuation deserve more attention.

“For me, maybe the major focus would be on the number of patients who stopped treatment,” Dr. Axelrad said, noting a higher rate of discontinuation in the adalimumab group. “Although that was just numerical, that to me is actually more important than [the minor secondary endpoints].” He also highlighted the lower injection burden associated with ustekinumab, which is given every 8 weeks, compared with every 2 weeks for adalimumab.

Ultimately, however, it’s unlikely that treatment sequencing will depend on these finer points, Dr. Axelrad suggested, and will instead come down to finances, especially with adalimumab biosimilars on the horizon, which may prove the most cost-effective option.

“A lot of the decision-making of where to position [ustekinumab in Crohn’s disease] is going to come down to the payer,” Dr. Axelrad said. “If there was a clear signal, providers such as myself would have a better leg to stand on, like we saw with VARSITY, where vedolizumab was clearly superior to adalimumab on multiple endpoints. We didn’t see that sort of robust signal here.”

The SEAVUE trial was supported by Janssen Scientific Affairs. Dr. Sands disclosed relationships with Janssen, AbbVie, Takeda, and others. Dr. Axelrad disclosed previous consulting fees from Janssen and research support from BioFire.



Microbiome therapeutic offers durable protection against C. difficile recurrence


 

SER-109, an oral microbiome therapeutic, safely protects against Clostridioides difficile recurrence for up to 24 weeks, according to a recent phase 3 trial. Three days of treatment with purified Firmicutes spores reduced risk of recurrence by 54%, suggesting a sustained, clinically meaningful response, according to a multicenter study presented at this year’s Digestive Disease Week® (DDW).

“Antibiotics targeted against C. difficile bacteria are necessary but insufficient to achieve a durable clinical response because they have no effect on C. difficile spores that germinate within a disrupted microbiome,” the investigators reported at the meeting.

“The manufacturing processes for SER-109 are designed to inactivate potential pathogens, while enriching for beneficial Firmicutes spores, which play a central role in inhibiting the cycle of C. difficile,” said Louis Y. Korman, MD, a gastroenterologist in Washington, who was lead author.
 

Extended data from ECOSPOR-III

The ECOSPOR-III trial involved 182 patients with at least three episodes of C. difficile infection in the previous 12 months. Patients underwent 10-21 days of antibiotic therapy with fidaxomicin or vancomycin to resolve symptoms before being randomized in a 1:1 ratio to receive either SER-109 (four capsules daily for 3 days) or placebo, with stratification by specific antibiotic and patient age (threshold of 65 years).
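
For readers unfamiliar with stratified randomization, the sketch below shows one common way such a scheme is implemented: permuted blocks within each antibiotic-by-age stratum, so that 1:1 balance is maintained within every stratum. It is purely illustrative; the trial’s actual randomization procedure is not described in this report.

```python
# Illustrative permuted-block randomization, stratified by antibiotic
# and age group (hypothetical implementation, not the trial's system).
import random

def make_assigner(block_size=4):
    blocks = {}  # one running block of assignments per stratum

    def assign(antibiotic: str, age: int) -> str:
        stratum = (antibiotic, "65+" if age >= 65 else "<65")
        block = blocks.setdefault(stratum, [])
        if not block:  # refill and shuffle a balanced 1:1 block
            block.extend(["SER-109", "placebo"] * (block_size // 2))
            random.shuffle(block)
        return block.pop()

    return assign

assign = make_assigner()
print(assign("fidaxomicin", 71))  # 'SER-109' or 'placebo'
```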

The primary objectives were safety and efficacy at 8 weeks. These results, which were previously reported at ACG 2020, showed a 68% relative risk reduction in the SER-109 group, and favorable safety data. The findings presented at DDW added to those earlier ones by providing safety and efficacy data extending to week 24. At this time point, patients treated with SER-109 had a 54% relative risk reduction in C. difficile recurrence. Recurrence rates were 21.3% and 47.3% for the treatment and placebo groups, respectively (P < .001).
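
The reported risk reduction follows directly from those recurrence rates. The one-line check below uses the published percentages; it lands at 55% rather than 54% only because the published rates are themselves rounded.

```python
# Relative risk reduction implied by the week-24 recurrence rates.
treatment_rate, placebo_rate = 0.213, 0.473
rrr = 1 - treatment_rate / placebo_rate
print(f"RRR = {rrr:.1%}")  # ~55%, matching the reported 54% within rounding
```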

Patients 65 years and older benefited the most from SER-109 therapy, based on a relative risk reduction of 56% (P < .001), versus a 49% relative risk reduction (lacking statistical significance) for patients younger than 65 years (P = .093). The specific antibiotic therapy patients received also appeared to impact outcomes. Patients treated with fidaxomicin had a 73% relative risk reduction (P = .009), compared with 48% for vancomycin (P = .006). Safety profiles were similar between study arms.

“By enriching for Firmicutes spores, SER-109 achieves high efficacy, while mitigating risk of transmitting infectious agents and represents a major paradigm shift in the clinical management of patients with recurrent C. difficile infection,” the investigators concluded, noting that “an open-label study for patients with recurrent C. difficile infection is currently enrolling.”
 

Microbiome restoration therapies

According to Sahil Khanna, MBBS, professor of medicine at Mayo Clinic, Rochester, Minn., these findings “advance the field” because they show a sustained response. “We know that microbiome restoration therapies help restore colonization resistance,” Dr. Khanna said in an interview, noting that they offer benefits comparable to fecal microbiota transplantation (FMT) without the downsides.

“The trouble with FMT is that it’s heterogeneous – everybody does it differently … and also it’s an invasive procedure,” Dr. Khanna said. He noted that FMT may transmit infectious agents between donors and patients, which isn’t an issue with purified products such as SER-109.

Several other standardized microbiota restoration products are under development, Dr. Khanna said, including an enema form (RBX2660) in phase 3 testing, and two other capsules (CP101 and VE303) in phase 2 trials. “The hope would be that one or more of these products would be approved for clinical use in the near future and would probably replace the vast majority of FMT [procedures] that we do clinically,” Dr. Khanna said. “That’s where the field is headed.”

The investigators reported no conflicts of interest. Dr. Khanna disclosed research support from Finch, Rebiotix/Ferring, Vedanta, and Seres.



Intervention reduces PPI use without worsening acid-related diseases


Proton pump inhibitor (PPI) use can safely be reduced by deprescribing efforts coupled with patient and clinician education, according to a retrospective study involving more than 4 million veterans.


After 1 year, the intervention was associated with a significant reduction in PPI use without worsening of acid-related diseases, reported lead author Jacob E. Kurlander, MD, of the University of Michigan, Ann Arbor, and the VA Ann Arbor Healthcare System’s Center for Clinical Management Research.

“There’s increasing interest in interventions to reduce PPI use,” Dr. Kurlander said during his virtual presentation at the annual Digestive Disease Week® (DDW). “Many of the interventions have come in the form of patient and provider education, like the Choosing Wisely campaign put out by the American Board of Internal Medicine. However, in rigorous studies, few interventions have actually proven effective, and many of these studies lack data on clinical outcomes, so it’s difficult to ascertain the real clinical benefits, or even harms.”

In an effort to address this gap, the investigators conducted a retrospective, difference-in-differences study spanning 10 years, from 2009 to 2019. The 1-year intervention, implemented in August 2013, included refill restrictions for PPIs without a documented indication for long-term use, voiding of PPI prescriptions not filled within 6 months, a quick-order option for H2-receptor antagonists, reports to identify high-dose PPI prescribing, and patient and clinician education.

The intervention group consisted of 192,607-250,349 veterans in Veteran Integrated Service Network 17, whereas the control group consisted of 3,775,978-4,360,908 veterans in other service networks (ranges in population size are due to variations across 6-month intervals of analysis). For each 6-month interval, patients were included if they had at least two primary care visits within the past 2 years, and excluded if they received primary care at three other sites that joined the intervention site after initial implementation.

The investigators analyzed three main outcomes: Proportion of veterans dispensed a PPI prescription from the VA at any dose; incidence proportion of hospitalization for upper GI diseases, including upper GI bleeding other than from esophageal varices or angiodysplasia, as well as nonbleeding acid peptic disease; and rates of primary care visits, gastroenterology visits, and esophagogastroduodenoscopies (EGDs).

The analysis was divided into a preimplementation period lasting approximately 5 years and a postimplementation period of similar duration. In the postimplementation period, the intervention group had a 5.9% relative reduction in PPI prescriptions compared with the control group (P < .001). Over the same period, the intervention site showed no significant increase in hospitalizations for upper GI diseases, nor in rates of primary care visits, GI clinic visits, or EGDs.
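
The sketch below illustrates the difference-in-differences logic behind a figure like that 5.9% relative reduction. The proportions are hypothetical placeholders, not the study’s data, and the published analysis would have used regression with covariates rather than this bare four-number comparison.

```python
# Minimal sketch of the difference-in-differences (DiD) logic.
# All proportions below are illustrative, not the study's data.

def did_relative_reduction(pre_tx, post_tx, pre_ctrl, post_ctrl):
    """Relative change at the intervention site, net of the control trend.

    Each argument is the proportion of veterans dispensed a PPI in that
    group and period (e.g., 0.12 means 12%).
    """
    tx_change = (post_tx - pre_tx) / pre_tx          # intervention-site trend
    ctrl_change = (post_ctrl - pre_ctrl) / pre_ctrl  # secular (control) trend
    return tx_change - ctrl_change                   # net (DiD) effect

# Hypothetical numbers: PPI use falls everywhere, but more at the
# intervention site.
effect = did_relative_reduction(pre_tx=0.1200, post_tx=0.1056,
                                pre_ctrl=0.1200, post_ctrl=0.1127)
print(f"net relative change: {effect:.1%}")  # -> about -5.9%
```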

In a subgroup analysis of patients coprescribed PPIs during time at high risk for upper GI bleeding (that is, while taking at least two high-risk medications, such as warfarin), there was a 4.6% relative reduction in time with PPI gastroprotection in the intervention group compared with the control group (P = .003). In a second sensitivity analysis, hospitalization for upper GI diseases among high-risk patients at least 65 years of age did not differ significantly between groups.

“[This] multicomponent PPI deprescribing program led to sustained reductions in PPI use,” Dr. Kurlander concluded. “However, this blunt intervention also reduced appropriate use of PPIs for gastroprotection, raising some concerns about clinical quality of care, but this did not appear to cause any measurable clinical harm in terms of hospitalizations for upper GI diseases.”

Debate around ‘unnecessary PPI use’

According to Philip O. Katz, MD, professor of medicine and director of motility laboratories at Weill Cornell Medicine, New York, the study “makes an attempt to do what others have tried in different ways, which is to develop a mechanism to help reduce or discontinue proton pump inhibitors when people believe they’re not indicated.”

Yet this latter element – appropriate indication – drives an ongoing debate.

“This is a very controversial area,” Dr. Katz said in an interview. “The concept of using the lowest effective dose of medication needed for a symptom or a disease is not new, but the push to reducing or eliminating ‘unnecessary PPI use’ is one that I believe should be carefully discussed, and that we have a clear understanding of what constitutes unnecessary use. And quite honestly, I’m willing to state that I don’t believe that’s been well defined.”

Dr. Katz, who recently coauthored an article about PPIs, suggested that more prospective research is needed to identify which patients need PPIs and which don’t.

“What we really need are more studies that look at who really needs [PPIs] long term,” Dr. Katz said, “as opposed to doing it ad hoc.”

The study was funded by the U.S. Department of Veterans Affairs and the National Institute of Diabetes and Digestive and Kidney Diseases. The investigators reported no conflicts of interest. Dr. Katz is a consultant for Phathom Pharma.

FROM DDW 2021

Pandemic colonoscopy restrictions may lead to worse CRC outcomes

Article Type
Changed
Wed, 05/26/2021 - 12:01

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). While these findings showed greater likelihood of receiving a screening before the pandemic, postpolicy colonoscopies were conducted sooner, with a median time to procedure of 41 days, compared with 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red flag signs and symptoms.
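
The arithmetic connecting the reported odds ratio to the “42% less likely” phrasing is shown below. This simply restates the published estimate; it is not a new calculation from patient data, and an odds ratio only approximates a risk ratio when the outcome is uncommon.

```python
# Restating the reported adjusted odds ratio as a percentage reduction.
# Figures are the published estimates, not derived from patient data.

odds_ratio = 0.58                 # diagnostic colonoscopy, post- vs. prepolicy
print(f"{1 - odds_ratio:.0%} lower odds")  # -> "42% lower odds"

# The 95% confidence interval bounds translate the same way:
ci_low, ci_high = 0.55, 0.61
print(f"{1 - ci_high:.0%} to {1 - ci_low:.0%} lower odds")  # -> "39% to 45%"
```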

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc. ... and two, to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”

The investigators and Dr. Tinmouth reported no conflicts of interest.

FROM DDW 2021

Adversity accelerates aging at early ages, now measurable in real-time

Article Type
Changed
Tue, 05/25/2021 - 10:40

Adversity in early life – whether preterm birth or socioeconomic disadvantage in childhood – accelerates aging, according to two recent studies, but underlying mechanisms remain unclear, and methods of investigation continue to evolve.

While one study used an established epigenetic clock to measure biological age among adults with extremely low birth weight, the other showcased a relatively new tool to measure pace of biological aging in disadvantaged children, suggesting that the metric may one day serve as a real-time measure of interventional efficacy.

These findings build upon previous studies that have demonstrated a correlation between biological age, also known as methylation age, and an increased risk of health problems later in life, according to Daniel A. Notterman, MD, professor of molecular biology at Princeton (N.J.) University.

“Finding that a person’s methylation age is greater than their chronological age has been taken as evidence of increased ‘biological age’ and perhaps a tendency to greater future morbidity,” Dr. Notterman wrote in a Pediatrics editorial. “Indeed, methylation age is advanced in association with a number of childhood and midlife adversities as well as morbidities such as atherosclerosis, cancer, and obesity.”
 

Extremely low birth weight associated with faster aging in men

For some individuals, accelerated biological aging begins at birth, or even in utero, according to Ryan J. Van Lieshout, MD, PhD, Canada Research Chair in the Perinatal Programming of Mental Disorders and the Albert Einstein/Irving Zucker Chair in Neuroscience at McMaster University, Hamilton, Ont., and colleagues.

The investigators conducted a study involving 45 extremely low birth weight (ELBW) survivors and 49 individuals born at normal birth weight. All participants were drawn from a longitudinal study, conducted between 1977 and 1982, that assessed advances in neonatal intensive care. Controls were recruited at 8 years of age and matched with ELBW survivors based on family socioeconomic status, sex, and age. Follow-up continued through adulthood, allowing the present study to compare data from ages 8, 30, and 35.

Using samples of buccal epithelial cells, the investigators measured biological age with the Horvath epigenetic clock, the most commonly used tool of its kind, which measures cytosine-5 methylation at 353 cytosine-phosphate-guanine sites. Results were adjusted for a variety of covariates, such as smoking status, body mass index, number of chronic health conditions, and others.
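
As a rough illustration of how a clock of this kind turns methylation data into an age estimate, the sketch below follows the general recipe: a weighted sum over per-site methylation fractions plus an intercept, passed through a calibration transform. The weights and intercept are random placeholders rather than the clock’s published coefficients, and the two-branch transform shown follows the commonly described Horvath-style calibration with adult_age = 20.

```python
import numpy as np

# Schematic of an epigenetic-clock prediction. Weights and intercept are
# random placeholders, NOT the published Horvath coefficients; only the
# overall structure (weighted sum + calibration transform) is shown.

rng = np.random.default_rng(0)
N_CPG = 353                               # CpG sites used by the Horvath clock
weights = rng.normal(0.0, 0.01, N_CPG)    # placeholder per-site coefficients
intercept = 0.7                           # placeholder intercept

def methylation_age(betas, adult_age=20):
    """Map a vector of beta values (fraction methylated, 0-1) to years."""
    x = intercept + betas @ weights
    if x < 0:                                   # pediatric branch (log scale)
        return (1 + adult_age) * np.exp(x) - 1
    return (1 + adult_age) * x + adult_age      # adult branch (linear)

betas = rng.uniform(0.0, 1.0, N_CPG)      # one subject's buccal-cell profile
print(f"estimated biological age: {methylation_age(betas):.1f} years")
```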

Between groups, ELBW survivors trended toward older biological age, compared with adults born at normal birth weight (29.0 vs. 27.9 years), a difference that was not statistically significant. Further analysis, however, showed a significant sex-based difference between groups: Male survivors of ELBW, in adulthood, were almost 5 years biologically older than men born at normal birth weight (31.4 vs. 26.9 years; P = .01).

“[W]e provide preliminary evidence of a new link between ELBW and accelerated biological aging among men,” the investigators concluded.

In an accompanying editorial, Pam Factor-Litvak, PhD, vice chair of epidemiology at Columbia University, New York, wrote, “The findings are intriguing and open many questions for further study.”

Dr. Factor-Litvak noted that it remains unclear whether differences in biological aging were present at birth.

“[D]ifferences would provide evidence that accelerated aging begins during the in utero period, perhaps because of maternal undernutrition, stress, or another exposure,” Dr. Factor-Litvak wrote. “[R]eductions in chronic stress levels, which may begin for neonates with ELBW in utero and in the first hours of life, may provide an opportunity for interventions,” she added.

According to Calvin J. Hobel, MD, professor of pediatrics at Cedars-Sinai and professor of obstetrics and gynecology at University of California, Los Angeles, who has been studying preterm birth for more than 40 years, interventions may need to begin even earlier.

“The only way to prevent preterm birth is to do it before women get pregnant,” Dr. Hobel said in an interview. “The reason for preterm birth and poor fetal growth is the fact that the mother has early cardiovascular disease – unrecognized.”

Compared with women who give birth to full-term infants, women who give birth to preterm infants typically have increased blood pressure, Dr. Hobel said. Although these elevations in blood pressure are generally asymptomatic and not high enough to be classified as hypertensive, they impact umbilical artery vascular resistance starting at 28 weeks of gestation.

“In utero, [preterm infants] are programmed for increased vascular resistance and increased risk of cardiovascular disease,” Dr. Hobel said.

Regarding the effects of ELBW in men versus women, Dr. Hobel suggested that dissimilar neuroendocrine systems between sexes may protect females from adverse outcomes, although exact mechanisms remain elusive.

Measuring the impact of socioeconomic status on biological aging, now in real-time

A second study, by Laurel Raffington, PhD, of the University of Texas at Austin, and colleagues, evaluated the relationship between socioeconomic disadvantage in childhood and pace of biological aging.

To do so, they used the DunedinPoAm DNA methylation algorithm, a relatively new tool that was developed by analyzing changes in organ system integrity over time among adults with the same chronological age.

“Whereas epigenetic clocks quantify the amount of aging that has already occurred up to the time of measurement, DunedinPoAm quantifies how fast an individual is aging,” Dr. Raffington and colleagues wrote. “In other words, whereas epigenetic clocks tell you what time it is, pace-of-aging measures tell you how fast the clock is ticking.”
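
The distinction can be made concrete with a toy calculation, shown below: a clock-style measure reports a level at a single visit, while a pace-style measure behaves like a slope across visits. This is a conceptual illustration with invented numbers, not the DunedinPoAm algorithm itself, which is trained to read that rate out of a single methylation sample.

```python
import numpy as np

# Toy contrast between "what time it is" (a clock's level) and "how fast
# the clock is ticking" (a pace). Numbers are invented for illustration.

chron_age = np.array([8.0, 10.0, 12.0, 14.0])  # chronological age at visits
bio_age = np.array([8.5, 11.0, 13.6, 16.1])    # clock estimate at each visit

level = bio_age[-1]                          # epigenetic-clock-style readout
pace = np.polyfit(chron_age, bio_age, 1)[0]  # slope: biological yrs per yr

print(f"biological age at last visit: {level:.1f} years")
print(f"pace of aging: {pace:.2f} biological years per chronological year")
```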

The investigators measured pace of aging in 600 children and adolescents (8-18 years of age) from the Texas Twin Project, “an ongoing longitudinal study that includes the collection of salivary samples.” The final dataset included 457 participants who identified as White, 77 who identified as Latinx, and 61 who identified as both White and Latinx.

The investigators evaluated pace of aging compared with family-level and neighborhood-level socioeconomic status, and tested for confounding by tobacco exposure, BMI, and pubertal development.

This analysis revealed that children experiencing socioeconomic disadvantage were aging more quickly than their peers, in terms of both family-level and neighborhood-level inequity (both levels, r = 0.18; P = .001).

Children who identified as Latinx aged faster than did those who identified as White only or White and Latinx, “consistent with higher levels of disadvantage in this group,” the investigators wrote. “Thus, our findings are consistent with observations that racial and/or ethnic socioeconomic disparities are an important contributor to racial and/or ethnic disparities in health.”

Higher BMI, greater tobacco exposure, and more advanced pubertal development were also associated with more rapid aging. After adjustment for these covariates, however, the significant correlation between socioeconomic disadvantage and rapid aging remained, the investigators noted.
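
As a minimal sketch of that kind of covariate adjustment — not the study's actual model, which also has to respect the twin-family structure of the sample — one could regress pace of aging on a disadvantage score before and after adding the covariates. All variable names and data below are simulated for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600  # matches the reported sample size; all data here are simulated

# Hypothetical analysis table: a disadvantage score plus the three covariates
df = pd.DataFrame({
    "disadvantage": rng.standard_normal(n),   # SES disadvantage (z-score)
    "bmi_z": rng.standard_normal(n),          # body mass index (z-score)
    "tobacco": rng.integers(0, 2, n),         # any tobacco exposure (0/1)
    "puberty": rng.standard_normal(n),        # pubertal development (z-score)
})
# Simulated pace-of-aging outcome with a small disadvantage effect (~r = 0.18)
df["pace"] = (1.0 + 0.02 * df["disadvantage"] + 0.01 * df["bmi_z"]
              + 0.02 * df["tobacco"] + 0.01 * df["puberty"]
              + rng.normal(0, 0.1, n))

# Unadjusted association, then the covariate-adjusted model
unadjusted = smf.ols("pace ~ disadvantage", data=df).fit()
adjusted = smf.ols("pace ~ disadvantage + bmi_z + tobacco + puberty",
                   data=df).fit()

for name, fit in [("unadjusted", unadjusted), ("adjusted", adjusted)]:
    b, p = fit.params["disadvantage"], fit.pvalues["disadvantage"]
    print(f"{name}: beta = {b:.3f}, p = {p:.4f}")
```

If the disadvantage coefficient survives the adjustment, as it did in the study, the association is not simply explained by those three pathways.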

“Our results suggest that salivary DNA methylation measures of pace of aging may provide a surrogate or intermediate endpoint for understanding the health impacts of [childhood] interventions,” the investigators concluded. “Such applications may prove particularly useful for evaluating the effectiveness of health-promoting interventions in at-risk groups.”

Still, more work is needed to understand exactly how socioeconomic disadvantage is associated with accelerated aging.

“Ultimately, not only longitudinal repeated-measures studies but also natural experiment studies and randomized controlled trials of social programs are needed to establish causal effects of social disadvantage on DunedinPoAm-measured pace of aging and to establish DunedinPoAm as a mediator of the process through which childhood disadvantage leads to aging-related health conditions,” the investigators wrote.

In his editorial, Dr. Notterman emphasized this point.

“[I]t is worth remembering that associations with either methylation age or pace of aging and health or longevity may represent the effect of an exposure on both the measure and the outcome of interest rather than a causal pathway that runs from the exposure (low socioeconomic status, adversity) to health outcome (i.e., cancer, vascular disease),” he wrote.

Paul Chung, MD, professor and chair of health systems science at Kaiser Permanente Bernard J. Tyson School of Medicine, Pasadena, Calif., and adjunct professor at the University of California, Los Angeles, called the findings “preliminary,” but noted that confirmation through further research could “fill in some really important gaps.

“Right now, to some degree, we’re at a little bit of an impasse,” Dr. Chung said.

Adverse childhood experiences are “associated very strongly” with mental and physical health issues, Dr. Chung said, “but we don’t know exactly why, and because of that, it’s really hard to come up with social policy solutions that aren’t anything but extremely sort of blunt-ended. We just say, ‘Well, I guess you gotta fix everything.’ And it’s a hard place to be, I think, in the field.”

Although the present study doesn’t resolve this issue, Dr. Chung suggested that the findings “really open the door to a lot of really exciting research that could have a lot of impacts on practice and policy.”

“Sometimes the only way to get people to pay attention enough to generate the level of excitement that would allow you to even do these sorts of studies ... is to generate some initial exploratory data that makes people perk up their ears, and makes people go, ‘Hey, wow, maybe we should be looking into this.’ ”

The study by Dr. Raffington and colleagues was funded by the National Institutes of Health and the Jacobs Foundation, with additional support from the German Research Foundation, a Russell Sage Foundation Biology and Social Science Grant, the Canadian Institute for Advanced Research Child and Brain Development Network, and others. The study by Dr. Van Lieshout and colleagues was supported by the Canadian Institutes of Health Research. Dr. Factor-Litvak and Dr. Notterman reported funding from the National Institutes of Health. All of the investigators and interviewees reported no conflicts of interest.

Article Source

FROM PEDIATRICS
