Dupilumab significantly improves markers of AD severity in pediatric patients

A registry-based study provides further evidence that treatment with dupilumab significantly reduces severity and symptoms of atopic dermatitis (AD) in clinical practice.

Dupilumab also decreased severity-associated biomarkers in pediatric patients with moderate to severe AD, researchers in the Netherlands reported.

Obtaining serum biomarkers is not the current standard in everyday practice, but studying them may improve understanding of who might respond best to dupilumab, said Jessica Hui, MD, a pediatric allergist and immunologist at National Jewish Health in Denver, in an email comment to this news organization.

“AD is heterogeneous, as each patient may have different presentations and underlying biology,” said Dr. Hui, who wasn’t involved in the research. “Studying biomarkers can eventually assist us in providing targeted therapy to each individual patient.”

Dr. Hui added, “As blood biomarkers can inform us of severity and treatment response, we can be hopeful that this will assist us in the management of AD patients in the future.”
 

Examining effect on disease severity

Dupilumab, a monoclonal antibody that inhibits interleukin (IL)-4 and IL-13 signaling, is approved in Europe and the United States to treat moderate to severe AD in patients 6 months of age or older, and to treat certain other inflammatory conditions.

Phase 3 studies show that dupilumab is effective for improving AD symptoms and quality of life in pediatric patients, but few clinical practice studies have researched the effect of the therapy on severity- and disease-related biomarkers in this population, the study authors write.

The study was published online in Pediatric Allergy and Immunology.

For the study, Esmé Kamphuis, MD, of the University of Groningen, the Netherlands, and colleagues evaluated the efficacy and safety of a 28-week course of dupilumab in 61 pediatric patients with moderate to severe AD. The investigators also examined the effect of this treatment regimen on serum biomarkers associated with disease severity.

Patients in the study were registered in the multicenter BioDay registry, which includes patients with moderate to severe AD receiving biologics or small-molecule agents. The AD cohort included children aged 6 to less than 12 years (n = 16) and adolescents aged 12 to less than 18 years (n = 45), all of whom received dupilumab dosed according to age and body weight.

Over one-third (36.1%) of dupilumab-treated patients achieved an Investigator Global Assessment score of “almost clear” by 28 weeks of treatment. At the 7-month follow-up, 75.4% of patients had achieved at least a 50% improvement in Eczema Area and Severity Index score (EASI-50), 49.2% had achieved EASI-75, and 24.6% had achieved EASI-90.

Among patient-reported outcomes, 84.7% experienced improvements of 4 or more points on the Patient-Oriented Eczema Measure after the 28-week dupilumab treatment. In addition, improvements of 4 or more points on the Numeric Rating Scale for pruritus and pain were achieved by 45.3% and 77.4% of patients, respectively.

The most frequently reported side effects included conjunctivitis (n = 10) and headache (n = 4).

Of the 19 severity-associated serum biomarkers measured at baseline, week 4, and week 16, those related to AD severity and treatment response (thymus- and activation-regulated chemokine, pulmonary and activation-regulated chemokine, periostin, and soluble IL-2 receptor alpha) decreased significantly during treatment.

A predicted EASI, calculated from selected biomarkers, demonstrated a significant association with disease severity in the cohort.

Implications for practice

When asked to comment on the study findings, Raegan Hunt, MD, the division chief of pediatric dermatology at Texas Children’s Hospital in Houston, said it is important to validate the changes in AD serum biomarkers in pediatric patients on dupilumab therapy, given that this treatment has historically been better studied in adults.

“This study adds to daily practice outcomes data, which in many cases is more relevant to the everyday care of patients than structured clinical trial data,” said Dr. Hunt, an associate professor at the Baylor College of Medicine, Houston.

Dr. Hunt, who didn’t participate in the study, noted that more research is needed on the adverse effects of dupilumab in the pediatric AD population.

Dr. Hui added that there is a lack of clear understanding of the exact underlying mechanisms for certain side effects, such as conjunctivitis, warranting further study.

The study’s BioDay registry is funded by Sanofi/Regeneron, AbbVie, Leo Pharma, Pfizer, and Eli Lilly. Several study coauthors report relationships with pharmaceutical companies. Dr. Hunt and Dr. Hui report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Infant anaphylaxis: Study characterizes symptoms, treatment

A majority of infants who presented to the emergency department with anaphylaxis appropriately received epinephrine, with symptoms typically resolving after a single treatment dose, research findings indicate.

Given that early administration of epinephrine can be lifesaving for infants with anaphylaxis, the study highlights real-world success in the increased uptake of this treatment in a vulnerable patient population.

Most infants in the study who presented to the ED and received epinephrine were able to be discharged home after just a few hours, with only 1 out of 10 requiring hospitalization.

The study also reported that most symptoms were in the skin/mucosal, gastrointestinal, respiratory, and cardiovascular (CV) systems, providing improved characterization of anaphylaxis symptoms in the infant population.

Nearly “all episodes were triggered by food – especially egg, peanut, milk, and cashew,” commented Colleen Shannon, MD, a pediatrician at Children’s Hospital of Philadelphia, who presented the research findings at the annual meeting of the American College of Allergy, Asthma, and Immunology.

Dr. Shannon noted that despite previous research demonstrating age-based differences in the presentation of anaphylaxis, the symptomatology of anaphylaxis in infants has not been robustly characterized. Better characterization of anaphylaxis in infants with allergies may help ensure earlier and more accurate diagnosis and management, she said.

For the study, the researchers performed a retrospective chart review of 169 patients between 0 and 24 months of age (mean age, 1.0 years) who presented to the emergency department of a pediatric tertiary referral center between 2019 and 2022.

All patients in the study met diagnostic criteria for anaphylaxis. The investigators used patients’ medical records to evaluate demographics, presenting symptoms, and treatment.

More than half (56.2%) of infants in the study were 12 months of age or younger, and 64.5% were male.

Nearly all (96.5%) anaphylaxis episodes presenting to the ED were triggered by food. The most common foods triggering these episodes were egg (26.6%), peanut (25.4%), milk (13.6%), and cashew (10.1%).

Most symptoms involved the skin/mucosal (97.6%) and GI (74.6%) systems, followed by respiratory (56.8%) and CV (34.3%) systems. Isolated tachycardia was recorded in 84.5% of patients with CV-related symptoms.

Epinephrine was administered to 86.4% of infants who presented to the ED with anaphylaxis. Nearly a third (30.1%) of these infants received epinephrine before arriving at the ED, and 9.5% required more than one dose.

The researchers also found that 10.1% of patients required hospital admission, but none had symptoms severe enough to require intensive care.

Jennifer Hoffmann, MD, an emergency medicine physician at the Lurie Children’s Hospital of Chicago, told this news organization that while characterizing anaphylaxis symptoms is relevant for clinicians, it also remains vitally important “to teach parents of infants how to recognize the signs of anaphylaxis, particularly as they begin to introduce new foods,” to ensure timely treatment.

She added that because most infants in the study improved after a single dose of epinephrine, most infants presenting to the ED with anaphylaxis can be safely discharged home after only a brief period of observation. “That is, age alone should not be a reason for admission,” explained Dr. Hoffmann, who wasn’t involved in the research.

The study was independently supported. Dr. Shannon and Dr. Hoffmann report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Hairdressers have ‘excess risk’ of contact allergies

When compared with the general population, hairdressers experience an excess risk of contact allergy linked to hair cosmetic ingredients, a systematic review suggests.

“Research has shown that up to 70% of hairdressers suffer from work-related skin damage, mostly hand dermatitis, at some point during their career,” write Wolfgang Uter of Friedrich-Alexander University Erlangen-Nürnberg and coauthors. In general, they write, occupational skin diseases such as hand dermatitis represent up to 35% of reported occupational diseases. The study was published online in Contact Dermatitis.

Wet work and skin contact with detergents and hairdressing chemicals are top risk factors for developing occupational skin disease in this population, according to the researchers.

To further understand the burden of occupational contact allergy in hairdressers, the investigators gathered evidence published since 2000 on contact allergies to hair cosmetic chemicals. They searched the literature for nine substances selected beforehand by experts and stakeholders. The researchers also compared the prevalence of sensitization between hairdressers and other individuals who underwent skin patch testing.
 

Substance by substance

Common potentially sensitizing cosmetic ingredients reported across studies included p-phenylenediamine (PPD), persulfates (mostly ammonium persulfate [APS]), glyceryl thioglycolate (GMTG), and ammonium thioglycolate (ATG).

In a pooled analysis, the overall prevalence of contact allergy to PPD was 4.3% among consecutively patch-tested patients but 28.6% among hairdressers specifically, the reviewers reported.

The pooled prevalence of contact allergy to APS was 5.5% in consumers and 17.2% in hairdressers. In other review studies, contact allergy risks to APS, GMTG, and ATG were also elevated in hairdressers compared with all controls.

The calculated relative risk (RR) of contact allergy to PPD was approximately 5.4 times higher for hairdressers, while the RR for ATG sensitization was 3.4 in hairdressers compared with consumers.

Commenting on these findings, James A. Yiannias, MD, professor of dermatology at the Mayo Medical School, Phoenix, told this news organization in an email that many providers and patients are concerned only about hair dye molecules such as PPD and aminophenol, as well as permanent-wave and straightening chemicals such as GMTG.

“Although these are common allergens in hairdressers, allergens such as fragrances and some preservatives found in daily hair care products such as shampoos, conditioners, and hair sprays are also common causes of contact dermatitis,” said Dr. Yiannias, who wasn’t involved in the research.

Consequences of exposure

Dr. Yiannias explained that progressive worsening of the dermatitis can occur with ongoing allergen exposure and, if not properly mitigated, can lead to bigger issues. “Initial nuisances of mild irritation and hyperkeratosis can evolve to a state of fissuring with the risk of bleeding and significant pain,” he said.

But once severe and untreated dermatitis occurs, Dr. Yiannias said that hairdressers “may need to change careers” or at least face short- or long-term unemployment.

The researchers suggest that reducing allergen exposure is key to preventing symptoms, adding that adequate guidance on the safe use of new products is needed. They also recommended that vocational schools more rigorously implement education for hairdressers on how to protect the skin appropriately at work.

“Hairdressers are taught during their training to be cautious about allergen exposure by avoiding touching high-risk ingredients such as hair dyes,” Dr. Yiannias added. “However, in practice, this is very difficult since the wearing of gloves can impair the tactile sensations that hairdressers often feel is essential in performing their job.”

The study received no industry funding. Dr. Yiannias reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Early emollient use reduces dermatitis in at-risk infants

Recent study findings published in Allergy (2022 Aug 23. doi: 10.1111/all.15491) suggest that twice-daily application of emollients within the first 8 weeks of life significantly reduces the cumulative incidence of atopic dermatitis (AD) among infants at high risk for the condition, at least within the first year of life.

The single-center STOP-AD clinical trial recruited term infants within 4 days of birth who were at high risk for AD, as determined on the basis of a parent-reported history of AD, asthma, or allergic rhinitis. Infants were randomly assigned to either a standard skin care routine (control group; n = 160) or twice-daily emollient application for the first 8 weeks of life (intervention group; n = 161).

In the intervention group, infants received an emollient that was specifically formulated for AD-prone skin. The control group received standard skin care advice, which did not include specific advice on bathing frequency or regular emollient use.

The mean age of the infants at randomization was 1.9 days. A total of 41 infants in the intervention group and 20 infants in the control group were withdrawn from the study. Most withdrawals (80%) occurred prior to the 2-week visit.

At 12 months, the cumulative incidence of AD was 32.8% in the intervention group and 46.4% in the control group (P = .036). The investigators note that daily emollient use was associated with a 29% lower risk of cumulative AD at 1 year in comparison with the control intervention.

No significant difference was observed between the groups regarding the incidence of parent-reported skin infections during the treatment period (5.0% vs. 5.7%; P > .05).

Study investigator Jonathan O’Brien Hourihane, MBBS, of the Royal College of Surgeons in Dublin, said in an interview that previously published findings from the BASELINE study supported the rationale for the early use of emollients in infancy to prevent AD.

The investigators of the BASELINE study found that skin barrier function, as measured by transepidermal water loss, increased from birth to 8 weeks but then became stable at 6 months. These observations suggest that the period during early infancy “could be a critical window in which to protect the skin barrier” of infants at risk for AD, Dr. Hourihane added.

Dr. Hourihane, who serves as head of the department of pediatrics at the Royal College of Surgeons, explained that the long-term clinical burden of AD is often more significant when the condition begins earlier in life, underscoring the importance of early prevention and control.

“The causal role [of AD] in other allergic conditions remains suspected but not proven, but its association is clear,” he said. He noted that infants with eczema “also have poorer sleep, and the condition causes increased family disruption,” highlighting the far-reaching burden of AD.

Commenting on the study, Adelaide Hebert, MD, professor of pediatric dermatology at the University of Texas, Houston, said in an interview that the barrier defect observed in AD is one of the prime areas to address as a means of controlling the chronic, relapsing disorder. She noted that the use of emollients can repair this defective barrier.

“Early initiation of emollients has the potential to reduce dryness, itching, transgression of allergens, and infectious agents,” explained Dr. Hebert, who wasn’t involved in the study. “Emollient application also allows the parent to inspect the skin surface and address any challenges in a timely manner.”

In the STOP-AD trial, Dr. Hourihane and colleagues also found that, among patients with loss-of-function (LoF) mutations in the filaggrin gene (FLG), the prevalence of AD at 6 and 12 months seemed to be higher than among patients with the wild-type gene, but the difference did not reach statistical significance.

Commenting on this finding, Dr. Hebert noted that LoF FLG mutation carriers may benefit especially from emollient use, given that LoF mutations in FLG are associated with reduced production of natural moisturizing factors in the skin.

Regarding future research directions, Dr. Hourihane stated that there is a need for replication and validation of the findings in studies that include infants from different ethnic backgrounds as well as those from various social settings. These studies should also include variable treatment windows to determine both short- and longer-term effects of emollient use in this population, Dr. Hourihane explained.

Dr. Hourihane added that he and the investigators do not yet understand which aspect of the study’s program was key for reducing the incidence of AD in the first year of life. “The timing of emollient initiation, the duration of treatment, the products, or maybe just a combination of these” could be possible explanations.

The study was independently supported. Dr. Hourihane reported receiving grant funding from Aimmune Therapeutics and DBV Technologies. Dr. Hebert reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Ulcerative colitis: New AI may standardize endoscopic classification of activity

A newly developed artificial intelligence (AI) model accurately evaluated endoscopic images from patients with ulcerative colitis (UC), according to new research. The AI could even distinguish between all four Mayo endoscopic subscore (MES) levels of disease activity, which is a first among similar AI models, the researchers noted.

Although management of UC involves disease activity monitoring and prompt response with appropriate therapy, endoscopic assessment has shown significant intra- and interobserver variation, thereby reducing the reliability of individual evaluations. Techniques that use AI may eliminate observer variation and aid in distinguishing between all levels of endoscopic activity with good accuracy.

“However, up until now, only a few computer-assisted diagnostic tools have been available for UC, and none are capable of distinguishing between all levels of endoscopic activity with sufficient accuracy,” wrote study authors Bobby Lo, MD, of the Copenhagen University Hospital Hvidovre, and colleagues, who published their findings in The American Journal of Gastroenterology. The researchers believe their new AI could optimize and standardize the assessment of UC severity measured by MES, regardless of the operator’s level of expertise.

The researchers extracted 1,484 unique endoscopic images from 467 patients with UC (median age, 45 years; 45.3% male) who had undergone a colonoscopy or sigmoidoscopy. Images of healthy colon mucosa were also extracted from a colorectal cancer surveillance program “to adequately reflect the distribution in the clinic,” the researchers wrote.

Two experts, blinded to clinical details and other identifying information, separately scored all images according to the MES. A third expert, blinded to the results from the first two, scored images in cases of disagreement. Nearly half of the images (47.3%) were classified as normal, while 26.0% were deemed MES 1 (mild activity), 20.2% were classified as MES 2 (moderate activity), and 6.5% were classified as MES 3 (severe activity).

All endoscopic images were randomly split into a training dataset (85%) and a testing dataset (15%) with stratified sampling. Several convolutional neural network architectures were considered for automatically classifying the severity of UC. The investigators used fivefold cross-validation on the training data to develop and select the optimal final model, which they then evaluated on the unseen test dataset.
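As an orientation for readers, the following is a minimal sketch, assuming scikit-learn, of how such a stratified 85/15 split and fivefold cross-validation could be set up. The feature matrix, labels, and seeds are invented placeholders standing in for the study’s images and MES scores, not the authors’ code.

```python
# Illustrative only: stratified train/test split plus fivefold cross-validation,
# mirroring the workflow described above. X is a placeholder feature matrix
# (one row per image) and y holds placeholder MES labels 0-3.
import numpy as np
from sklearn.model_selection import StratifiedKFold, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1484, 256))      # stand-in for image data or embeddings
y = rng.integers(0, 4, size=1484)     # stand-in for MES 0-3 labels

# Hold out 15% as the unseen test set, preserving the class distribution.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.15, stratify=y, random_state=42
)

# Fivefold cross-validation on the training data for model selection.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (tr_idx, val_idx) in enumerate(cv.split(X_train, y_train), start=1):
    # Candidate architectures would be trained on X_train[tr_idx]
    # and validated on X_train[val_idx] here.
    print(f"fold {fold}: {len(tr_idx)} train / {len(val_idx)} validation images")
```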

The final chosen model was the EfficientNetB2, given the superiority of its mean accuracy during cross-validation. This model, according to the researchers, is able to “process images significantly faster and requires less computing power than InceptionNetV3,” which was the other model evaluated in the study.
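Purely as an illustration of what a transfer-learning setup along these lines can look like, the sketch below adapts a pretrained EfficientNet-B2 from torchvision to a four-class MES head. The framework choice, learning rate, and layer indexing are assumptions; the authors’ actual implementation is not reproduced here.

```python
# Hypothetical transfer-learning sketch, not the study's code: swap the pretrained
# classifier of EfficientNet-B2 for a 4-way head covering MES 0-3.
import torch.nn as nn
from torch.optim import Adam
from torchvision import models

model = models.efficientnet_b2(weights=models.EfficientNet_B2_Weights.DEFAULT)
in_features = model.classifier[1].in_features    # final linear layer of the pretrained head
model.classifier[1] = nn.Linear(in_features, 4)  # new head for the four MES classes

criterion = nn.CrossEntropyLoss()
optimizer = Adam(model.parameters(), lr=1e-4)    # assumed hyperparameter
# Training would then iterate over batches of labeled endoscopic images.
```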

The test accuracy of the final model in distinguishing between all categories of MES was 0.84. The investigators evaluated the model on binary tasks of distinguishing MES 0 versus MES 1-3 and MES 0-1 versus 2-3. They found the model achieved accuracies of 0.94 and 0.93 and areas under the receiver operating characteristic curves of 0.997 and 0.998, respectively.
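Such binary results can be derived from four-class predictions by collapsing the class probabilities, as in the hedged sketch below; the label and probability arrays are invented placeholders rather than study data.

```python
# Illustrative evaluation sketch: collapse four-class MES outputs into the two binary
# tasks reported above (MES 0 vs. 1-3 and MES 0-1 vs. 2-3).
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

y_true = np.array([0, 1, 2, 3, 0, 0, 1, 2])                                # placeholder MES labels
probs = np.random.default_rng(1).dirichlet(np.ones(4), size=len(y_true))  # fake softmax outputs

# Task 1: remission (MES 0) vs. any activity (MES 1-3)
y_bin = (y_true >= 1).astype(int)
p_act = probs[:, 1:].sum(axis=1)               # predicted probability of MES 1-3
print("accuracy:", accuracy_score(y_bin, (p_act >= 0.5).astype(int)))
print("AUROC:", roc_auc_score(y_bin, p_act))

# Task 2: inactive/mild (MES 0-1) vs. moderate/severe (MES 2-3)
y_bin2 = (y_true >= 2).astype(int)
p_sev = probs[:, 2:].sum(axis=1)
print("accuracy:", accuracy_score(y_bin2, (p_sev >= 0.5).astype(int)))
print("AUROC:", roc_auc_score(y_bin2, p_sev))
```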

The researchers used roughly 10-fold fewer images than similar studies have, yet noted that the developed model demonstrated an accuracy of around 0.74 “even when using images from another cohort” that had lower image quality. They added that the model could have achieved better results if more data had been available, citing this as a limitation of the study.

“In conclusion, we have developed a deep learning model that exceeded previously reported results in classifying endoscopic images from UC patients. This may automate and optimize the evaluation of disease severity in both clinical and academic settings and ideally in clinical trials,” they wrote. “Finally, this study serves as a stepping stone for future projects, including the use of video material and the assessment of long-term outcomes.”

The authors reported no relevant conflicts of interest.


Fecal microbiota transplants: Finding new microbial markers for donor efficacy in UC

Article Type
Changed
Mon, 08/08/2022 - 09:27

Donor microbiota stability and species evenness, along with the presence of certain microbial species, may predict donor efficacy in fecal microbiota transplantation (FMT) for the treatment of patients with ulcerative colitis (UC), a new study suggests.

The authors noted that these markers of donor efficacy could be used to optimize selection of donors to treat patients with UC and improve outcomes.

The investigators hypothesized that “there are features beyond microbial richness, individual bacterial species, and specific metabolites that may aid in successful identification of effective donors.” They published these findings in Gut.

The LOTUS clinical trial explored the efficacy of lyophilized FMT in patients with UC, but was cut short because of the COVID-19 pandemic. The study investigators analyzed fecal samples from the two donors enrolled in the trial to identify functional and taxonomic differences within the donors’ microbiota that have clinical relevance to their efficacy in active UC. Donor 1’s samples showed 100% efficacy among patients with UC, while donor 2’s samples showed 36% efficacy.

In donor 1, the researchers observed “robust stability in species richness” across the sampling periods, whereas donor 2 exhibited larger fluctuations. Although species richness was significantly greater in donor 2, the researchers reported that donor 1 exhibited significantly greater diversity at the higher taxonomic level of phylum, reflected by the detection of Euryarchaeota, Synergistetes, and Verrucomicrobia in donor 1 but not in donor 2.

Despite the higher species richness in donor 2, the researchers found that a higher rate of uniquely classified metagenome-assembled genomes was produced per sample in donor 1, indicating greater species evenness, the second marker of efficacy according to the investigators.
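For readers unfamiliar with these metrics, species richness and evenness can be computed from a relative-abundance profile; the minimal sketch below uses Pielou’s evenness (Shannon diversity divided by the log of richness) as one common formulation. The species proportions are hypothetical, and the study’s exact calculations may differ.

```python
# Minimal sketch of donor-level richness and evenness from a relative-abundance profile.
import numpy as np

def richness_and_evenness(abundances):
    """Return species richness and Pielou's evenness (Shannon diversity / ln richness)."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0]
    p = p / p.sum()
    richness = len(p)
    shannon = -(p * np.log(p)).sum()
    evenness = shannon / np.log(richness) if richness > 1 else 0.0
    return richness, evenness

donor_profile = [0.30, 0.25, 0.20, 0.15, 0.10]   # hypothetical species proportions
print(richness_and_evenness(donor_profile))      # e.g. (5, ~0.96): few species, fairly even
```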

Blautia wexlerae was a highly prevalent metagenome-assembled genome that was enriched in donor 1 compared with donor 2, and the researchers reported that “a taxon with high similarity (OTU215) showed evidence of engraftment in patients receiving donor 1.” In addition, B. wexlerae trended toward enrichment in the donor 2 samples associated with positive patient outcomes and showed evidence of engraftment in patients who received donor 2.

Ninety bacterial species and one archaeon were differentially abundant between the donors, including 44 species present at greater than 0.1% relative abundance in donor samples. According to the researchers, 17 of these 44 species were enriched in the effective donor; 11 of the 17 (64.7%) were assembled into high-quality genomes that were highly prevalent in that donor, and 6 showed evidence of engraftment in patients.

Lastly, the investigators sought to validate the observed associations between certain microbial taxa and donor clinical efficacy in an independent cohort. In this analysis, the investigators evaluated shotgun metagenomics data of donors used to treat patients with UC and examined relative abundances against patient outcomes. Species associated with treatment success included Ruminococcus bromii, B. wexlerae, Eubacterium hallii, Coprococcus catus, Fusicatenibacter saccharivorans, and Parabacteroides merdae.
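As a purely illustrative example of this kind of validation, one simple approach is to test whether a species’ relative abundance differs between donors linked to treatment success and failure, for instance with a Mann-Whitney U test. Both the values and the choice of test below are assumptions for illustration, not the study’s reported methods.

```python
# Hypothetical sketch: compare a species' relative abundance between donor samples
# associated with treatment success and those associated with failure.
from scipy.stats import mannwhitneyu

abundance_success = [0.021, 0.034, 0.018, 0.041, 0.027]   # invented values, e.g. B. wexlerae
abundance_failure = [0.006, 0.011, 0.004, 0.009, 0.013]

stat, p_value = mannwhitneyu(abundance_success, abundance_failure, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.4f}")
```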

“We identified microbiota stability and species evenness as markers of donor efficacy, as well as specific microbial species that could be employed to improve donor selection and build artificial microbial consortia to treat UC,” the investigators concluded.

Given that the study enrolled only two donors, the generalizability of the findings may be limited. The researchers wrote that another limitation of the data analysis was “the bias towards more relatively abundant taxa due to the inability to assemble genomes from low-abundance species.” The lack of prospective validation studies on the novel metrics is another limitation of the study.

Some investigators disclosed relationships with biomedical companies, such as Takeda and Janssen.


AI-based CADe outperforms high-definition white light in colonoscopy

Augment, but not replace
Article Type
Changed
Mon, 06/13/2022 - 14:12

An artificial intelligence (AI)–based computer-aided polyp detection (CADe) system missed fewer adenomas, polyps, and sessile serrated lesions and identified more adenomas per colonoscopy than a high-definition white light (HDWL) colonoscopy, according to findings from a randomized study.

While adenoma detection by colonoscopy is associated with a reduced risk of interval colon cancer, detection rates of adenomas vary among physicians. AI approaches, such as machine learning and deep learning, may improve adenoma detection rates during colonoscopy and thus potentially improve outcomes for patients, suggested study authors led by Jeremy R. Glissen Brown, MD, of the Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, who reported their trial findings in Clinical Gastroenterology and Hepatology.

The investigators explained that, although AI approaches may offer benefits in adenoma detection, there have been no prospective data for U.S. populations on the efficacy of an AI-based CADe system for improving adenoma detection rates (ADRs) and reducing adenoma miss rates (AMRs). To overcome this research gap, the investigators performed a prospective, multicenter, single-blind randomized tandem colonoscopy study which assessed a deep learning–based CADe system in 232 patients.

Individuals who presented to the four included U.S. medical centers for either colorectal cancer screening or surveillance were randomly assigned to the CADe system colonoscopy first (n = 116) or HDWL colonoscopy first (n = 116). This was immediately followed by the other procedure, in tandem fashion, performed by the same endoscopist. AMR was the primary outcome of interest, while secondary outcomes were adenomas per colonoscopy (APC) and the miss rate of sessile serrated lesions (SSL).
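For orientation, the sketch below shows how tandem-study outcomes such as AMR, APC, and ADR are typically derived from per-patient adenoma counts, with adenomas found only on the second pass counted as missed. The counting convention and toy data are illustrative assumptions, not the trial’s analysis code.

```python
# Illustrative derivation of tandem-colonoscopy outcomes from per-patient counts.
# AMR = adenomas found only on the second pass / all adenomas found across both passes.
# APC = adenomas found on the first pass / number of colonoscopies.
# ADR = proportion of patients with at least one adenoma on the first pass.
def tandem_outcomes(records):
    """records: list of (first_pass_adenomas, second_pass_only_adenomas) per patient."""
    first = sum(a for a, _ in records)
    missed = sum(b for _, b in records)
    amr = missed / (first + missed) if (first + missed) else 0.0
    apc = first / len(records)
    adr = sum(1 for a, _ in records if a > 0) / len(records)
    return amr, apc, adr

# Hypothetical counts for five patients: (adenomas on first pass, additional on second pass).
example = [(2, 1), (0, 0), (1, 0), (0, 1), (3, 0)]
print(tandem_outcomes(example))   # -> (0.25, 1.2, 0.6) for this toy cohort
```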

The researchers excluded nine patients, leaving a final population of 223. Approximately 45.3% of the cohort was female, 67.7% were White, and 21% were Black. Most patients (60%) underwent the procedure for primary colorectal cancer screening.

The AMR was significantly lower in the CADe-first group than in the HDWL-first group (20.12% vs. 31.25%; P = .0247). The researchers commented that, although the CADe system resulted in a statistically significantly lower AMR, the rate still reflects missed adenomas.

Additionally, the CADe-first group had a lower SSL miss rate than the HDWL-first group (7.14% vs. 42.11%; P = .0482). The researchers noted that theirs is one of the first studies to show that a computer-assisted polyp detection system can reduce the SSL miss rate. The first-pass APC was also significantly higher in the CADe-first group (1.19 vs. 0.90; P = .0323). No statistically significant difference was observed between the groups in first-pass ADR (50.44% for the CADe-first group vs. 43.64% for the HDWL-first group; P = .3091).

A multivariate logistic regression analysis identified three significant factors predictive of missed polyps: use of HDWL first vs. the computer-assisted detection system first (odds ratio, 1.8830; P = .0214), age 65 years or younger (OR, 1.7390; P = .0451), and right colon vs. other location (OR, 1.7865; P = .0436).
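A minimal sketch of such a multivariable logistic regression, with odds ratios obtained by exponentiating the fitted coefficients, is shown below (assuming statsmodels). The dataset is synthetic, and the variable names merely mirror the three predictors reported above.

```python
# Illustrative only: logistic regression of missed polyps on three binary predictors,
# with odds ratios = exp(coefficients). Data are randomly generated, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hdwl_first": rng.integers(0, 2, 300),   # 1 = HDWL-first arm
    "age_le_65": rng.integers(0, 2, 300),    # 1 = age 65 years or younger
    "right_colon": rng.integers(0, 2, 300),  # 1 = polyp located in the right colon
    "missed": rng.integers(0, 2, 300),       # 1 = polyp missed on first pass
})

X = sm.add_constant(df[["hdwl_first", "age_le_65", "right_colon"]])
fit = sm.Logit(df["missed"], X).fit(disp=0)

print(pd.DataFrame({"OR": np.exp(fit.params), "p": fit.pvalues}))
```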

According to the researchers, the study was not powered to identify differences in ADR, thereby limiting the interpretation of this analysis. In addition, the investigators noted that the tandem colonoscopy study design is limited in its generalizability to real-world clinical settings. Also, given that endoscopists were not blinded to group assignments while performing each withdrawal, the researchers commented that “it is possible that endoscopist performance was influenced by being observed or that endoscopists who participated for the length of the study became over-reliant on” the CADe system during withdrawal, resulting in an underestimate or overestimation of the system’s performance.

The authors concluded that their findings suggest that an AI-based CADe system with colonoscopy “has the potential to decrease interprovider variability in colonoscopy quality by reducing AMR, even in experienced providers.”

This was an investigator-initiated study, with research software and study funding provided by Wision AI. The investigators reported relationships with Wision AI, as well as Olympus, Fujifilm, and Medtronic.

Body

Several randomized trials testing artificial intelligence (AI)–assisted colonoscopy have shown improvement in adenoma detection. This study adds to the growing body of evidence that computer-aided detection (CADe) systems augment adenoma detection rates, even among highly skilled endoscopists whose baseline ADRs are well above the currently recommended threshold for quality colonoscopy (25%).

This study also highlights the usefulness of CADe in aiding detection of sessile serrated lesions (SSLs). Recognition of SSLs appears to be challenging for trainees, and these are the most likely type of large adenoma to be missed overall.

Given its superior performance compared with high-definition white light colonoscopy, AI-assisted colonoscopy will likely soon become standard of care. Beyond detection programs such as CADe, there will be systems that aid diagnosis and predict histology, such as CADx, as well as other AI programs that evaluate the quality of the endoscopist’s colon examination. CADe systems are currently quite expensive but are expected to become more affordable as new products reach the market.

AI-based systems will enhance but will not replace the highly skilled operator. As this study pointed out, despite the superior ADR, adenomas were still missed with CADe, mainly because the missed polyps were never brought into the visual field by the operator. A combination of a CADe program and a distal-attachment mucosal exposure device in the hands of an experienced endoscopist might bring the best results.

Monika Fischer, MD, is an associate professor of medicine at Indiana University, Indianapolis. She reported no relevant conflicts of interest.


Esophageal cancer screening isn’t for everyone: Study

One size doesn’t fit all
Article Type
Changed
Tue, 06/21/2022 - 11:08

Endoscopic screening for esophageal adenocarcinoma (EAC) may not be a cost-effective strategy for all populations, and may even lead to net harm in some, according to a comparative cost-effectiveness analysis.

Several U.S. guidelines suggest the use of endoscopic screening for EAC, yet these guidelines vary in terms of which populations should be screened, according to study authors led by Joel H. Rubenstein, MD, of the Lieutenant Charles S. Kettles Veterans Affairs Medical Center, Ann Arbor, Mich. Their findings were published in Gastroenterology. In addition, no randomized trials to date have evaluated endoscopic screening outcomes among different populations; population screening recommendations in the current guidelines have been informed mostly by observational data and expert opinion.

Existing cost-effectiveness analyses of EAC screening have mostly focused on screening older men with gastroesophageal reflux disease (GERD) at certain ages, and many of these analyses have limited data regarding diverse patient populations.

In their study, Dr. Rubenstein and colleagues performed a comparative cost-effectiveness analysis of endoscopic screening for EAC, both in the general population and restricted to individuals with GERD symptoms, stratified by race and sex. The primary objective of the analysis was to identify the optimal age at which to offer endoscopic screening in each of the populations evaluated.

The investigators conducted their comparative cost-effectiveness analyses using three independent simulation models. The independently developed models – which cover EAC natural history, screening, surveillance, and treatment – are part of the National Cancer Institute’s Cancer Intervention and Surveillance Modeling Network. In each model, four cohorts defined by race (White or Black) and sex were independently calibrated to reproduce EAC incidence in the United States. The three models were based on somewhat different structures and assumptions; for example, two of the models assumed a stable prevalence of GERD symptoms of approximately 20% across ages, while the third assumed a near-linear increase across adulthood. All three assumed that EAC develops only in individuals with Barrett’s esophagus.

In each base case, the researchers simulated cohorts of people in the United States who were born in 1950, and then stratified these individuals by race and sex and followed each individual from 40 years of age until 100 years of age. The researchers considered 42 strategies, such as no screening, a single endoscopic screening at six specified ages (between 40 and 65 years of age), and a single screening in individuals with GERD symptoms at the six specified ages.

Primary results were averaged across all three models. The optimal screening strategy, as defined by the investigators, was the most effective strategy with an incremental cost-effectiveness ratio below $100,000 per quality-adjusted life-year gained.
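In rough code terms, that decision rule can be sketched as follows. The strategy names, costs, and QALY values are invented, and the simplified frontier logic ignores the extended-dominance checks that full cost-effectiveness analyses apply.

```python
# Simplified sketch of picking the optimal strategy under a willingness-to-pay threshold:
# walk up the strategies in order of effectiveness and accept a more effective strategy
# whenever its incremental cost-effectiveness ratio (ICER) versus the current choice is
# below the threshold. (Extended dominance is ignored in this toy version.)
def optimal_strategy(strategies, wtp=100_000):
    """strategies: list of (name, cost_in_dollars, qalys); returns the chosen name."""
    ordered = sorted(strategies, key=lambda s: s[2])   # least to most effective
    best = ordered[0]
    for s in ordered[1:]:
        d_cost, d_qaly = s[1] - best[1], s[2] - best[2]
        if d_qaly > 0 and d_cost / d_qaly < wtp:
            best = s                                   # cost-effective upgrade
    return best[0]

# Hypothetical strategies for one simulated cohort (not study results).
example = [("no screening", 0, 10.000),
           ("screen GERD once at 55", 800, 10.010),
           ("screen everyone once at 45", 4_500, 10.012)]
print(optimal_strategy(example))   # -> "screen GERD once at 55" under $100,000/QALY
```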

The most effective – yet the most costly – screening strategies for White men were those that screened all of them once between 40 and 55 years of age. The optimal screening strategy, however, was one that screened individuals with GERD twice, once at age 45 years and again at 60 years. The researchers determined that screening Black men with GERD once at 55 years of age was optimal.

By contrast, the optimal strategy for women, whether White or Black, was no screening at all. “In particular, among Black women, screening is, at best, very expensive with little benefit, and some strategies cause net harm,” the authors wrote.

The investigators wrote that there is a need for empiric, long-term studies “to confirm whether repeated screening has a substantial yield of incident” Barrett’s esophagus. The researchers also noted that their study was limited by the lack of inclusion of additional risk factors, such as smoking, obesity, and family history, which may have led to different conclusions on specific screening strategies.

“We certainly acknowledge the history of health care inequities, and that race is a social construct that, in the vast majority of medical contexts, has no biological basis. We are circumspect regarding making recommendations based on race or sex if environmental exposures or genetic factors on which to make those recommendations were available,” they wrote.

The study was supported by National Institutes of Health/National Cancer Institute grants. Some authors disclosed relationships with Lucid Diagnostics, Value Analytics Labs, and Cernostics.

Body

Over the past decades we have seen an alarming rise in the incidence of esophageal adenocarcinoma, which is mostly diagnosed at an advanced stage, when curative treatment is no longer an option. Esophageal adenocarcinoma develops from Barrett’s esophagus, which, if known to be present, can be surveilled to detect dysplasia and cancer at an early and curable stage.

Therefore, Barrett’s esophagus and early esophageal adenocarcinoma would be ideal screening targets since this could prevent significant disease burden and health care costs. However, optimal screening strategies should be personalized, cost effective, and most importantly cause no harm to healthy subjects.

Whereas screening for Barrett’s esophagus has so far focused on White males with gastroesophageal reflux, little is known about screening in non-White and non-male populations. Identifying whom and how to screen poses a challenge, and in real life such studies of varied populations would require many patients, years of follow-up, much effort, and substantial costs. Rubenstein and colleagues used three independent simulation models to simulate many different screening scenarios while taking gender and race into account. The outcomes of this study, which demonstrate that one size does not fit all, will be very relevant in guiding future strategies for screening for Barrett’s esophagus and early esophageal adenocarcinoma. Although the study is based on endoscopic screening, the insights gained will also be relevant when considering the use of nonendoscopic screening tools.

R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers. She disclosed having been a consultant for MicroTech and Medtronic and having received speaker fees from Pentax.

Publications
Topics
Sections
Body

Over the past decades we have seen an alarming rise in the incidence of esophageal adenocarcinoma, mostly diagnosed at an advanced stage when curative treatment is no longer an option. Esophageal adenocarcinoma develops from Barrett’s esophagus that, if known to be present, can be surveilled to detect dysplasia and cancer at an early and curable stage.

R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers.
Dr. R.E. Pouw
Therefore, Barrett’s esophagus and early esophageal adenocarcinoma would be ideal screening targets since this could prevent significant disease burden and health care costs. However, optimal screening strategies should be personalized, cost effective, and most importantly cause no harm to healthy subjects.

Whereas currently screening for Barrett’s esophagus focused on White males with gastroesophageal reflux, little was known about screening in non-White and non-male populations. Identifying who and how to screen poses a challenge, and in real life such studies looking at varied populations would require many patients, years of follow-up, much effort and substantial costs. Rubenstein and colleagues used three independent simulation models to simulate many different screening scenarios, while taking gender and race into account. The outcomes of this study, which demonstrate that one size does not fit all, will be very relevant in guiding future strategies regarding screening for Barrett’s esophagus and early esophageal adenocarcinoma. Although the study is based around endoscopic screening, the insights gained from this study will also be relevant when considering the use of nonendoscopic screening tools.

R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers. She disclosed having been a consultant for MicroTech and Medtronic and having received speaker fees from Pentax.

Body

Over the past decades we have seen an alarming rise in the incidence of esophageal adenocarcinoma, mostly diagnosed at an advanced stage when curative treatment is no longer an option. Esophageal adenocarcinoma develops from Barrett’s esophagus that, if known to be present, can be surveilled to detect dysplasia and cancer at an early and curable stage.

R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers.
Dr. R.E. Pouw
Therefore, Barrett’s esophagus and early esophageal adenocarcinoma would be ideal screening targets since this could prevent significant disease burden and health care costs. However, optimal screening strategies should be personalized, cost effective, and most importantly cause no harm to healthy subjects.

Whereas currently screening for Barrett’s esophagus focused on White males with gastroesophageal reflux, little was known about screening in non-White and non-male populations. Identifying who and how to screen poses a challenge, and in real life such studies looking at varied populations would require many patients, years of follow-up, much effort and substantial costs. Rubenstein and colleagues used three independent simulation models to simulate many different screening scenarios, while taking gender and race into account. The outcomes of this study, which demonstrate that one size does not fit all, will be very relevant in guiding future strategies regarding screening for Barrett’s esophagus and early esophageal adenocarcinoma. Although the study is based around endoscopic screening, the insights gained from this study will also be relevant when considering the use of nonendoscopic screening tools.

R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers. She disclosed having been a consultant for MicroTech and Medtronic and having received speaker fees from Pentax.

Title
One size doesn’t fit all
One size doesn’t fit all

Endoscopic screening for esophageal adenocarcinoma (EAC), may not be a cost-effective strategy for all populations, possibly even leading to net harm in some, according to a comparative cost-effectiveness analysis.

Several U.S. guidelines suggest the use of endoscopic screening for EAC, yet recommendations within these guidelines vary in terms of which population should receive screening, according study authors led by Joel H. Rubenstein, MD, of the Lieutenant Charles S. Kettles Veterans Affairs Medical Center, Ann Arbor, Mich. Their findings were published in Gastroenterology. In addition, there have been no randomized trials to date that have evaluated endoscopic screening outcomes among different populations. Population screening recommendations in the current guidelines have been informed mostly by observational data and expert opinion.

Existing cost-effectiveness analyses of EAC screening have mostly focused on screening older men with gastroesophageal reflux disease (GERD) at certain ages, and many of these analyses have limited data regarding diverse patient populations.

In their study, Dr. Rubenstein and colleagues performed a comparative cost-effectiveness analysis of endoscopic screening for EAC that was restricted to individuals with GERD symptoms in the general population. The analysis was stratified by race and sex. The primary objective of the analysis was to identify and establish the optimal age at which to offer endoscopic screening in the specific populations evaluated in the study.

The investigators conducted their comparative cost-effectiveness analyses using three independent simulation models. The independently developed models – which focused on EAC natural history, screening, surveillance, and treatment – are part of the National Cancer Institute’s Cancer Intervention and Surveillance Modeling Network. For each model, there were four cohorts, defined by race as either White or Black and sex, which were independently calibrated to targets to reproduce the EAC incidence in the United States. The three models were based on somewhat different structures and assumptions; for example, two of the models assumed stable prevalence of GERD symptoms of approximately 20% across ages, while the third assumed a near-linear increase across adulthood. All three assumed EAC develops only in individuals with Barrett’s esophagus.

In each base case, the researchers simulated cohorts of people in the United States who were born in 1950, and then stratified these individuals by race and sex and followed each individual from 40 years of age until 100 years of age. The researchers considered 42 strategies, such as no screening, a single endoscopic screening at six specified ages (between 40 and 65 years of age), and a single screening in individuals with GERD symptoms at the six specified ages.

Primary results were averaged across all three models. The optimal screening strategy, as defined by the investigators, was the most effective strategy with an incremental cost-effectiveness ratio of less than $100,000 per quality-adjusted life-year (QALY) gained.
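
To illustrate the selection rule, the sketch below (in Python, with entirely hypothetical strategy names, costs, and QALY values that are not taken from the study or its models) ranks candidate strategies by effectiveness and accepts a more effective strategy only when its incremental cost-effectiveness ratio against the last accepted strategy stays below the $100,000-per-QALY willingness-to-pay threshold.

# Illustrative ICER-based selection of an "optimal" screening strategy.
# All numbers are hypothetical; ICER = incremental cost / incremental QALYs.
strategies = [
    # (name, discounted cost per person in dollars, QALYs per person)
    ("no screening", 0, 20.000),
    ("screen GERD once at 55", 150, 20.002),
    ("screen GERD at 45 and 60", 400, 20.004),
    ("screen everyone once at 45", 1200, 20.005),
]
WTP = 100_000  # willingness-to-pay threshold, dollars per QALY gained

def optimal_strategy(candidates, wtp):
    # Walk the strategies in order of increasing effectiveness and accept a
    # more effective strategy only if its ICER versus the last accepted
    # strategy is below the willingness-to-pay threshold.
    ranked = sorted(candidates, key=lambda s: s[2])
    best_name, best_cost, best_qaly = ranked[0]
    for name, cost, qaly in ranked[1:]:
        extra_qaly = qaly - best_qaly
        if extra_qaly > 0 and (cost - best_cost) / extra_qaly < wtp:
            best_name, best_cost, best_qaly = name, cost, qaly
    return best_name

print(optimal_strategy(strategies, WTP))  # -> "screen GERD once at 55" with these made-up numbers

In the actual analysis, the costs and QALYs for each strategy came from the three calibrated simulation models rather than from a simple table; the sketch is meant only to show the shape of the decision rule.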

The most effective – yet most costly – screening strategies for White men were those that screened all White men once between 40 and 55 years of age. The optimal strategy, however, screened White men with GERD twice, once at age 45 years and again at 60 years. The researchers determined that screening Black men with GERD once at 55 years of age was optimal.

By contrast, the optimal strategy for women, whether White or Black, was no screening at all. “In particular, among Black women, screening is, at best, very expensive with little benefit, and some strategies cause net harm,” the authors wrote.

The investigators wrote that there is a need for empiric, long-term studies “to confirm whether repeated screening has a substantial yield of incident” Barrett’s esophagus. The researchers also noted that their study was limited by the lack of inclusion of additional risk factors, such as smoking, obesity, and family history, which may have led to different conclusions on specific screening strategies.

“We certainly acknowledge the history of health care inequities, and that race is a social construct that, in the vast majority of medical contexts, has no biological basis. We are circumspect regarding making recommendations based on race or sex if environmental exposures or genetic factors on which to make those recommendations were available,” they wrote.

The study was supported by National Institutes of Health/National Cancer Institute grants. Some authors disclosed relationships with Lucid Diagnostics, Value Analytics Labs, and Cernostics.


FROM GASTROENTEROLOGY


Confronting endoscopic infection control

Cost, environmental impact remain issues
Article Type
Changed
Mon, 06/13/2022 - 14:22

The reprocessing of endoscopes following gastrointestinal endoscopy is highly effective for mitigating the risk of exogenous infections, yet challenges in duodenoscope reprocessing persist. While several enhanced reprocessing measures have been developed to reduce duodenoscope-related infection risks, the effectiveness of these enhanced measures is largely unclear.

Rahul A. Shimpi, MD, and Joshua P. Spaete, MD, from Duke University, Durham, N.C., wrote in a paper in Techniques and Innovations in Gastrointestinal Endoscopy that novel disposable duodenoscope technologies offer promise for reducing infection risk and overcoming current reprocessing challenges. The paper notes that, despite this promise, there is a need to better define the usability, costs, and environmental impact of these disposable technologies.
 

Current challenges in endoscope reprocessing

According to the authors, the reprocessing of gastrointestinal endoscopes involves several sequential steps that require “meticulous” attention to detail “to ensure the adequacy of reprocessing.” Human factors and errors are major contributors to suboptimal reprocessing quality, and these errors are often related to variable adherence to current reprocessing protocols among centers and reprocessing staff members.

Despite these challenges, infectious complications associated with gastrointestinal endoscopy are rare, particularly with end-viewing endoscopes. However, many high-profile infectious outbreaks associated with duodenoscopes have been reported in recent years, heightening awareness of and concern about endoscope reprocessing. Many of these outbreaks, the authors said, have involved multidrug-resistant organisms.

The complex elevator mechanism, which the authors noted “is relatively inaccessible during the precleaning and manual cleaning steps in reprocessing,” represents a paramount challenge in duodenoscope reprocessing and potentially contributes to greater biofilm formation and contamination. Other factors implicated in patient-to-patient transmission of duodenoscope-associated infections include other design issues, human errors in reprocessing, endoscope damage and channel defects, and storage and environmental factors.

“Given the reprocessing challenges posed by duodenoscopes, in 2015 the Food and Drug Administration issued a recommendation that one or more supplemental measures be implemented by facilities as a means to decrease the infectious risk posed by duodenoscopes,” the authors noted, including ethylene oxide (EtO) sterilization, liquid chemical sterilization, and repeat high-level disinfection (HLD). They added, however, that a recent U.S. multisociety reprocessing guideline “does not recommend repeat high-level disinfection over single high-level disinfection, and recommends use of EtO sterilization only for duodenoscopes in infectious outbreak settings.”
 

New sterilization technologies

Liquid chemical sterilization (LCS) may be a promising alternative to EtO sterilization because it features a shorter disinfection cycle time and less endoscope wear and damage. However, clinical data on the effectiveness of LCS in endoscope reprocessing remain very limited.

The high costs and toxicities associated with EtO sterilization may be overcome by plasma-activated gas, another novel low-temperature sterilization technology. This newer technique also features a shorter reprocessing time, making it an attractive option for duodenoscope reprocessing. The authors noted that, although it showed promise in a proof-of-concept study, “plasma-activated gas has not been assessed in working endoscopes or compared directly to existing HLD and EtO sterilization technologies.”

Quality indicators in reprocessing

Several quality indicators have recently been developed to assess the quality of endoscope reprocessing. These indicators, the authors noted, may theoretically allow “for point-of-care assessment of reprocessing quality.” To date, however, the data supporting these indicators are limited.

Adenosine triphosphate (ATP) testing has been the most widely studied indicator because it can be used to detect the presence of biofilms during endoscope reprocessing by comparison against previously established ATP benchmark levels, the authors wrote. Studies that have assessed the efficacy of ATP testing, however, are limited by their use of heterogeneous assays, analytical techniques, and cutoffs for identifying contamination.

Hemoglobin, protein, and carbohydrate are other point-of-care indicators that have previously demonstrated potential capability of assessing the achievement of adequate manual endoscope cleaning before high-level disinfection or sterilization.
 

Novel disposable duodenoscope technologies

Because studies have consistently shown residual duodenoscope contamination after both standard and enhanced reprocessing, attention has increasingly turned to novel disposable duodenoscope technologies. In 2019, the FDA recommended a move toward duodenoscopes with disposable components because such designs could make reprocessing easier, more effective, or altogether unnecessary. According to the authors, six duodenoscopes with disposable components are currently cleared by the FDA for use: three that use a disposable endcap, one that uses a disposable elevator and endcap, and two that are fully disposable. The authors stated that, while “improved access to the elevator facilitated by a disposable endcap may allow for improved cleaning” and reduce contamination and biofilm formation, there are no data to confirm these proposed advantages.

There are several unanswered questions regarding new disposable duodenoscope technologies, including questions related to the usability, costs, and environmental impact of these technologies. The authors summarized several studies discussing these issues; however, a clear definition or consensus regarding how to approach these challenges has yet to be established. In addition to these unanswered questions, the authors also noted that identifying the acceptable rate of infectious risk associated with disposable duodenoscopes is another “important task” that needs to be accomplished in the near future.
 

Environmental impact

The authors stated that the U.S. health care system is directly responsible for up to 10% of total U.S. greenhouse gas emissions. In addition, the heavy use of chemicals and water in endoscope reprocessing represents a “substantial” environmental concern. One estimate suggested that a unit performing a mean of 40 total endoscopies per day generates around 15.78 tons of CO2 per year.
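
As a rough check on that figure, the short calculation below converts the annual estimate into a per-procedure footprint; the assumption of roughly 250 operating days per year is ours for illustration and is not stated in the source.

# Back-of-the-envelope per-procedure CO2 estimate; assumptions are illustrative.
annual_co2_kg = 15.78 * 1000       # cited estimate: ~15.78 tons of CO2 per year
procedures_per_day = 40            # cited mean daily endoscopy volume
operating_days_per_year = 250      # assumed for illustration; not specified in the source

per_procedure_kg = annual_co2_kg / (procedures_per_day * operating_days_per_year)
print(f"~{per_procedure_kg:.1f} kg of CO2 per endoscopy under these assumptions")  # ~1.6 kg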

Given the uncertain environmental impact of disposable endoscopes, the authors suggested that there is a clear need for interventions that reduce their potential negative impact. Proposed strategies include reducing the number of endoscopies performed, increasing recycling and the use of recyclable materials, and using renewable energy sources in endoscopy units.

“The massive environmental impact of gastrointestinal endoscopy as a whole has become increasingly recognized,” the authors wrote, “and further study and interventions directed at improving the environmental footprint of endoscopy will be of foremost importance.”

The authors disclosed no conflicts of interest.

The future remains to be seen

Solutions surrounding proper endoscope reprocessing and infection prevention have become a major focus of investigation and innovation in endoscope design, particularly related to duodenoscopes. As multiple infectious outbreaks associated with duodenoscopes have been reported, the complex mechanism of the duodenoscope elevator has emerged as the target for modification because it is somewhat inaccessible and difficult to adequately clean.

As a result of the challenges associated with duodenoscope reprocessing, the FDA recommended additional measures be implemented by facilities to help mitigate the infectious risk of duodenoscopes. Current research is being conducted on novel sterilization techniques as well as several point-of-care reprocessing quality indicators. Despite improvements in reprocessing quality, residual contamination of duodenoscopes led the FDA to recommend in 2019 that all units using duodenoscopes transition to duodenoscopes with disposable design elements. In addition, fully disposable duodenoscopes have been developed, with two types currently available in the United States.

One of the major considerations with disposable duodenoscopes is cost. Currently, the savings from eliminating reprocessing equipment, supplies, and personnel do not offset the cost of the disposable duodenoscope. Studies on the environmental impact of disposable duodenoscopes also suggest a major increase in endoscopy-related waste.

In summary, enhanced reprocessing techniques and modified scope design elements may not achieve adequate thresholds for infection prevention. Furthermore, while fully disposable duodenoscopes offer promise, questions remain about overall functionality, cost, and the potentially profound environmental impact. Further research is warranted on feasible solutions for infection prevention, and the issues of cost and environmental impact must be addressed before the widespread adoption of disposable duodenoscopes.

Jennifer Maranki, MD, MSc, is professor of medicine and director of endoscopy at Penn State Hershey (Pennsylvania) Medical Center. She reports being a consultant for Boston Scientific.

FROM TECHNIQUES AND INNOVATIONS IN GASTROINTESTINAL ENDOSCOPY


Do myenteric neurons replicate in small intestine?

Contrary to controversy
Article Type
Changed
Mon, 06/13/2022 - 14:19

A new study contradicts controversial findings from a 2017 study suggesting that around two-thirds of myenteric neurons replicate within 1 week under normal conditions, a claim that, if true, would have implications for research into several GI diseases and pathologies.

Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with limited regenerative potential observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, “challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions.” These findings were considered controversial and carried a “possibly far-reaching impact on future research,” Dr. Virtanen and colleagues explained.

According to the researchers, the difference between the controversial study findings and other research results may be partially explained by differences in methodology such as DNA labeling times, antigen retrieval methods, and analyzed portions of the small intestine. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables or attempt at independently replicating the findings had been undertaken.

For example, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water with the same concentration and labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study had used. However, they also examined additional areas of the small intestine, employed paraffin embedding, performed parallel analysis using “click chemistry”-based detection of 5-ethynyl-2’-deoxyuridine (EdU), and more.

The gut’s epithelial cells turn over within 1 week “and serve as an internal positive control for DNA replication,” the researchers noted. In the current study, microscopic analysis of immunohistochemically labeled small intestine, in both cryosections and paraffin-embedded sections, and assessment of 300 ganglia revealed no IdU-positive enteric nerve cells. In contrast, the researchers wrote, the epithelium demonstrated label retention.

In the discussion section of their paper, Dr. Virtanen and colleagues wrote that, while “proliferating epithelial cells were readily detectable” in the study, they were unable to detect enteric neuronal proliferation. Although they could not pinpoint the reasons for the observations of Kulkarni and colleagues, Dr. Virtanen and colleagues suspected that unnoticed variables in the 2017 study affected its findings.

“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”

The authors disclose no conflicts.

The enteric nervous system (ENS) is composed of neurons and glia along the GI tract that are responsible for coordinating its motility, absorption, secretion, and other essential functions. While new neurons are formed during gut development, enteric neurogenesis in adult animals has been a subject of controversy but is of fundamental importance to understanding ENS biology and pathophysiology.

The debate was sparked by a study from Kulkarni and colleagues in 2017 that showed a surprising rate of neuronal turnover, with 88% of all myenteric neurons in the ileum replaced every 2 weeks. Given the complexity of enteric neuronal network formation, the concept of continual neuronal death and rebirth came as a surprise to the field. That finding is in sharp contrast to multiple studies that show essentially no enteric neurogenesis in healthy adult intestine, and to recent transcriptomic studies of the ENS that, while supporting a high turnover of intestinal epithelial cells, have found no significant cycling population of enteric neurons.

To settle the debate, Virtanen et al. replicated the Kulkarni study using the same methods, with the addition of EdU-based click chemistry, and found no replicating neurons. The bulk of evidence thus supports the concept that enteric neurons in the adult gut are a stable population that undergoes minimal turnover. Enteric neuronal progenitors, however, are present in the adult gut and can undergo neurogenesis in response to injury. Further research is needed to identify the signals that activate that neurogenic response and to understand how it can be leveraged to treat neurointestinal diseases.

Allan M. Goldstein, MD, is chief of pediatric surgery at Massachusetts General Hospital, professor of surgery at Harvard Medical School, principal investigator in the Pediatric Surgery Research Laboratories, and codirector of the Massachusetts General Center for Neurointestinal Health, all in Boston. He has no relevant conflicts.

FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
