Diabetes Drug Improved Symptoms in Small Study of Women With Central Centrifugal Cicatricial Alopecia
TOPLINE:
Adding oral extended-release metformin to standard therapy improved symptoms, and in some cases promoted hair regrowth, in Black women with treatment-refractory central centrifugal cicatricial alopecia (CCCA) in a retrospective case series.
METHODOLOGY:
- Researchers conducted a case series involving 12 Black women in their 30s, 40s, and 50s, with biopsy-confirmed, treatment-refractory CCCA, a chronic inflammatory hair disorder characterized by permanent hair loss, from the Johns Hopkins University alopecia clinic.
- Participants received CCCA treatment for at least 6 months and had stagnant or worsening symptoms before oral extended-release metformin (500 mg daily) was added to treatment. (Treatments included topical clobetasol, compounded minoxidil, and platelet-rich plasma injections.)
- Scalp biopsies were collected from four patients before and after metformin treatment to evaluate gene expression changes.
- Changes in clinical symptoms, including pruritus, inflammation, pain, scalp resistance, and hair regrowth, were assessed following the initiation of metformin treatment.
TAKEAWAY:
- Metformin led to significant clinical improvement in eight patients, which included reductions in scalp pain, scalp resistance, pruritus, and inflammation. However, two patients experienced worsening symptoms.
- Six patients showed clinical evidence of hair regrowth after at least 6 months of metformin treatment, with one experiencing hair loss again 3 months after discontinuing treatment.
- Transcriptomic analysis revealed 34 up-regulated genes, including up-regulation of 23 hair keratin–associated proteins, and pathways related to keratinization, epidermis development, and the hair cycle. In addition, eight genes were down-regulated, with affected pathways including extracellular matrix organization, collagen fibril organization, and collagen metabolism (a paired pre/post expression test of this general kind is sketched after this list).
- Gene set variation analysis showed reduced expression of T helper 17 cell and epithelial-mesenchymal transition pathways and elevated adenosine monophosphate–activated protein kinase (AMPK) signaling and keratin-associated proteins after treatment with metformin.
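To make the pre/post transcriptomic comparison above concrete, here is a minimal, purely illustrative sketch of a paired differential-expression test in Python. The four paired biopsies mirror the study design, but the gene and expression values are invented, and the authors' actual bioinformatic pipeline is not described here.

```python
# Illustrative sketch only: a minimal paired pre/post differential-expression
# test of the kind used to flag up- and down-regulated genes. Values are
# hypothetical; the study's actual pipeline is not reproduced here.
import numpy as np
from scipy import stats

# Hypothetical log2 expression for one gene (e.g., a keratin-associated
# protein) in 4 patients, biopsied before and after metformin.
pre = np.array([5.1, 4.8, 5.4, 5.0])
post = np.array([6.9, 6.2, 7.1, 6.5])

t_stat, p_value = stats.ttest_rel(post, pre)   # paired t-test across patients
log2_fold_change = np.mean(post - pre)         # mean within-patient shift

print(f"log2FC = {log2_fold_change:.2f}, P = {p_value:.4f}")
# A real analysis would repeat this per gene and correct for multiple testing
# (e.g., Benjamini-Hochberg) before calling genes up- or down-regulated.
```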
IN PRACTICE:
“Metformin’s ability to concomitantly target fibrosis and inflammation provides a plausible mechanism for its therapeutic effects in CCCA and other fibrosing alopecia disorders,” the authors concluded. But, they added, “larger prospective, placebo-controlled randomized clinical trials are needed to rigorously evaluate metformin’s efficacy and optimal dosing for treatment of cicatricial alopecias.”
SOURCE:
The study was led by Aaron Bao, Department of Dermatology, Johns Hopkins University School of Medicine, Baltimore, Maryland, and was published online on September 4 in JAMA Dermatology.
LIMITATIONS:
A small sample size, retrospective design, lack of a placebo control group, and the single-center setting limited the generalizability of the study findings. Additionally, the absence of a validated activity or severity scale for CCCA and the single posttreatment sampling limit the assessment and comparison of clinical symptoms and transcriptomic changes.
DISCLOSURES:
The study was supported by the American Academy of Dermatology. One author reported several ties with pharmaceutical companies, a pending patent, and authorship for the UpToDate section on CCCA.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Is Minimal Access Nipple-Sparing Mastectomy a Safer Option?
TOPLINE:
Given that both procedures appear safe overall, the choice may be guided by patients’ preference.
METHODOLOGY:
- Compared with a conventional mastectomy, a nipple-sparing mastectomy offers superior esthetic outcomes in patients with breast cancer. However, even the typical nipple-sparing approach often results in visible scarring and a high risk for nipple or areola necrosis. A minimal access approach, using endoscopic or robotic techniques, could minimize scarring and improve outcomes, but the complication risks are not well understood.
- Researchers performed a retrospective study that included 1583 patients with breast cancer who underwent conventional nipple-sparing mastectomy (n = 1356) or minimal access nipple-sparing mastectomy (n = 227) between 2018 and 2020 across 21 institutions in the Republic of Korea.
- Postoperative complications, categorized as short term (< 30 days) and long term (< 90 days), were compared between the two groups.
- The minimal access group had a higher percentage of premenopausal patients (73.57% vs 66.67%) and women with firm breasts (66.08% vs 31.27%).
TAKEAWAY:
- In total, 72 individuals (5.31%) in the conventional nipple-sparing mastectomy group and 7 (3.08%) in the minimal access nipple-sparing mastectomy group developed postoperative complications of grade IIIb or higher.
- Short-term complication rates (34.29% for conventional vs 32.16% for minimal access; P = .53) and long-term rates (38.72% vs 32.16%, respectively; P = .06) did not differ significantly between the two groups (see the sketch after this list for how such a comparison can be computed).
- The conventional group experienced significantly fewer wound infections — both in the short term (1.62% vs 7.49%) and long term (4.28% vs 7.93%) — but a significantly higher rate of seroma (14.23% vs 9.25%), likely because of the variations in surgical instruments used during the procedures.
- Necrosis of the nipple or areola occurred more often in the minimal access group in the short term (8.81% vs 3.91%) but occurred more frequently in the conventional group in the long term (6.71% vs 2.20%).
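As a rough illustration of the group comparison referenced above, the sketch below recomputes the short-term complication contrast with a two-proportion z-test. The event counts are back-calculated from the reported percentages and group sizes (an assumption; the paper reports only the rates and P values).

```python
# A minimal check, assuming the underlying counts can be back-calculated from
# the reported rates: short-term complications, conventional vs minimal access.
from statsmodels.stats.proportion import proportions_ztest

events = [465, 73]     # ~34.29% of 1356 and ~32.16% of 227 (back-calculated)
totals = [1356, 227]   # group sizes as reported

z_stat, p_value = proportions_ztest(events, totals)
print(f"z = {z_stat:.2f}, P = {p_value:.2f}")  # P ≈ .53, matching the report
```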
IN PRACTICE:
“The similar complication rates suggest that both C-NSM [conventional nipple-sparing mastectomy] and M-NSM [minimal access nipple-sparing mastectomy] may be equally safe options,” the authors wrote. “Therefore, the choice of surgical approach should be tailored to patient preferences and individual needs.”
SOURCE:
The study, led by Joo Heung Kim, MD, Yongin Severance Hospital, Yonsei University College of Medicine, Yongin, South Korea, was published online on August 14, 2024, in JAMA Surgery.
LIMITATIONS:
The retrospective design comes with inherent biases. The nonrandom assignment of the participants to the two groups and the relatively small number of cases of minimal access nipple-sparing mastectomy may have affected the findings. The involvement of different surgeons and use of early robotic surgery techniques may have introduced bias as well.
DISCLOSURES:
This study was supported by Yonsei University College of Medicine and the National Research Foundation of Korea. Two authors reported receiving grants and consulting fees outside of this work.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Should Genetic Testing Be Routine for Breast Cancer?
TOPLINE:
Universal germline genetic testing identified pathogenic variants in 7.3% of women with newly diagnosed invasive breast cancer, and many carriers would not have qualified for testing under traditional risk-based criteria.
METHODOLOGY:
- Traditional risk-based criteria, including family history and ancestry, are used to guide genetic testing decisions in women with breast cancer. However, these criteria may overlook patients with actionable genetic variants, particularly those outside the typical risk profile.
- To assess the efficacy of universal genetic testing, researchers conducted a cross-sectional study that included 729 women (median age at diagnosis, 53 years; 65.4% White women) newly diagnosed with invasive breast cancer between September 2019 and April 2022 at three Canadian institutions.
- All patients received genetic counseling followed by testing for the presence of germline pathogenic variants in 17 breast cancer susceptibility genes. The primary gene panel included screening for BRCA1, BRCA2, and PALB2, and the optional secondary panel included 14 additional breast cancer susceptibility genes.
- Of the participants, 659 (90.4%) were tested for both primary and secondary gene panels, whereas 70 (9.6%) underwent testing for only the primary panel. The majority of the cohort (66.8%) were diagnosed with estrogen receptor–positive breast cancer, while 15.4% had triple-negative breast cancer.
TAKEAWAY:
- The prevalence of germline pathogenic variants was 7.3% (53 patients): 5.3% for the primary gene panel and 2.1% for the secondary panel (a confidence interval for this estimate is sketched after this list).
- Younger age (< 40 years; odds ratio [OR], 6.83), family history of ovarian cancer (OR, 9.75), high-grade disease (OR, 1.68), and triple-negative breast cancer (OR, 3.19) were independently associated with the presence of pathogenic genetic variants in BRCA1, BRCA2, or PALB2.
- Overall, 34.3% of patients with germline pathogenic variants in BRCA1, BRCA2, or PALB2, and 85.7% of carriers of secondary panel variants would not have qualified for traditional genetic testing according to the current risk factors.
- A total of 13 patients with BRCA1, BRCA2, or PALB2 variants had confirmed pathogenic mutations and were eligible for poly(adenosine diphosphate–ribose) polymerase (PARP) inhibitors.
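For context on the headline prevalence estimate above, the short sketch below attaches a 95% Wilson confidence interval to the 53-of-729 figure. The interval is not reported in the study; this is only an illustration of the uncertainty around a proportion of that size.

```python
# Illustrative only: the study reports 53 carriers among 729 tested patients
# (7.3%) but no confidence interval; this shows how one could be attached.
from statsmodels.stats.proportion import proportion_confint

carriers, tested = 53, 729
low, high = proportion_confint(carriers, tested, alpha=0.05, method="wilson")
print(f"prevalence = {carriers / tested:.1%} (95% CI, {low:.1%}-{high:.1%})")
```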
IN PRACTICE:
These findings have “informed our clinical practice, and we now offer mainstream, oncology-led genetic testing to all women diagnosed with incident invasive breast cancer younger than 50 years of age, those with triple-negative breast cancer and/or bilateral breast cancer, those potentially eligible for PARP inhibitors,” as well as to men with breast cancer, the authors wrote.
SOURCE:
The study was led by Zoulikha Rezoug, MSc, Lady Davis Institute of the Jewish General Hospital, McGill University in Montreal, Québec, Canada. It was published online on September 3, 2024, in JAMA Network Open.
LIMITATIONS:
The COVID-19 pandemic resulted in a 6-month recruitment pause. Adjustments in recruitment criteria and a focus on younger patients and those with triple-negative breast cancer could have led to an overestimation of the prevalence of pathogenic variants among women aged ≥ 70 years.
DISCLOSURES:
The study was supported by grants from the Jewish General Hospital Foundation and the Québec Breast Cancer Foundation, as well as an award from the Fonds de Recherche du Québec - Santé. Two authors reported receiving grants or personal fees from various sources.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Mortality Risk From Early-Onset CRC Higher in Rural, Poor Areas
TOPLINE:
Patients with early-onset colorectal cancer (CRC) living in rural and impoverished areas face a significantly higher risk of dying from CRC.
METHODOLOGY:
- Previous research has shown that patients living in impoverished and rural areas have an increased risk of dying from CRC, but it is unclear if this trend applies to patients with early-onset CRC.
- Researchers analyzed 58,200 patients with early-onset CRC from the Surveillance, Epidemiology, and End Results Program between 2006 and 2015.
- Of these patients, 1346 (2.3%) lived in rural areas with persistent poverty. Persistent poverty was defined as having 20% or more of the population living below the poverty level for about 30 years, and rural locations were identified using specific US Department of Agriculture codes (a hypothetical classification sketch follows this list).
- The primary outcome was cancer-specific survival.
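The sketch below illustrates, in hypothetical form, the area classification described in the bullets above: persistent poverty as roughly 30 years of poverty rates at or above 20%, and rurality from a US Department of Agriculture code. The code threshold and the function itself are illustrative assumptions, not the study's exact operational definitions.

```python
# Hypothetical sketch of the area classification described above. Thresholds
# and the USDA code cutoff are illustrative assumptions, not the study's rules.
def classify_area(poverty_rates_by_decade: list[float], usda_code: int) -> str:
    # Persistent poverty: >= 20% below the poverty level at every measurement
    persistent_poverty = all(rate >= 20.0 for rate in poverty_rates_by_decade)
    rural = usda_code >= 4  # e.g., nonmetro rural-urban codes (assumption)
    if persistent_poverty and rural:
        return "rural, persistent poverty"
    if persistent_poverty:
        return "persistent poverty"
    if rural:
        return "rural"
    return "neither"

# Example: a county at >= 20% poverty across three decennial measurements
print(classify_area([23.1, 21.5, 20.4], usda_code=7))  # rural, persistent poverty
```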
TAKEAWAY:
- Five-year cancer-specific survival was highest for patients who lived in neither impoverished nor rural areas (72%) and lowest for those who lived in impoverished areas, irrespective of rural status (67%).
- Patients who lived in rural areas had a significantly higher risk of dying from CRC than those living in nonrural areas, with younger individuals facing the highest risk. More specifically, patients aged between 20 and 29 years had a 35% higher risk of dying from CRC, those aged between 30 and 39 years had a 26% higher risk, and those aged between 40 and 49 years had a 12% higher risk.
- Patients who lived in both impoverished and rural areas had a 29% higher risk of dying from CRC than those in nonrural areas, with the greatest excess risk, 51%, among those aged 30-39 years.
IN PRACTICE:
“Our results can be used to inform health system policies for ongoing investments in cancer diagnosis and treatment resources in rural or impoverished areas for younger CRC patients and their communities,” the authors wrote.
SOURCE:
The study, led by Meng-Han Tsai, PhD, Georgia Prevention Institute, Augusta University, Augusta, Georgia, was published online in JAMA Network Open.
LIMITATIONS:
Confounders, such as lifestyle factors, comorbidities, and structural barriers, could affect the findings.
DISCLOSURES:
This study was partially supported by a grant from the Georgia Cancer Center Paceline funding mechanism at Augusta University. The authors did not declare any conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Tool Can Help Predict Futile Surgery in Pancreatic Cancer
TOPLINE:
A preoperative model combining American Society of Anesthesiologists class, cancer antigen 19.9 level, and radiologic tumor size stratified patients with resectable pancreatic ductal adenocarcinoma by their risk for futile upfront pancreatectomy.
METHODOLOGY:
- Immediate resection is associated with a high incidence of postoperative complications and disease recurrence within a year of surgery in patients with pancreatic ductal adenocarcinoma, so identifying patients who are unlikely to benefit from upfront pancreatectomy is important.
- To identify preoperative risk factors for futile pancreatectomy, researchers evaluated 1426 patients (median age, 69 years; 53.2% men) with anatomically resectable pancreatic ductal adenocarcinoma who underwent pancreatic resection between January 2010 and December 2021.
- The patients were divided into derivation (n = 885) and validation (n = 541) cohorts.
- The primary outcome was the rate of futile upfront pancreatectomy, defined as death or disease recurrence within 6 months of surgery. Patients were divided into three risk categories — low, intermediate, and high risk — each with escalating likelihoods of futile resection, worse pathological features, and worse outcomes.
- A secondary aim was to develop criteria for surgical candidacy using a futility likelihood threshold of < 20%. This threshold corresponds to the lower bound of the 95% confidence interval (CI) for postneoadjuvant resection rates (resection rate, 0.90; 95% CI, 0.80-1.01) from recent meta-analyses.
TAKEAWAY:
- The futility rate for pancreatectomy was 18.9% overall: 19.2% in the derivation cohort and 18.6% in the validation cohort. Three independent risk factors for futile resection were identified: American Society of Anesthesiologists (ASA) class (95% CI for coefficients, 0.68-0.87), preoperative cancer antigen 19.9 serum levels (95% CI for coefficients, 0.05-0.75), and radiologic tumor size (95% CI for coefficients, 0.28-0.46).
- Using these independent risk factors, the predictive model demonstrated adequate calibration and discrimination in both the derivation and validation cohorts (a schematic of how such a score could be applied follows this list).
- The researchers then identified three risk groups. In the derivation cohort, the rate of futile pancreatectomy was 9.2% in the low-risk group, 18.0% in the intermediate-risk group, and 28.7% in the high-risk group (P < .001 for trend). In the validation cohort, the futility rate was 10.9% in the low-risk group, 20.2% in the intermediate-risk group, and 29.2% in the high-risk group (P < .001 for trend).
- Researchers identified four conditions associated with a futility likelihood below 20%, in which larger tumor size was paired with lower cancer antigen 19.9 levels (cancer antigen 19.9 adjusted to size). Patients who met these criteria had significantly longer disease-free survival (median, 18.4 months vs 11.2 months) and overall survival (median, 38.5 months vs 22.1 months).
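To show the shape of the calculation such a preoperative score implies, here is a minimal logistic sketch using the three reported predictors. The intercept is a placeholder and the coefficients are simply midpoints of the reported confidence intervals, so this is an illustration of the approach, not the published calculator.

```python
# A minimal sketch of a three-variable preoperative risk score like the one
# described. Intercept and coefficients are placeholders (the paper reports
# only coefficient CIs), so this shows the shape of the calculation only.
import math

def futility_probability(asa_class: int, ca19_9_log: float, tumor_size_cm: float) -> float:
    intercept = -5.5                        # placeholder (assumption)
    b_asa, b_ca, b_size = 0.78, 0.40, 0.37  # midpoints of the reported CIs
    logit = intercept + b_asa * asa_class + b_ca * ca19_9_log + b_size * tumor_size_cm
    return 1.0 / (1.0 + math.exp(-logit))   # logistic link

# Example: ASA class 3, log-transformed CA 19-9 (assumed scaling), 3.5-cm tumor
print(f"{futility_probability(3, 2.1, 3.5):.0%}")  # ~12% with these placeholders
```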
IN PRACTICE:
“Although the study provides an easy-to-use calculator for clinical decision-making, there are some methodological limitations,” the authors of the accompanying commentary wrote, citing among them a failure to clearly describe how ASA class, cancer antigen 19.9 level, and tumor size were chosen for the model. “While we do not think the model is yet ready for standard clinical use, it may prove to be a viable tool if tested in future randomized trials comparing the neoadjuvant approach to upfront surgery in resectable pancreatic cancer,” the editorialists added.
SOURCE:
This study, led by Stefano Crippa, MD, PhD, Division of Pancreatic Surgery, Pancreas Translational and Clinical Research Center, San Raffaele Scientific Institute, Vita-Salute San Raffaele University, Milan, Italy, and the accompanying commentary were published online in JAMA Surgery.
LIMITATIONS:
In addition to the limitations noted by the editorialists, the study’s retrospective design could introduce bias. Because preoperative imaging was not re-reviewed, the assigned resectability classes could vary. Institutional differences existed in the selection process for upfront pancreatectomy. The model cannot be applied to cancer antigen 19.9 nonsecretors and was not externally validated.
DISCLOSURES:
The Italian Association for Cancer Research Special Program in Metastatic Disease and Italian Ministry of Health/Italian Foundation for the Research of Pancreatic Diseases supported the study in the form of a grant. Two authors reported receiving personal fees outside the submitted work. No other disclosures were reported.
A version of this article first appeared on Medscape.com.
TOPLINE:
METHODOLOGY:
- Immediate resection is associated with a high incidence of postoperative complications and disease recurrence within a year of surgery in patients with pancreatic ductal adenocarcinoma. Predicting which patients likely won’t benefit from upfront pancreatectomy is important.
- To identify preoperative risk factors for futile pancreatectomy, researchers evaluated 1426 patients (median age, 69 years; 53.2% men) with anatomically resectable pancreatic ductal adenocarcinoma who underwent pancreatic resection between January 2010 and December 2021.
- The patients were divided into derivation (n = 885) and validation (n = 541) cohorts.
- The primary outcome was the rate of futile upfront pancreatectomy, defined as death or disease recurrence within 6 months of surgery. Patients were divided into three risk categories — low, intermediate, and high risk — each with escalating likelihoods of futile resection, worse pathological features, and worse outcomes.
- The secondary endpoint was to develop criteria for surgical candidacy, setting a futility likelihood threshold of < 20%. This threshold corresponds to the lower bound of the 95% confidence interval (CI) for postneoadjuvant resection rates (resection rate, 0.90; 95% CI, 0.80-1.01) from recent meta-analyses.
TAKEAWAY:
- The futility rate for pancreatectomy was 18.9% — 19.2% in the development cohort and 18.6% in the validation cohort. Three independent risk factors for futile resection included American Society of Anesthesiologists (ASA) class (95% CI for coefficients, 0.68-0.87), preoperative cancer antigen 19.9 serum levels (95% CI for coefficients, 0.05-0.75), and radiologic tumor size (95% CI for coefficients, 0.28-0.46).
- Using these independent risk factors, the predictive model demonstrated adequate calibration and discrimination in both the derivation and validation cohorts.
- The researchers then identified three risk groups. In the derivation cohort, the rate of futile pancreatectomy was 9.2% in the low-risk group, 18.0% in the intermediate-risk group, and 28.7% in the high-risk group (P < .001 for trend). In the validation cohort, the futility rate was 10.9% in the low-risk group, 20.2% in the intermediate-risk group, and 29.2% in the high-risk group (P < .001 for trend).
- Researchers identified four conditions associated with a futility likelihood below 20%, where larger tumor size is paired with lower cancer antigen 19.9 levels (defined as cancer antigen 19.9–adjusted-to-size). Patients who met these criteria experienced significantly longer disease-free survival (median 18.4 months vs 11.2 months) and overall survival (38.5 months vs 22.1 months).
IN PRACTICE:
“Although the study provides an easy-to-use calculator for clinical decision-making, there are some methodological limitations,” according to the authors of accompanying commentary. These limitations include failing to accurately describe how ASA class, cancer antigen 19.9 level, and tumor size were chosen for the model. “While we do not think the model is yet ready for standard clinical use, it may prove to be a viable tool if tested in future randomized trials comparing the neoadjuvant approach to upfront surgery in resectable pancreatic cancer,” the editorialists added.
SOURCE:
This study, led by Stefano Crippa, MD, PhD, Division of Pancreatic Surgery, Pancreas Translational and Clinical Research Center, San Raffaele Scientific Institute, Vita-Salute San Raffaele University, Milan, Italy, and the accompanying commentary were published online in JAMA Surgery.
LIMITATIONS:
In addition to the limitations noted by the editorialists, others include the study’s retrospective design, which could introduce bias. Because preoperative imaging was not revised, the assigned resectability classes could show variability. Institutional differences existed in the selection process for upfront pancreatectomy. The model cannot be applied to cancer antigen 19.9 nonsecretors and was not externally validated.
DISCLOSURES:
The Italian Association for Cancer Research Special Program in Metastatic Disease and Italian Ministry of Health/Italian Foundation for the Research of Pancreatic Diseases supported the study in the form of a grant. Two authors reported receiving personal fees outside the submitted work. No other disclosures were reported.
A version of this article first appeared on Medscape.com.
Targeted Pancreatic Cancer Screening May Save Lives
TOPLINE:
Surveillance of individuals at high risk for pancreatic ductal adenocarcinoma was associated with earlier-stage diagnosis and substantially longer survival than in matched, unscreened patients from the general population.
METHODOLOGY:
- Pancreatic ductal adenocarcinoma has poor 5-year survival rates and is often detected at later stages. General population screening is not recommended, but high-risk individuals, such as those with familial or genetic predispositions, may benefit from regular surveillance.
- The Cancer of the Pancreas Screening (CAPS) program, initiated in 1998, has been evaluating the effectiveness of such targeted surveillance for over two decades, but whether targeted surveillance confers a survival benefit remains unclear.
- The current study evaluated 26 high-risk individuals in the CAPS program who were ultimately diagnosed with pancreatic ductal adenocarcinoma. These high-risk individuals had undergone surveillance with annual endoscopic ultrasonography or MRI prior to diagnosis.
- The researchers compared these 26 individuals with 1504 matched control patients with pancreatic ductal adenocarcinoma from the Surveillance, Epidemiology, and End Results (SEER) database. The high-risk individuals and SEER control patients were matched on age, sex, and year of diagnosis.
- The primary outcomes were tumor stage at diagnosis, overall survival, and pancreatic cancer-specific mortality.
TAKEAWAY:
- High-risk individuals were significantly more likely than the general US population to be diagnosed with early-stage pancreatic cancer: 38.5% vs 10.3% were diagnosed at stage I, and 30.8% vs 25.1% at stage II (P < .001).
- The median tumor size at diagnosis was smaller in high-risk individuals than in control patients (2.5 vs 3.6 cm; P < .001), and significantly fewer high-risk individuals had distant metastases at diagnosis (M1 stage) vs control patients (26.9% vs 53.8%; P = .01).
- Overall, high-risk individuals lived about 4.5 years longer than control patients (median overall survival, 61.7 vs 8 months; hazard ratio [HR], 4.19; P < .001); the arithmetic behind that gap is checked after this list. In the 20 high-risk patients with screen-detected cancer, median overall survival was even longer, at 144 months.
- The probability of surviving 5 years was significantly higher in the high-risk group (50%) than in the control group (9%). At 5 years, high-risk individuals also had a significantly lower probability of dying from pancreatic cancer (HR, 3.58; P < .001).
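The “about 4.5 years” figure follows directly from the medians reported above; a minimal arithmetic check in Python, using only numbers from this summary:

```python
# Check the reported survival gap using only figures from the summary above.
median_high_risk_months = 61.7   # surveilled high-risk individuals
median_control_months = 8.0      # matched SEER control patients

gap_years = (median_high_risk_months - median_control_months) / 12
print(f"survival gap: {gap_years:.1f} years")           # -> ~4.5 years
print(f"screen-detected median: {144 / 12:.0f} years")  # 144 months = 12 years
```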
IN PRACTICE:
Surveillance of high-risk individuals led to detection of “smaller pancreatic cancers, a greater number of patients with stage I disease,” as well as “a much higher likelihood of long-term survival than unscreened patients in the general population,” the authors concluded. “These findings suggest that selective surveillance of individuals at high risk for pancreatic cancer may improve clinical outcomes.”
SOURCE:
This study, with first author Amanda L. Blackford, from Johns Hopkins Medical Institutions, Baltimore, was published online July 3 in JAMA Oncology.
LIMITATIONS:
The findings might have limited generalizability due to enrollment at academic referral centers, limited racial and ethnic diversity, and a small number of high-risk individuals progressing to pancreatic cancer. The study also lacked a control group of unscreened high-risk individuals.
DISCLOSURES:
This study was supported by the National Institutes of Health, Susan Wojcicki and Dennis Troper, and others. Several authors reported financial ties outside this work.
A version of this article appeared on Medscape.com.
Parental e-Cigarette Use Linked to Atopic Dermatitis Risk in Children
TOPLINE:
METHODOLOGY:
- Atopic dermatitis (AD) is one of the most common inflammatory skin conditions in children and is linked to environmental risk factors, such as exposure to secondhand smoke and prenatal exposure to tobacco.
- To assess the effect of parental e-cigarette use on children, researchers conducted a cross-sectional analysis of data from the 2014-2018 National Health Interview Survey, a nationally representative sample of the US population.
- The analysis included 48,637,111 individuals (mean age, 8.4 years), with 6,354,515 (13%) indicating a history of AD (mean age, 8 years).
TAKEAWAY:
- The prevalence of parental e-cigarette use was 18.0% among individuals with AD compared with 14.4% among those without AD; a crude odds ratio computed from these two prevalences appears after this list.
- This corresponded to a 24% higher risk for AD associated with parental e-cigarette use (adjusted odds ratio, 1.24; 95% CI, 1.08-1.42).
- The association between parental e-cigarette use and AD in children held regardless of the parent’s sex.
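As a plausibility check, a crude (unadjusted) odds ratio can be computed from the two reported prevalences. It should sit near, but not exactly at, the study’s estimate, because the 1.24 figure is adjusted for covariates this summary does not list. A minimal sketch in Python:

```python
# Crude odds ratio implied by the two reported prevalences of parental
# e-cigarette use (18.0% among children with AD, 14.4% among those without).
p_exposed_ad, p_exposed_no_ad = 0.180, 0.144

def odds(p: float) -> float:
    return p / (1 - p)

crude_or = odds(p_exposed_ad) / odds(p_exposed_no_ad)
print(f"crude OR ~ {crude_or:.2f}")  # ~1.30, consistent with the adjusted OR of 1.24
```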
IN PRACTICE:
“Our results suggest that parental e-cigarette use was associated with pediatric AD,” the authors concluded. They noted that the authors of a previous study that associated e-cigarette use with AD in adults postulated that the cause was “the inflammatory state created by” e-cigarettes.
SOURCE:
This study, led by Gun Min Youn, Department of Dermatology, Stanford University School of Medicine, Stanford, California, was published online in JAMA Dermatology.
LIMITATIONS:
The cross-sectional survey design limited the ability to draw causal inferences. Defining e-cigarette use as a single past instance could affect the strength of the findings. Only past-year e-cigarette use was considered. Furthermore, data on pediatric cigarette or e-cigarette use, a potential confounder, were unavailable.
DISCLOSURES:
The study did not disclose funding information. One author reported receiving consultation fees outside the submitted work. No other disclosures were reported.
A version of this article appeared on Medscape.com.
Further Support for CRC Screening to Start at Age 45: Meta-Analysis
TOPLINE:
For individuals aged 45-49 years at average risk for colorectal cancer (CRC), the adenoma detection rate (ADR) in screening colonoscopies is 28%, which is comparable with rates seen in those aged 50-54 years.
METHODOLOGY:
- The rising incidence of CRC in younger populations has prompted most guidelines to recommend that screening start at age 45. The impact of lowering the screening age on adenoma and sessile serrated lesion detection rates remains unclear, however.
- Researchers conducted a systematic review and meta-analysis of 16 studies; all studies were retrospective except one.
- Patients aged 45-49 years undergoing colonoscopy for any indication were included, with a separate analysis of patients in that age group at average CRC risk undergoing screening colonoscopies.
- The primary outcomes were the overall detection rates of adenomas and sessile serrated lesions in colonoscopies performed for any indication.
TAKEAWAY:
- Across 15 studies, 41,709 adenomas were detected in 150,436 colonoscopies performed for any indication, resulting in a pooled overall ADR of 23.1%.
- Across six studies, 1162 sessile serrated lesions were reported in 11,457 colonoscopies performed for any indication, with a pooled detection rate of 6.3%.
- Across seven studies, the pooled ADR in screening colonoscopies performed on individuals at average CRC risk was 28.2%, which is comparable with that of individuals aged 50-54 years undergoing screening colonoscopy. There were insufficient data to calculate the sessile serrated lesion detection rate in average-risk patients. A sketch of how such study-level rates are pooled follows this list.
- The ADR was higher in studies from the United States and Canada (26.1%) than in studies from Asia (16.9%).
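Note that the pooled ADR of 23.1% is not the crude ratio of the totals (41,709 adenomas / 150,436 colonoscopies ≈ 27.7%): the ADR counts colonoscopies with at least one adenoma, and meta-analytic pooling weights each study rather than merging counts. The Python sketch below shows one standard way to pool study-level proportions, a DerSimonian-Laird random-effects model on untransformed proportions (a simplification; published meta-analyses often use logit or Freeman-Tukey transforms). The per-study counts are invented placeholders, since the summary does not report them.

```python
# Generic random-effects pooling of study-level proportions (DerSimonian-Laird).
# Study data are invented placeholders, not the 15 studies in the meta-analysis.
studies = [(120, 510), (300, 1400), (2500, 9800), (90, 450)]  # (events, n) per study

def pooled_proportion(studies):
    # Per-study proportion and within-study variance p(1-p)/n
    est = [(e / n, (e / n) * (1 - e / n) / n) for e, n in studies]
    w = [1 / v for _, v in est]                       # fixed-effect weights
    fixed = sum(wi * p for wi, (p, _) in zip(w, est)) / sum(w)
    # Between-study variance tau^2 via the DerSimonian-Laird estimator
    q = sum(wi * (p - fixed) ** 2 for wi, (p, _) in zip(w, est))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    w_re = [1 / (v + tau2) for _, v in est]           # random-effects weights
    return sum(wi * p for wi, (p, _) in zip(w_re, est)) / sum(w_re)

print(f"pooled proportion: {pooled_proportion(studies):.1%}")
```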
IN PRACTICE:
“The comparable detection rates of precancerous lesions in this age group to those 50 to 54 years old support starting CRC screening at 45 years of age,” the authors wrote.
SOURCE:
This study, led by Mohamed Abdallah, MD, Division of Gastroenterology and Hepatology, University of Minnesota Medical Center, Minneapolis, was published online in The American Journal of Gastroenterology.
LIMITATIONS:
The inclusion of retrospective studies has an inherent bias. The heterogeneity between studies may limit the generalizability of the findings. Some studies that reported detection rates included individuals at both average and high risk for CRC, so they could not be used to evaluate ADRs in individuals with an average risk for CRC. Data duplication could not be ruled out.
DISCLOSURES:
The study did not receive any funding. The authors declared no conflicts of interest.
A version of this article appeared on Medscape.com.
Childhood Atopic Dermatitis Linked to IBD Risk
TOPLINE:
Childhood atopic dermatitis (AD) is associated with an increased risk for inflammatory bowel disease (IBD), but atopic manifestations are generally not associated with IBD.
METHODOLOGY:
- Studies examining the link between atopy and IBD have yielded inconsistent results. Many of these studies included adults, introducing recall bias, or relied on physician diagnoses that might have overlooked mild cases.
- Researchers analyzed prospectively collected data on 83,311 children from two cohort studies, ABIS (1997-1999) and MoBa (1999-2008), who were followed up from birth until 2021 or a diagnosis of IBD.
- Parents prospectively completed questionnaires about any atopy their children had developed by the age of 3 years. Atopy included conditions such as AD, asthma, food allergy, and allergic rhinitis.
TAKEAWAY:
- A total of 301 participants were diagnosed with IBD over 1,174,756 person-years of follow-up. By the age of 3 years, 31,671 children (38%) were reported to have any atopic manifestation.
- Children with AD at the age of 3 years demonstrated a significantly higher risk for IBD (pooled adjusted hazard ratio [aHR], 1.46), Crohn’s disease (pooled aHR, 1.53), and ulcerative colitis (pooled aHR, 1.78); a sketch of how such cohort-specific estimates are pooled follows this list.
- Any atopic manifestation by the age of 3 years was not associated with a subsequent risk for IBD, Crohn’s disease, or ulcerative colitis, and neither were early-life food allergy, asthma, or allergic rhinitis individually.
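Because the estimates above are pooled across the two cohorts (ABIS and MoBa), a natural question is how two cohort-specific hazard ratios become one “pooled aHR.” The usual approach is inverse-variance weighting on the log scale, sketched below in Python; the per-cohort HRs and CIs are invented placeholders, since the summary reports only the pooled values.

```python
import math

# Inverse-variance pooling of cohort-specific hazard ratios on the log scale.
# The per-cohort (HR, 95% CI low, 95% CI high) values are invented examples.
cohorts = [(1.60, 1.05, 2.44), (1.40, 1.02, 1.92)]

def pooled_hr(cohorts):
    log_hrs, weights = [], []
    for hr, lo, hi in cohorts:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
        log_hrs.append(math.log(hr))
        weights.append(1 / se ** 2)                      # weight = 1 / variance
    pooled_log = sum(w * l for w, l in zip(weights, log_hrs)) / sum(weights)
    return math.exp(pooled_log)

print(f"pooled HR ~ {pooled_hr(cohorts):.2f}")
```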
IN PRACTICE:
According to the authors, these findings suggested potential shared underlying causes between AD and IBD, which could help identify individuals at risk, and “a deeper understanding could significantly benefit the development of novel treatment approaches capable of effectively addressing both conditions, consequently enhancing patient outcomes.”
SOURCE:
This study, led by Tereza Lerchova, MD, PhD, Department of Pediatrics, Institute of Clinical Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden, was published online in The Journal of Pediatrics.
LIMITATIONS:
The findings of this study were mostly related to childhood-onset IBD instead of IBD in adult life. Lower participation in the MoBa study could limit generalizability to a broader population. In addition, there might have been lower participation from families without atopic manifestations.
DISCLOSURES:
The study was funded by the Swedish Society for Medical Research, Swedish Research Council, and ALF and supported by grants from the Swedish Child Diabetes Foundation, Swedish Council for Working Life and Social Research, Swedish Research Council, Medical Research Council of Southeast Sweden, JDRF Wallenberg Foundation, Linkoping University, and Joanna Cocozza Foundation. The authors declared no conflicts of interest.
A version of this article appeared on Medscape.com.