Surgery for early breast cancer can worsen frailty in older women

A substantial number of older women may experience worsening frailty after undergoing surgery and radiation therapy for early-stage breast cancer, according to a new study.

About 1 in 5 experienced clinically significant deterioration in frailty status after treatment, the study team found. Women at highest risk for worsening frailty after treatment were those with “robust” frailty status at diagnosis and those who underwent the more invasive mastectomy rather than lumpectomy.

The fact that “robust” older women were more likely to become frail after locoregional therapy suggests that “thoughtful treatment decisions should be undertaken in all older women, not simply those who have frailty at diagnosis,” said the investigators, led by Christina Minami, MD, of Dana-Farber/Brigham and Women’s Cancer Center in Boston.

The study findings emphasize that there is no one-size-fits-all approach to breast cancer treatment in the elderly, said Sarah P. Cate, MD, director, Breast Surgery Quality Program, Mount Sinai Health System, New York, who wasn’t involved in the research. “Some patients will sail through a surgery, and others are severely affected by it.”

The study was published online in JAMA Surgery.

Given the growing number of older adults with breast cancer, understanding how age-related syndromes, such as frailty, may alter cancer outcomes and how cancer treatments change aging trajectories remains important.

To investigate, Dr. Minami and colleagues used Surveillance, Epidemiology, and End Results Medicare data to identify 31,084 women (mean age, 73) who had been diagnosed with ductal carcinoma in situ (DCIS) or stage I HR-positive, ERBB2-negative breast cancer and who underwent surgery (23% mastectomy, 77% lumpectomy) and radiation therapy.

Worsening frailty status was defined as a change of 0.03 or greater on a validated frailty index, in the direction of greater frailty, from the time of diagnosis to 1 year. This magnitude of change has been linked to greater mortality risk and greater cost of care.
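
Frailty indexes of this kind typically work on the deficit-accumulation principle: the score is the share of assessed health deficits that are present, so higher values mean greater frailty. Below is a purely illustrative sketch of that arithmetic; the study used a specific validated index with many more items, and the deficits and values here are hypothetical.

    # Illustrative only: a generic deficit-accumulation frailty index
    # (share of assessed deficits present), not the study's validated index.
    def frailty_index(deficits):
        """Return the proportion of deficits present (0 = robust, 1 = maximally frail)."""
        return sum(deficits.values()) / len(deficits)

    # Hypothetical patient assessed at diagnosis and again at 1 year
    at_diagnosis = {"impaired_mobility": False, "weight_loss": False,
                    "polypharmacy": True, "falls": False, "fatigue": False}
    at_one_year = {"impaired_mobility": True, "weight_loss": False,
                   "polypharmacy": True, "falls": False, "fatigue": True}

    change = frailty_index(at_one_year) - frailty_index(at_diagnosis)
    print(f"Change in index: {change:+.2f}; meets the 0.03 threshold: {change >= 0.03}")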

Frailty status at diagnosis was “robust” in 56% of the women, prefrail in 40%, mildly frail in 4%, and moderately to severely frail in 0.3%.

According to the researchers, 21.4% of the women experienced clinically significant declines in their frailty status after treatment. These declines occurred in 25% of women who underwent mastectomy and 20% of those who underwent lumpectomy.

After adjusting for covariates, the likelihood of worsening frailty was higher among women who were robust at baseline than among those who were moderately to severely frail at baseline (odds ratio, 6.12), and among those who underwent mastectomy vs. lumpectomy (OR, 1.31).

Older age and race were also linked to worsening frailty status following treatment. Compared with younger women (aged 65-74 years), older women were more likely to experience worsening frailty (OR, 1.21 for women aged 75-79; OR, 1.53 for those aged 80-84; OR, 1.94 for those aged 85 and older). In addition, Black women were more likely than non-Hispanic White women to experience worsening frailty after treatment (OR, 1.12).
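
Adjusted odds ratios like these come from a multivariable logistic regression in which each coefficient, exponentiated, gives the odds ratio for that factor with the others held constant. The sketch below is purely illustrative of that workflow, using synthetic data, hypothetical variable names, and the statsmodels package; it is not the authors' actual model or data.

    # Illustrative multivariable logistic regression; synthetic data only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({
        "mastectomy": rng.integers(0, 2, n),   # 1 = mastectomy, 0 = lumpectomy (hypothetical coding)
        "age_80_84": rng.integers(0, 2, n),    # hypothetical age-group indicator
    })
    # Simulate an outcome so the model has something to estimate
    log_odds = -1.5 + 0.27 * df["mastectomy"] + 0.43 * df["age_80_84"]
    df["worsened_frailty"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

    X = sm.add_constant(df[["mastectomy", "age_80_84"]])
    fit = sm.Logit(df["worsened_frailty"], X).fit(disp=False)
    print(np.exp(fit.params))      # adjusted odds ratios
    print(np.exp(fit.conf_int()))  # 95% confidence intervals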

“Previous studies have documented lasting declines in functional status after surgery in older patients with breast cancer, but breast cancer treatment has not been implicated in worsening frailty to date,” Dr. Minami and colleagues explain. But “given the substantial proportion of women experiencing worsening frailty and the significant difference by breast surgery type, frailty status as a cancer therapy outcome should be further explored.” In addition, “tailoring locoregional therapy intensity in this population is important,” they write.

Dr. Cate explained that randomized clinical trials such as COMET and LORIS, which explore the monitoring of patients with DCIS in lieu of active treatment, “will likely make a big impact on this population, as we currently do not have randomized controlled data for observation of breast cancer.”

Dr. Cate added that assessing a patient’s ECOG [Eastern Cooperative Oncology Group] performance status is vital “to determine who can really tolerate a breast cancer surgery” and that opting for antiestrogens, such as aromatase inhibitors, which can keep cancer at bay for years, “may be preferable for many older patients.”

The study was funded by Brigham and Women’s Hospital’s Department of Surgery’s Beal Fellowship. Dr. Minami and Dr. Cate have disclosed no relevant financial relationships.
 

 

 

A version of this article first appeared on Medscape.com.

Restless legs a new modifiable risk factor for dementia?

Restless legs syndrome (RLS) is associated with an elevated risk of dementia among older adults, suggesting the disorder may be a risk factor for dementia or a very early noncognitive sign of dementia, researchers say.

In a large population-based cohort study, adults with RLS were significantly more likely to develop dementia over more than a decade than were their peers without RLS.

If confirmed in future studies, “regular check-ups for cognitive decline in older patients with RLS may facilitate earlier detection and intervention for those with dementia risk,” wrote investigators led by Eosu Kim, MD, PhD, with Yonsei University, Seoul, Republic of Korea.

The study was published online in Alzheimer’s Research and Therapy.
 

Sleep disorders and dementia

RLS is associated with poor sleep, depression/anxiety, poor diet, microvasculopathy, and hypoxia – all of which are known risk factors for dementia. However, the relationship between RLS and incident dementia has been unclear.

The researchers compared risk for all-cause dementia, Alzheimer’s disease (AD), and vascular dementia (VaD) among 2,501 adults with newly diagnosed RLS and 9,977 matched control persons participating in the Korean National Health Insurance Service–Elderly Cohort, a nationwide population-based cohort of adults aged 60 and older.

The mean age of the cohort was 73 years; most of the participants were women (65%). Among all 12,478 participants, 874 (7%) developed all-cause dementia during follow-up – 475 (54%) developed AD, and 194 (22%) developed VaD.

The incidence of all-cause dementia was significantly higher among the RLS group than among the control group (10.4% vs. 6.2%). Incidence rates of AD and VaD (5.6% and 2.6%, respectively) were also higher in the RLS group than in the control group (3.4% and 1.3%, respectively).

In Cox regression analysis, RLS was significantly associated with an increased risk of all-cause dementia (adjusted hazard ratio [aHR], 1.46; 95% confidence interval [CI], 1.24-1.72), AD (aHR, 1.38; 95% CI, 1.11-1.72), and VaD (aHR, 1.81; 95% CI, 1.30-2.53).
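
Adjusted hazard ratios of this kind come from a Cox proportional-hazards model, in which each exponentiated coefficient estimates the hazard ratio for that covariate. Below is a minimal, purely illustrative sketch using the lifelines package on synthetic data; the column names and numbers are hypothetical, not the Korean cohort's variables.

    # Illustrative Cox proportional-hazards fit; synthetic data only.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 4000
    rls = rng.integers(0, 2, n)   # 1 = RLS diagnosis (hypothetical coding)
    # Simulate event times so the true hazard ratio is about 1.46 (exp(0.38))
    time_to_event = rng.exponential(1 / (0.01 * np.exp(0.38 * rls)))
    df = pd.DataFrame({
        "years_followed": np.minimum(time_to_event, 12.0),   # censor at 12 years
        "dementia": (time_to_event <= 12.0).astype(int),
        "rls": rls,
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_followed", event_col="dementia")
    cph.print_summary()   # the exp(coef) column is the hazard ratio with its 95% CI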

The researchers noted that RLS may precede deterioration of cognitive function, leading to dementia, and they suggest that RLS could be regarded as a “newly identified” risk factor or prodromal sign of dementia.
 

Modifiable risk factor

Reached for comment, Thanh Dang-Vu, MD, PhD, professor and research chair in sleep, neuroimaging, and cognitive health at Concordia University in Montreal, said there is now “increasing literature that shows sleep as a modifiable risk factor for cognitive decline.

“Previous evidence indicates that both sleep apnea and insomnia disorder increase the risk for cognitive decline and possibly dementia. Here the study adds to this body of evidence linking sleep disorders to dementia, suggesting that RLS should also be considered as a sleep-related risk factor,” Dr. Dang-Vu told this news organization.

“More evidence is needed, though, as here, all diagnoses were based on national health insurance diagnostic codes, and it is likely there were missed diagnoses for RLS but also for other sleep disorders, as there was no systematic screening for them,” Dr. Dang-Vu cautioned.

Support for the study was provided by the Ministry of Health and Welfare, the Korean government, and Yonsei University. Dr. Kim and Dr. Dang-Vu reported no relevant financial relationships.
 

A version of this article first appeared on Medscape.com.

Another FDA class I recall of Cardiosave Hybrid/Rescue IABPs

Datascope/Getinge is recalling certain Cardiosave Hybrid and Cardiosave Rescue Intra-Aortic Balloon Pumps (IABPs) because the coiled cable connecting the display and base on some units may fail, causing an unexpected shutdown without warnings or alarms to alert the user.

The U.S. Food and Drug Administration has identified this as a class I recall, the most serious type of recall, because of the risk for serious injury or death.

The FDA warns that an unexpected pump shutdown, and any resulting interruption in therapy, can lead to hemodynamic instability, organ damage, and/or death, especially in critically ill patients, who are the most likely to receive therapy with these devices.

The devices are indicated for acute coronary syndrome, cardiac and noncardiac surgery, and complications of heart failure in adults.

From June 2019 to August 2022, Datascope/Getinge reported 44 complaints about damaged coiled cords resulting in unexpected shutdowns. There have been no reports of injuries or deaths related to this issue, according to the recall notice posted on the FDA’s website. 

The recall includes a total of 2,300 Cardiosave Hybrid or Rescue IABP units distributed before July 24, 2017, and/or units with coiled cord part number 0012-00-1801. Product model numbers for the recalled Cardiosave Hybrid and Cardiosave Rescue are available online.

The Cardiosave IABPs have previously been flagged by the FDA for subpar battery performance and fluid leaks.

To address the cable issue, Datascope/Getinge sent an urgent medical device correction letter to customers recommending that the coiled cable cord of the Cardiosave IABP be inspected for visible damage prior to use.

If an unexpected shutdown occurs, an attempt should be made to restart the Cardiosave IABP until an alternative pump is available. If the restart attempt is unsuccessful, an alternative IABP should be used. Any device that remains inoperable after a shutdown should be removed from patient care. 

Customers should inspect their inventory to identify any Cardiosave Hybrid and/or Rescue IABPs that have the recalled coiled cord.

The company also asks customers to complete and sign the Medical Device Correction-Response form included with the letter and return it to Datascope/Getinge by emailing a scanned copy to cardiosave-sdhl23.act@getinge.com or by faxing the form to 1-877-660-5841.

Customers with questions about this recall should contact their Datascope/Getinge representative or call Datascope/Getinge technical support at 1-888-943-8872, Monday through Friday, between 8:00 AM and 6:00 PM ET.

The company has developed a hardware correction to address this issue and says a service representative will contact customers to schedule installation of the correction when the correction kit is available.

Any adverse events or suspected adverse events related to the recalled Cardiosave Hybrid/Rescue IABPs should be reported to the FDA through MedWatch, its adverse event reporting program.

A version of this article first appeared on Medscape.com.

Ulcerative colitis cases projected to top 2 million in eight countries by 2031

Diagnosed prevalent cases of ulcerative colitis (UC) in the United States and seven other countries are projected to increase from 1.9 million in 2021 to 2.1 million in 2031, at an annual growth rate of 0.63%, according to a new report by GlobalData.
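
For readers who want to sanity-check the arithmetic, the compound annual growth rate implied by two endpoint figures is (end/start)^(1/years) − 1. The sketch below is a back-of-the-envelope check only; the headline counts of 1.9 million and 2.1 million are rounded, so it will not reproduce GlobalData's 0.63% exactly.

    # Back-of-the-envelope compound-growth check; rounded inputs only.
    start, end, years = 1.9e6, 2.1e6, 10

    implied_rate = (end / start) ** (1 / years) - 1
    print(f"Implied annual growth rate from rounded endpoints: {implied_rate:.2%}")

    projected = start * (1 + 0.0063) ** years   # forward projection at the reported 0.63%
    print(f"1.9 million grown at 0.63%/yr for 10 years: {projected:,.0f}")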

The data and analytics company’s report offers projections for diagnosed incident and prevalent cases of UC in the United States, United Kingdom, Germany, Spain, Japan, Italy, France, and Canada.

In 2031, the United States will have the highest number of diagnosed prevalent cases of UC, with 655,317 cases, whereas Canada will have the fewest, with 91,186 cases, the company projects.

“UC can occur at any age, although most people are diagnosed in their mid-thirties. Men and women are equally likely to be affected, but older men are more likely to be diagnosed than older women,” Bharti Prabhakar, MPH, associate project manager at GlobalData, said in a statement.

In all eight countries, adults aged 30-69 years accounted for more than 65% of the diagnosed prevalent cases of UC, whereas those younger than 20 years made up less than 3% of the cases, GlobalData noted.
 

Incidence also rising

Diagnosed incident cases of UC in the eight countries are expected to increase from 160,122 cases in 2021 to 168,467 cases in 2031, at an annual growth rate of 0.52%, the company said.

In 2031, the United States will have the highest number of diagnosed incident cases of UC, with 104,795 cases, and France will have the fewest, with 2,972 cases, the company predicted.

GlobalData epidemiologists attribute the predicted increases in UC prevalence and incidence to changes in population dynamics in each country.

The forecast is supported by historical data obtained from peer-reviewed articles and population-based studies, the firm noted.

The methodology was kept consistent across the eight countries to allow for a meaningful comparison of the forecast incident and prevalent cases of UC across these markets, GlobalData added.

“UC can affect people of any racial or ethnic group,” Ms. Prabhakar stated. “Genes, abnormal immune reactions, the microbiome, diet, stress, and the environment have all been suggested as triggers, but there is no definite evidence that any one of these factors is the cause of UC.”

Western countries have reported high incidence and prevalence of UC, Ms. Prabhakar noted. “Therefore, environmental factors may either suppress or reinforce inherent predispositions for UC and might also be crucial in triggering disease onset.”

A version of this article originally appeared on Medscape.com.

AI helps predict ulcerative colitis remission/activity, flare-ups

Researchers have developed and validated an artificial intelligence (AI) tool that can accurately distinguish ulcerative colitis (UC) remission from activity (inflammation) in biopsies and help predict flare-ups.

The AI tool predicted UC disease activity with 89% accuracy and inflammation at the biopsy site with 80% accuracy. Its ability to stratify risk of UC flare was on par with human pathologists.

“This tool in the near future will speed up, simplify, and standardize histological assessment of ulcerative colitis and provide the clinician with accurate prognostic information in real time,” co–lead author Marietta Iacucci, MD, PhD, from the University of Birmingham (England), and University College Cork (Ireland), said in an interview.

“The tool needs to be refined and further validated before it is ready for daily clinical practice. That work is ongoing now,” Dr. Iacucci said.

The researchers describe their advanced AI-based computer-aided detection tool in a study published online in Gastroenterology.
 

‘Strong’ performance

They used 535 digitized biopsies from 273 patients with UC (mean age, 48 years; 41% women) to develop and test the tool. They used a subset of 118 to train it to distinguish remission from activity, 42 to calibrate it, and 375 to test it. An additional 154 biopsies from 58 patients with UC were used to externally validate the tool.

The model also was tested to predict the corresponding endoscopic assessment and occurrence of flares at 12 months.

UC disease activity was defined by three different histologic indices: the Robarts Histopathology Index (RHI), the Nancy Histological Index (NHI), and the newly developed PICaSSO Histologic Remission Index (PHRI).

The AI tool had “strong diagnostic performance to detect disease activity” (PHRI > 0) with an overall area under the receiver operating characteristic curve of 0.87 and sensitivity and specificity of 89% and 85%, respectively.
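
Sensitivity here is the share of truly active biopsies the tool flags as active, specificity is the share of quiescent biopsies it correctly calls remission, and the AUC summarizes discrimination across all possible thresholds. The sketch below shows how such metrics are computed with scikit-learn; the labels, scores, and threshold are synthetic and purely illustrative, not the study's model or data.

    # Illustrative diagnostic-performance metrics; synthetic data only.
    import numpy as np
    from sklearn.metrics import roc_auc_score, confusion_matrix

    rng = np.random.default_rng(2)
    y_true = rng.integers(0, 2, 500)   # 1 = histologic activity (PHRI > 0), 0 = remission
    scores = np.clip(0.7 * y_true + rng.normal(0.15, 0.2, 500), 0, 1)  # predicted probability of activity
    y_pred = (scores >= 0.5).astype(int)   # illustrative operating threshold

    auc = roc_auc_score(y_true, scores)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"AUC {auc:.2f}, sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")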

The researchers note that, while the AI tool was trained for the PHRI, its sensitivity for RHI and NHI histologic remission/activity was also high (94% and 89%, respectively).

Despite the different mix of severity grades, the AI model “maintained a good diagnostic performance, proving its applicability outside the original development setting,” they reported.

The AI tool could also predict the presence of endoscopic inflammation in the biopsy area with about 80% accuracy.

“Though imperfect, this result is consistent with human-assessed correlation between endoscopy and histology,” the researchers noted.

The model predicted the corresponding endoscopic remission/activity with 79% and 82% accuracy when endoscopic activity was defined by the UCEIS (Ulcerative Colitis Endoscopic Index of Severity) and the PICaSSO score, respectively.

The hazard ratios for disease flare-up were similar whether the PHRI was scored by the AI system or by pathologists (4.64 and 3.56, respectively), “demonstrating the ability of the computer to stratify the risk of flare comparably well to pathologists,” they added.

Both histology and outcome prediction were confirmed in the external validation cohort.

The AI system delivered results in an average of 9.8 seconds per slide.
 

Potential ‘game changer’

UC is a “complex condition to predict, and developing machine learning–derived systems to make this diagnostic job quicker and more accurate could be a game changer,” Dr. Iacucci said in a news release.

With refinement, the AI tool will have an impact on both clinical trials and daily practice, the researchers wrote. In clinical practice, histological reporting remains “largely descriptive and nonstandard, thus would greatly benefit from a quick and objective assessment. Similarly, clinical trials in UC could efficiently overcome costly central readings.”

Assessing and measuring improvement in endoscopy and histology are difficult parts of treating UC, said David Hudesman, MD, codirector of the Inflammatory Bowel Disease Center at New York University Langone Health.

“We do not know how much improvement is associated with improved long-term outcomes,” Dr. Hudesman said in an interview. “For example, does a patient need complete healing or is 50% better enough?” Dr. Hudesman was not involved with the current research.

“This study showed that AI can predict – with good accuracy – endoscopy and histology scores, as well as 1-year patient outcomes. If this is validated in larger studies, AI can help determine if we should adjust/change therapies or continue, which is very important,” he said.

This research was supported by the National Institute for Health Research Birmingham Biomedical Research Centre. Dr. Iacucci and Dr. Hudesman reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Researchers have developed and validated an artificial intelligence (AI) tool that can accurately distinguish ulcerative colitis (UC) remission from activity (inflammation) in biopsies and help predict flare-ups.

The AI tool predicted UC disease activity with 89% accuracy and inflammation at the biopsy site with 80% accuracy. Its ability to stratify risk of UC flare was on par with human pathologists.

“This tool in the near future will speed up, simplify, and standardize histological assessment of ulcerative colitis and provide the clinician with accurate prognostic information in real time,” co–lead author Marietta Iacucci, MD, PhD, from the University of Birmingham (England), and University College Cork (Ireland), said in an interview.

“The tool needs to be refined and further validated before it is ready for daily clinical practice. That work is ongoing now,” Dr. Iacucci said.

The researchers describe their advanced AI-based computer-aided detection tool in a study published online in Gastroenterology.
 

‘Strong’ performance

They used 535 digitized biopsies from 273 patients with UC (mean age, 48 years; 41% women) to develop and test the tool. They used a subset of 118 to train it to distinguish remission from activity, 42 to calibrate it, and 375 to test it. An additional 154 biopsies from 58 patients with UC were used to externally validate the tool.

The model also was tested to predict the corresponding endoscopic assessment and occurrence of flares at 12 months.

UC disease activity was defined by three different histologic indices: the Robarts Histopathology Index (RHI), the Nancy Histological Index (NHI), and the newly developed PICaSSO Histologic Remission Index (PHRI).

The AI tool had “strong diagnostic performance to detect disease activity” (PHRI > 0) with an overall area under the receiver operating characteristic curve of 0.87 and sensitivity and specificity of 89% and 85%, respectively.

The researchers note that, while the AI tool was trained for the PHRI, its sensitivity for RHI and NHI histologic remission/activity was also high (94% and 89%, respectively).

Despite the different mix of severity grades, the AI model “maintained a good diagnostic performance, proving its applicability outside the original development setting,” they reported.

The AI tool could also predict the presence of endoscopic inflammation in the biopsy area with about 80% accuracy.

“Though imperfect, this result is consistent with human-assessed correlation between endoscopy and histology,” the researchers noted.

The model predicted the corresponding endoscopic remission/activity with 79% and 82% accuracy for UCEIS and PICaSSO, respectively.

The hazard ratios for disease flare-up between the AI system and pathologists assessed by PHRI was similar (4.64 and 3.56, respectively), “demonstrating the ability of the computer to stratify the risk of flare comparably well to pathologists,” they added. 

Both histology and outcome prediction were confirmed in the external validation cohort.

The AI system delivered results in an average of 9.8 seconds per slide.
 

Potential ‘game changer’

UC is a “complex condition to predict, and developing machine learning–derived systems to make this diagnostic job quicker and more accurate could be a game changer,” Dr. Iacucci said in a news release.

With refinement, the AI tool will have an impact on both clinical trials and daily practice, the researchers wrote. In clinical practice, histological reporting remains “largely descriptive and nonstandard, thus would greatly benefit from a quick and objective assessment. Similarly, clinical trials in UC could efficiently overcome costly central readings.”

Assessing and measuring improvement in endoscopy and histology are difficult parts of treating UC, said David Hudesman, MD, codirector of the Inflammatory Bowel Disease Center at New York University Langone Health.

“We do not know how much improvement is associated with improved long-term outcomes,” Dr. Hudesman said in an interview. “For example, does a patient need complete healing or is 50% better enough?” Dr. Hudesman was not involved with the current research.

“This study showed that AI can predict – with good accuracy – endoscopy and histology scores, as well as 1-year patient outcomes. If this is validated in larger studies, AI can help determine if we should adjust/change therapies or continue, which is very important,” he said.

This research was supported by the National Institute for Health Research Birmingham Biomedical Research Centre. Dr. Iacucci and Dr. Hudesman reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Researchers have developed and validated an artificial intelligence (AI) tool that can accurately distinguish ulcerative colitis (UC) remission from activity (inflammation) in biopsies and help predict flare-ups.

The AI tool predicted UC disease activity with 89% accuracy and inflammation at the biopsy site with 80% accuracy. Its ability to stratify risk of UC flare was on par with human pathologists.

“This tool in the near future will speed up, simplify, and standardize histological assessment of ulcerative colitis and provide the clinician with accurate prognostic information in real time,” co–lead author Marietta Iacucci, MD, PhD, from the University of Birmingham (England), and University College Cork (Ireland), said in an interview.

“The tool needs to be refined and further validated before it is ready for daily clinical practice. That work is ongoing now,” Dr. Iacucci said.

The researchers describe their advanced AI-based computer-aided detection tool in a study published online in Gastroenterology.
 

‘Strong’ performance

They used 535 digitized biopsies from 273 patients with UC (mean age, 48 years; 41% women) to develop and test the tool. They used a subset of 118 to train it to distinguish remission from activity, 42 to calibrate it, and 375 to test it. An additional 154 biopsies from 58 patients with UC were used to externally validate the tool.

The model also was tested to predict the corresponding endoscopic assessment and occurrence of flares at 12 months.

UC disease activity was defined by three different histologic indices: the Robarts Histopathology Index (RHI), the Nancy Histological Index (NHI), and the newly developed PICaSSO Histologic Remission Index (PHRI).

The AI tool had “strong diagnostic performance to detect disease activity” (PHRI > 0) with an overall area under the receiver operating characteristic curve of 0.87 and sensitivity and specificity of 89% and 85%, respectively.

The researchers note that, while the AI tool was trained for the PHRI, its sensitivity for RHI and NHI histologic remission/activity was also high (94% and 89%, respectively).

Despite the different mix of severity grades, the AI model “maintained a good diagnostic performance, proving its applicability outside the original development setting,” they reported.

The AI tool could also predict the presence of endoscopic inflammation in the biopsy area with about 80% accuracy.

“Though imperfect, this result is consistent with human-assessed correlation between endoscopy and histology,” the researchers noted.

The model predicted the corresponding endoscopic remission/activity with 79% and 82% accuracy for UCEIS and PICaSSO, respectively.

The hazard ratios for disease flare-up between the AI system and pathologists assessed by PHRI was similar (4.64 and 3.56, respectively), “demonstrating the ability of the computer to stratify the risk of flare comparably well to pathologists,” they added. 

Both histology and outcome prediction were confirmed in the external validation cohort.

The AI system delivered results in an average of 9.8 seconds per slide.
 

Potential ‘game changer’

UC is a “complex condition to predict, and developing machine learning–derived systems to make this diagnostic job quicker and more accurate could be a game changer,” Dr. Iacucci said in a news release.

With refinement, the AI tool will have an impact on both clinical trials and daily practice, the researchers wrote. In clinical practice, histological reporting remains “largely descriptive and nonstandard, thus would greatly benefit from a quick and objective assessment. Similarly, clinical trials in UC could efficiently overcome costly central readings.”

Assessing and measuring improvement in endoscopy and histology are difficult parts of treating UC, said David Hudesman, MD, codirector of the Inflammatory Bowel Disease Center at New York University Langone Health.

“We do not know how much improvement is associated with improved long-term outcomes,” Dr. Hudesman said in an interview. “For example, does a patient need complete healing or is 50% better enough?” Dr. Hudesman was not involved with the current research.

“This study showed that AI can predict – with good accuracy – endoscopy and histology scores, as well as 1-year patient outcomes. If this is validated in larger studies, AI can help determine if we should adjust/change therapies or continue, which is very important,” he said.

This research was supported by the National Institute for Health Research Birmingham Biomedical Research Centre. Dr. Iacucci and Dr. Hudesman reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FDA OKs first drug for Rett syndrome

The Food and Drug Administration has approved trofinetide oral solution (Daybue, Acadia Pharmaceuticals) as the first treatment of Rett syndrome in adults and children aged 2 years and older.

Rett syndrome is a rare, genetic neurodevelopmental disorder that affects about 6,000-9,000 people in the United States, mostly females.

Symptoms typically present between 6 and 18 months of age, with patients experiencing rapid decline and loss of fine motor and communication skills.

Trofinetide is a synthetic analogue of the amino-terminal tripeptide of insulinlike growth factor-1 (IGF-1), which occurs naturally in the brain. The drug is designed to treat the core symptoms of Rett syndrome by potentially reducing neuroinflammation and supporting synaptic function.

The approval of trofinetide was supported by results from the pivotal phase 3 LAVENDER study that tested the efficacy and safety of trofinetide vs. placebo in 187 female patients with Rett syndrome, aged 5-20 years. 

A total of 93 participants were randomly assigned to twice-daily oral trofinetide, and 94 received placebo for 12 weeks.

After 12 weeks, trofinetide showed a statistically significant improvement from baseline, compared with placebo, on both the caregiver-assessed Rett Syndrome Behavior Questionnaire (RSBQ) and 7-point Clinical Global Impression-Improvement (CGI-I) scale. 

The drug also outperformed placebo at 12 weeks in a key secondary endpoint: the composite score on the Communication and Symbolic Behavior Scales Developmental Profile Infant-Toddler Checklist-Social (CSBS-DP-IT Social), a scale on which caregivers assess nonverbal communication.

The most common adverse events with trofinetide treatment were diarrhea and vomiting. Almost all these events were considered mild or moderate.

‘Historic day’

“This is a historic day for the Rett syndrome community and a meaningful moment for the patients and caregivers who have eagerly awaited the arrival of an approved treatment for this condition,” Melissa Kennedy, MHA, chief executive officer of the International Rett Syndrome Foundation, said in a news release issued by Acadia.

“Rett syndrome is a complicated, devastating disease that affects not only the individual patient, but whole families. With today’s FDA decision, those impacted by Rett have a promising new treatment option that has demonstrated benefit across a variety of Rett symptoms, including those that impact the daily lives of those living with Rett and their loved ones,” Ms. Kennedy said.

Trofinetide is expected to be available in the United States by the end of April.

A version of this article first appeared on Medscape.com.


Two diets tied to lower Alzheimer’s pathology at autopsy

A novel study provides strong evidence supporting the adoption of a healthy diet to protect the aging brain. In a cohort of deceased older adults, those who had adhered to the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) and Mediterranean diets for nearly a decade before death had less global Alzheimer’s disease–related pathology, primarily less beta-amyloid, at autopsy.

Those who most closely followed these diets had almost 40% lower odds of having an Alzheimer’s disease diagnosis at death. The findings suggest one mechanism by which healthy diets may protect cognition.

“While our research doesn’t prove that a healthy diet resulted in fewer brain deposits of amyloid plaques ... we know there is a relationship, and following the MIND and Mediterranean diets may be one way that people can improve their brain health and protect cognition as they age,” study investigator Puja Agarwal, PhD, of RUSH University Medical Center in Chicago, said in a statement.

The study was published online in Neurology.
 

Green leafy veggies key

The MIND diet was pioneered by the late Martha Clare Morris, ScD, a Rush nutritional epidemiologist, who died of cancer in 2020 at age 64.

Although similar, the Mediterranean diet recommends vegetables, fruit, and three or more servings of fish per week, whereas the MIND diet prioritizes green leafy vegetables, such as spinach, kale, and collard greens, along with other vegetables. The MIND diet also prioritizes berries over other fruit and recommends one or more servings of fish per week. Both diets recommend small amounts of wine.

The current study focused on 581 older adults who died while participating in the Rush Memory and Aging Project (MAP). All participants agreed to undergo annual clinical evaluations and brain autopsy after death.

Participants completed annual food frequency questionnaires beginning at a mean age of 84. The mean age at death was 91. Mean follow-up was 6.8 years.

Around the time of death, 224 participants (39%) had a diagnosis of clinical dementia, and 383 (66%) had a pathologic diagnosis of Alzheimer’s disease.

The researchers used a series of regression analyses to investigate the MIND and Mediterranean diets and dietary components associated with Alzheimer’s disease pathology. They controlled for age at death, sex, education, APOE ε4 status, and total calories.

Overall, both diets were significantly associated with lower global Alzheimer’s disease pathology (MIND: beta = –0.022, P = .034; and Mediterranean: beta = –0.007, P = .039) – specifically, with less beta-amyloid (MIND: beta = –0.068, P = .050; and Mediterranean: beta = –0.040, P = .004).

The findings persisted when the analysis was further adjusted for physical activity, smoking, and vascular disease burden and when participants with mild cognitive impairment or dementia at the baseline dietary assessment were excluded.

Individuals who most closely followed the Mediterranean diet had average beta-amyloid load similar to being 18 years younger than peers with the lowest adherence. And those who most closely followed the MIND diet had average beta-amyloid amounts similar to being 12 years younger than those with the lowest adherence.

Each 1-point increase in MIND diet score corresponded to the typical plaque deposition of participants 4.25 years younger.

Regarding individual dietary components, those who ate seven or more servings of green leafy vegetables weekly had less global Alzheimer’s disease pathology than peers who ate one or fewer (beta = –0.115, P = .0038). Those who ate seven or more servings per week had plaque amounts in their brains corresponding to being almost 19 years younger in comparison with those who ate the fewest servings per week.

“Our finding that eating more green leafy vegetables is in itself associated with fewer signs of Alzheimer’s disease in the brain is intriguing enough for people to consider adding more of these vegetables to their diet,” Dr. Agarwal said in the news release.

Previous data from the MAP cohort showed that adherence to the MIND diet can improve memory and thinking skills of older adults, even in the presence of Alzheimer’s disease pathology.

Novel study, intriguing results

Heather Snyder, PhD, vice president of medical and scientific relations with the Alzheimer’s Association, noted that a number of studies have linked overall nutrition – especially a balanced diet low in saturated fats and sugar and high in vegetables – with brain health, including cognition, as one ages.

This new study “takes what we know about the link between nutrition and risk for cognitive decline a step further by looking at the specific brain changes that occur in Alzheimer’s disease. The study found an association of certain nutrition behaviors with less of these Alzheimer’s brain changes,” said Dr. Snyder, who was not involved in the study.

“This is intriguing, and more research is needed to test via an intervention if healthy dietary behaviors can modify the presence of Alzheimer’s brain changes and reduce risk of cognitive decline.”

The Alzheimer’s Association is leading a 2-year clinical trial known as US POINTER to study how targeting known dementia risk factors in combination may reduce risk of cognitive decline in older adults. The MIND diet is being used in US POINTER.

“But while we work to find an exact ‘recipe’ for risk reduction, it’s important to eat a heart-healthy diet that incorporates nutrients that our bodies and brains need to be at their best,” Dr. Snyder said.

The study was funded by the National Institutes of Health. Dr. Agarwal and Dr. Snyder have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Breast cancer surgery timing matters, but is faster always better?

Most women with breast cancer undergo primary surgery within 8 weeks of diagnosis, and waiting longer than that may be associated with worse overall survival, according to findings from a case series.

With no national quality metrics delineating optimal breast cancer surgery timing, the researchers recommend performing surgery within 8 weeks of breast cancer diagnosis.

“This time interval does not appear to have a detrimental association with cancer outcomes and allows for multidisciplinary care,” the researchers, led by Alyssa A. Wiener, MD, from University of Wisconsin–Madison, said.

But, in an accompanying editorial, two surgical oncologists questioned whether faster surgery is always better.

“Efficiency might associate with quality, but doesn’t always ensure it,” Rita Mukhtar, MD, and Laura Esserman, MD, with the division of surgical oncology, University of California, San Francisco, said.

The study and editorial were published online in JAMA Surgery.

Optimal timing for surgery?

Some studies have found worse survival outcomes for women who experience delays between breast cancer diagnosis and surgical treatment, but the optimal window for surgery and the point at which surgery becomes less advantageous remain unknown.

Using the National Cancer Database, Dr. Wiener and colleagues identified 373,334 women (median age, 61) who were diagnosed with stage I to stage III ductal or lobular breast cancer from 2010 to 2014 and followed up through 2019.

All women underwent surgery as their first course of treatment. Patients with prior breast cancer, those who had neoadjuvant or experimental therapy or missing receptor information, and those who were diagnosed with breast cancer on the date of their primary surgery were excluded.

Most patients had timely surgery. The median time to surgery was 30 days, and 88% of patients underwent surgery before the 57-day time point.

Only 12% of patients had surgery more than 8 weeks after their diagnosis. Factors associated with longer times to surgery included age younger than 45, having Medicaid or no insurance, and lower household income.

The overall 5-year survival for the cohort was high at 90%. On multivariable analysis, the researchers found no statistically significant association between time to surgery and overall survival when surgery was performed between 0 and 8 weeks.

However, women who had surgery 9 or more weeks after diagnosis had a significantly higher rate of death within 5 years, compared with those who had surgery performed between 0 and 4 weeks (hazard ratio, 1.15; P < .001). Performing surgery during the ninth week (57-63 days) after diagnosis did not appear to be negatively associated with survival.

This study “highlights that time to treatment of breast cancer is important,” said Sarah P. Cate, MD, director, Breast Surgery Quality Program, Mount Sinai Health System, New York, who wasn’t involved in the study. “Surgery is only one-third of the treatment of breast cancer, so these patients who had longer delays to the OR may have also not started their postsurgery treatments in time.”

In addition, the study found that socioeconomic status – Medicaid or uninsured status and lower household incomes – was associated with longer times to surgery.

“Socioeconomic factors like these may be independently associated with worse outcomes and may contribute to some of the disparities in cancer outcomes observed for resource-limited patients due to delayed care,” the authors said.

Identifying 8 weeks as a goal for time to surgery can help uncover delays associated with socioeconomic factors and provide adequate time for decision-making, the researchers noted.

Is faster always better?

Dr. Wiener and colleagues cautioned, however, that their findings should be considered “hypothesis generating,” given that decision-making surrounding breast cancer surgery is complex.

Importantly, the authors noted, tumor characteristics, such as tumor size, nodal status, and receptor subtype, appeared to have a pronounced impact on overall survival, compared with timing of surgery. For instance, compared with tumors 2 cm or smaller, larger tumors – those > 2 cm to ≤ 5 cm and > 5 cm – were associated with worse survival (HR, 1.80 and 2.62, respectively).

“This highlights that tumor biology is the primary driver of patients’ breast cancer outcomes,” the authors noted.

In an accompanying editorial, two surgical oncologists highlighted that faster may not always be better.

For instance, Dr. Mukhtar and Dr. Esserman explained, if a patient with a large node-positive, triple-negative breast cancer receives surgery within a week of diagnosis, “one must question whether this timely care represents quality care, as the opportunity to understand tumor response and affect breast cancer survival has been lost.”

The editorialists noted that time to surgery might also matter very little for indolent, screen-detected cancers, and time to treatment start might matter a lot for fast-growing, interval cancers.

In addition, they questioned whether including the socioeconomic factors highlighted in the overall model would “mitigate the association between time to surgery and survival seen in this study.”

Overall, “operating too soon could indicate lack of quality, while operating too late perhaps reflects lack of access to care,” the editorialists said.

This study was supported by grants from the National Cancer Institute and the National Institutes of Health. Dr. Wiener and Dr. Cate report no relevant financial relationships. Dr. Esserman is a member of the Blue Cross Medical advisory panel, is a board member of the Quantum Leap Healthcare Collaborative, and leads an investigator-initiated vaccine trial for high-risk ductal carcinoma in situ, which is funded by Merck.

A version of this article first appeared on Medscape.com.


Black people are less likely to receive dementia meds

Black people with dementia are less likely than their White peers to receive cognitive enhancers and other medications for dementia in the outpatient setting, preliminary data from a retrospective study show.

“There have been disparities regarding the use of cognition-enhancing medications in the treatment of dementia described in the literature, and disparities in the use of adjunctive treatments for other neuropsychiatric symptoms of dementia described in hospital and nursing home settings,” said study investigator Alice Hawkins, MD, with the department of neurology, Icahn School of Medicine at Mount Sinai, New York. “However, less is known about use of dementia medications that people take at home. Our study found disparities in this area as well,” Dr. Hawkins said.

The findings were released ahead of the study’s scheduled presentation at the annual meeting of the American Academy of Neurology.
 

More research needed

The researchers analyzed data on 3,655 Black and 12,885 White patients with a diagnosis of dementia who were seen at Mount Sinai. They evaluated utilization of five medication classes:

  • cholinesterase inhibitors.
  • N-methyl D-aspartate (NMDA) receptor antagonists.
  • selective serotonin reuptake inhibitors (SSRIs).
  • antipsychotics.
  • benzodiazepines.

They found that Black patients with dementia received cognitive enhancers less often than White patients with dementia (20% vs. 30% for cholinesterase inhibitors; 10% vs. 17% for NMDA antagonists).

Black patients with dementia were also less likely to receive medications for behavioral and psychological symptom management, compared with White peers (24% vs. 40% for SSRIs; 18% vs. 22% for antipsychotics; and 18% vs. 37% for benzodiazepines).

These disparities remained even after controlling for factors such as demographics and insurance coverage.

“Larger systemic forces such as systemic racism, quality of care, and provider bias are harder to pin down, particularly in the medical record, though they all may be playing a role in perpetuating these inequities. More research will be needed to pinpoint all the factors that are contributing to these disparities,” said Dr. Hawkins.

The researchers found that Black patients who were referred to a neurologist received cholinesterase inhibitors and NMDA antagonists at rates comparable with those of White patients. “Therefore, referrals to specialists such as neurologists may decrease the disparities for these prescriptions,” Dr. Hawkins said.
 

Crucial research

Commenting on the findings, Carl V. Hill, PhD, MPH, Alzheimer’s Association chief diversity, equity, and inclusion officer, said the study “adds to previous research that points to inequities in the administering of medications for dementia symptoms, and highlights the inequities we know exist in dementia care.”

“Cognitive enhancers and other behavioral/psychological management drugs, while they don’t stop, slow, or cure dementia, can offer relief for some of the challenging symptoms associated with diseases caused by dementia. If people aren’t being appropriately prescribed medications that may offer symptom relief from this challenging disease, it could lead to poorer health outcomes,” said Dr. Hill.

“These data underscore the importance of health disparities research that is crucial in uncovering inequities in dementia treatment, care, and research for Black individuals, as well as all underrepresented populations.

“We must create a society in which the underserved, disproportionately affected, and underrepresented are safe, cared for, and valued. This can be done through enhancing cultural competence in health care settings, improving representation within the health care system, and engaging and building trust with diverse communities,” Dr. Hill said.

The Alzheimer’s Association has partnered with more than 500 diverse community-based groups on disease education programs to ensure families have information and resources to navigate this devastating disease.

The study was supported by the American Academy of Neurology Resident Research Scholarship. Dr. Hawkins and Dr. Hill reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Meeting/Event
Issue
Neurology Reviews - 31(4)
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

Black people with dementia are less likely than their White peers to receive cognitive enhancers and other medications for dementia in the outpatient setting, preliminary data from a retrospective study show.

“There have been disparities regarding the use of cognition-enhancing medications in the treatment of dementia described in the literature, and disparities in the use of adjunctive treatments for other neuropsychiatric symptoms of dementia described in hospital and nursing home settings,” said study investigator Alice Hawkins, MD, with the department of neurology, Icahn School of Medicine at Mount Sinai, New York. “However, less is known about use of dementia medications that people take at home. Our study found disparities in this area as well,” Dr. Hawkins said.

The findings were released ahead of the study’s scheduled presentation at the annual meeting of the American Academy of Neurology.
 

More research needed

The researchers analyzed data on 3,655 Black and 12,885 White patients with a diagnosis of dementia who were seen at Mount Sinai. They evaluated utilization of five medication classes:

  • cholinesterase inhibitors.
  • N-methyl D-aspartate (NMDA) receptor antagonists.
  • selective serotonin reuptake inhibitors (SSRIs).
  • antipsychotics.
  • benzodiazepines.

They found that Black patients with dementia received cognitive enhancers less often than White patients with dementia (20% vs. 30% for cholinesterase inhibitors; 10% vs. 17% for NMDA antagonists).

Black patients with dementia were also less likely to receive medications for behavioral and psychological symptom management, compared with White peers (24% vs. 40% for SSRIs; 18% vs. 22% for antipsychotics; and 18% vs. 37% for benzodiazepines).

These disparities remained even after controlling for factors such as demographics and insurance coverage.

“Larger systemic forces such as systemic racism, quality of care, and provider bias are harder to pin down, particularly in the medical record, though they all may be playing a role in perpetuating these inequities. More research will be needed to pinpoint all the factors that are contributing to these disparities,” said Dr. Hawkins.

The researchers found Black patients who were referred to a neurologist received cholinesterase inhibitors and NMDA antagonists at rates comparable with White patients. “Therefore, referrals to specialists such as neurologists may decrease the disparities for these prescriptions,” Dr. Hawkins said.
 

Crucial research

Commenting on the findings, Carl V. Hill, PhD, MPH, Alzheimer’s Association chief diversity, equity, and inclusion officer, said the study “adds to previous research that points to inequities in the administering of medications for dementia symptoms, and highlights the inequities we know exist in dementia care.”

“Cognitive enhancers and other behavioral/psychological management drugs, while they don’t stop, slow, or cure dementia, can offer relief for some of the challenging symptoms associated with diseases caused by dementia. If people aren’t being appropriately prescribed medications that may offer symptom relief from this challenging disease, it could lead to poorer health outcomes,” said Dr. Hill.

“These data underscore the importance of health disparities research that is crucial in uncovering inequities in dementia treatment, care, and research for Black individuals, as well as all underrepresented populations.

“We must create a society in which the underserved, disproportionately affected, and underrepresented are safe, cared for, and valued. This can be done through enhancing cultural competence in health care settings, improving representation within the health care system, and engaging and building trust with diverse communities,” Dr. Hill said.

The Alzheimer’s Association has partnered with more than 500 diverse community-based groups on disease education programs to ensure families have information and resources to navigate this devastating disease.

The study was supported by the American Academy of Neurology Resident Research Scholarship. Dr. Hawkins and Dr. Hill reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

 

Black people with dementia are less likely than their White peers to receive cognitive enhancers and other medications for dementia in the outpatient setting, preliminary data from a retrospective study show.

“There have been disparities regarding the use of cognition-enhancing medications in the treatment of dementia described in the literature, and disparities in the use of adjunctive treatments for other neuropsychiatric symptoms of dementia described in hospital and nursing home settings,” said study investigator Alice Hawkins, MD, with the department of neurology, Icahn School of Medicine at Mount Sinai, New York. “However, less is known about use of dementia medications that people take at home. Our study found disparities in this area as well,” Dr. Hawkins said.

The findings were released ahead of the study’s scheduled presentation at the annual meeting of the American Academy of Neurology.
 

More research needed

The researchers analyzed data on 3,655 Black and 12,885 White patients with a diagnosis of dementia who were seen at Mount Sinai. They evaluated utilization of five medication classes:

  • cholinesterase inhibitors.
  • N-methyl D-aspartate (NMDA) receptor antagonists.
  • selective serotonin reuptake inhibitors (SSRIs).
  • antipsychotics.
  • benzodiazepines.

They found that Black patients with dementia received cognitive enhancers less often than White patients with dementia (20% vs. 30% for cholinesterase inhibitors; 10% vs. 17% for NMDA antagonists).

Black patients with dementia were also less likely to receive medications for behavioral and psychological symptom management, compared with White peers (24% vs. 40% for SSRIs; 18% vs. 22% for antipsychotics; and 18% vs. 37% for benzodiazepines).

These disparities remained even after controlling for factors such as demographics and insurance coverage.

“Larger systemic forces such as systemic racism, quality of care, and provider bias are harder to pin down, particularly in the medical record, though they all may be playing a role in perpetuating these inequities. More research will be needed to pinpoint all the factors that are contributing to these disparities,” said Dr. Hawkins.

The researchers also found that Black patients who were referred to a neurologist received cholinesterase inhibitors and NMDA antagonists at rates comparable with those of White patients. “Therefore, referrals to specialists such as neurologists may decrease the disparities for these prescriptions,” Dr. Hawkins said.
 

Crucial research

Commenting on the findings, Carl V. Hill, PhD, MPH, Alzheimer’s Association chief diversity, equity, and inclusion officer, said the study “adds to previous research that points to inequities in the administering of medications for dementia symptoms, and highlights the inequities we know exist in dementia care.”

“Cognitive enhancers and other behavioral/psychological management drugs, while they don’t stop, slow, or cure dementia, can offer relief for some of the challenging symptoms associated with diseases caused by dementia. If people aren’t being appropriately prescribed medications that may offer symptom relief from this challenging disease, it could lead to poorer health outcomes,” said Dr. Hill.

“These data underscore the importance of health disparities research that is crucial in uncovering inequities in dementia treatment, care, and research for Black individuals, as well as all underrepresented populations.

“We must create a society in which the underserved, disproportionately affected, and underrepresented are safe, cared for, and valued. This can be done through enhancing cultural competence in health care settings, improving representation within the health care system, and engaging and building trust with diverse communities,” Dr. Hill said.

The Alzheimer’s Association has partnered with more than 500 diverse community-based groups on disease education programs to ensure families have information and resources to navigate this devastating disease.

The study was supported by the American Academy of Neurology Resident Research Scholarship. Dr. Hawkins and Dr. Hill reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Childhood nightmares a prelude to cognitive problems, Parkinson’s?

Article Type
Changed
Tue, 03/07/2023 - 17:19

 

Children who suffer from persistent bad dreams may be at increased risk for cognitive impairment or Parkinson’s disease (PD) later in life, new research shows.

Compared with children who never had distressing dreams between ages 7 and 11 years, those who had persistent distressing dreams were 76% more likely to develop cognitive impairment and roughly seven times more likely to develop PD by age 50 years.

It’s been shown previously that sleep problems in adulthood, including distressing dreams, can precede the onset of neurodegenerative diseases such as Alzheimer’s disease (AD) or PD by several years, and in some cases decades, study investigator Abidemi Otaiku, BMBS, University of Birmingham (England), told this news organization.

However, no studies have investigated whether distressing dreams during childhood might also be associated with increased risk for cognitive decline or PD.

“As such, these findings provide evidence for the first time that certain sleep problems in childhood (having regular distressing dreams) could be an early indicator of increased dementia and PD risk,” Dr. Otaiku said.

He noted that the findings build on previous studies which showed that regular nightmares in childhood could be an early indicator for psychiatric problems in adolescence, such as borderline personality disorder, attention-deficit/hyperactivity disorder, and psychosis.

The study was published online February 26 in The Lancet journal eClinicalMedicine.

Statistically significant

The longitudinal analysis used data from the 1958 British Birth Cohort Study, a prospective birth cohort that included all people born in Britain during a single week in 1958.

When the children were aged 7 years (in 1965) and 11 years (in 1969), their mothers were asked to report whether the child had experienced “bad dreams or night terrors” in the previous 3 months; cognitive impairment and PD were then assessed at age 50 years (in 2008).

Among the 6,991 children in the analysis (51% girls), 78.2% never had distressing dreams, 17.9% had transient distressing dreams (at age 7 or at age 11 years, but not both), and 3.8% had persistent distressing dreams (at both ages 7 and 11 years).
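Applied to the 6,991 children in the cohort, those rounded percentages correspond to roughly the following group sizes (approximate figures derived here from the reported proportions, not counts reported by the investigators):

$$
0.782 \times 6{,}991 \approx 5{,}467 \ \text{(never)}, \qquad
0.179 \times 6{,}991 \approx 1{,}251 \ \text{(transient)}, \qquad
0.038 \times 6{,}991 \approx 266 \ \text{(persistent)}
$$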

By age 50, 262 participants had developed cognitive impairment, and five had been diagnosed with PD.

After adjusting for all covariates, having more regular distressing dreams during childhood was “linearly and statistically significantly” associated with higher risk of developing cognitive impairment or PD by age 50 years (P = .037). This was the case in both boys and girls.

Compared with children who never had bad dreams, peers who had persistent distressing dreams (at ages 7 and 11 years) had an 85% increased risk for cognitive impairment or PD by age 50 (adjusted odds ratio, 1.85; 95% confidence interval, 1.10-3.11; P = .019).

The associations remained when incident cognitive impairment and incident PD were analyzed separately.

Compared with children who never had distressing dreams, children who had persistent distressing dreams were 76% more likely to develop cognitive impairment by age 50 years (aOR, 1.76; 95% CI, 1.03-2.99; P = .037), and were about seven times more likely to be diagnosed with PD by age 50 years (aOR, 7.35; 95% CI, 1.03-52.73; P = .047).
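As a point of interpretation not spelled out in the report, the “% more likely” phrasing maps onto the adjusted odds ratios as the relative increase in odds:

$$
(\mathrm{aOR} - 1) \times 100\%, \qquad \text{e.g.}\ (1.76 - 1) \times 100\% = 76\%, \quad (1.85 - 1) \times 100\% = 85\%
$$

Strictly speaking, these figures describe odds rather than risk, although the two are close when the outcome is uncommon, as it is here. The very wide confidence interval around the PD estimate (1.03-52.73) reflects the small number of PD cases, only five, in the cohort.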

The linear association was statistically significant for PD (P = .050) and had a trend toward statistical significance for cognitive impairment (P = .074).

 

 

Mechanism unclear

“Early-life nightmares might be causally associated with cognitive impairment and PD, noncausally associated with cognitive impairment and PD, or both. At this stage it remains unclear which of the three options is correct. Therefore, further research on mechanisms is needed,” Dr. Otaiku told this news organization.

“One plausible noncausal explanation is that there are shared genetic factors which predispose individuals to having frequent nightmares in childhood, and to developing neurodegenerative diseases such as AD or PD in adulthood,” he added.

It’s also plausible that having regular nightmares throughout childhood could be a causal risk factor for cognitive impairment and PD by causing chronic sleep disruption, he noted.

“Chronic sleep disruption due to nightmares might lead to impaired glymphatic clearance during sleep – and thus greater accumulation of pathological proteins in the brain, such as amyloid-beta and alpha-synuclein,” Dr. Otaiku said.

Disrupted sleep throughout childhood might also impair normal brain development, which could make children’s brains less resilient to neuropathologic damage, he said.

Clinical implications?

There are established treatments for childhood nightmares, including nonpharmacologic approaches.

“For children who have regular nightmares that lead to impaired daytime functioning, it may well be a good idea for them to see a sleep physician to discuss whether treatment may be needed,” Dr. Otaiku said.

But should doctors treat children with persistent nightmares for the purpose of preventing neurodegenerative diseases in adulthood or psychiatric problems in adolescence?

“It’s an interesting possibility. However, more research is needed to confirm these epidemiological associations and to determine whether or not nightmares are a causal risk factor for these conditions,” Dr. Otaiku concluded.

The study received no external funding. Dr. Otaiku reports no relevant disclosures.

A version of this article first appeared on Medscape.com.
