Is Anxiety a Prodromal Feature of Parkinson’s Disease?

Individuals with anxiety have at least a twofold higher risk of developing Parkinson’s disease than those without anxiety, new research suggested.

Investigators drew on 10 years of data from a UK primary care registry to compare almost 110,000 patients who developed anxiety after the age of 50 years with close to 900,000 matched controls without anxiety.

After adjusting for a variety of sociodemographic, lifestyle, psychiatric, and neurological factors, they found that the risk of developing Parkinson’s disease was double in those with anxiety, compared with controls.

“Anxiety is known to be a feature of the early stages of Parkinson’s disease, but prior to our study, the prospective risk of Parkinson’s in those over the age of 50 with new-onset anxiety was unknown,” colead author Juan Bazo Alvarez, a senior research fellow in the Division of Epidemiology and Health at University College London, London, England, said in a news release.

The study was published online in the British Journal of General Practice.

The presence of anxiety is increased in prodromal Parkinson’s disease, but the prospective risk for Parkinson’s disease in those aged 50 years or older with new-onset anxiety was largely unknown.

Investigators analyzed data from a large UK primary care dataset that includes all people aged between 50 and 99 years who were registered with a participating practice from Jan. 1, 2008, to Dec. 31, 2018.

They identified 109,435 people (35% men) with more than one anxiety record in the database but no record of anxiety in the preceding year or more, along with 878,256 people (37% men) with no history of anxiety who served as the control group.

Features of Parkinson’s disease such as sleep problems, depression, tremor, and impaired balance were then tracked from the point of the anxiety diagnosis until 1 year before the Parkinson’s disease diagnosis.

Among those with anxiety, 331 developed Parkinson’s disease during the follow-up period, with a median time to diagnosis of 4.9 years after the first recorded episode of anxiety.

The incidence of Parkinson’s disease was 1.02 per 1000 person-years (95% CI, 0.92-1.13) in those with anxiety versus 0.49 per 1000 person-years (95% CI, 0.47-0.52) in those without anxiety.

After adjustment for age, sex, social deprivation, lifestyle factors, severe mental illness, head trauma, and dementia, the risk for Parkinson’s disease was double in those with anxiety, compared with the non-anxiety group (hazard ratio, 2.1; 95% CI, 1.9-2.4).

Individuals without anxiety also developed Parkinson’s disease later than those with anxiety.

The researchers identified specific symptoms that were associated with later development of Parkinson’s disease in those with anxiety, including depression, sleep disturbance, fatigue, and cognitive impairment, among others.

“The results suggest that there is a strong association between anxiety and diagnosis of Parkinson’s disease in patients aged over 50 years who present with a new diagnosis of anxiety,” the authors wrote. “This provides evidence for anxiety as a prodromal presentation of Parkinson’s disease.”

Future research “should explore anxiety in relation to other prodromal symptoms and how this symptom complex is associated with the incidence of Parkinson’s disease,” the researchers wrote. Doing so “may lead to earlier diagnosis and better management of Parkinson’s disease.”

This study was funded by the European Union. Individual authors received funding from the National Institute for Health and Care Research and the Alzheimer’s Society Clinical Training Fellowship program. The authors declared no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Benzos Are Hard on the Brain, But Do They Raise Dementia Risk?

New research supports current guidelines cautioning against long-term use of benzodiazepines.

The study of more than 5000 older adults found that benzodiazepine use was associated with an accelerated reduction in the volume of the hippocampus and amygdala — brain regions involved in memory and mood regulation. However, benzodiazepine use overall was not associated with an increased risk for dementia.

The findings suggest that benzodiazepine use “may have subtle, long-term impact on brain health,” lead investigator Frank Wolters, MD, PhD, with Erasmus University Medical Center, Rotterdam, the Netherlands, and colleagues wrote.

The study was published online in BMC Medicine.
 

Conflicting Evidence 

Benzodiazepines are commonly prescribed in older adults for anxiety and sleep disorders. Though the short-term cognitive side effects are well documented, the long-term impact on neurodegeneration and dementia risk remains unclear. Some studies have linked benzodiazepine use to an increased risk for dementia, whereas others have not.

Dr. Wolters and colleagues assessed the effect of benzodiazepine use on long-term dementia risk and on imaging markers of neurodegeneration in 5443 cognitively healthy adults (mean age, 71 years; 57% women) from the population-based Rotterdam Study. 

Benzodiazepine use between 1991 and 2008 was determined using pharmacy dispensing records, and dementia incidence was determined from medical records. 

Half of the participants had used benzodiazepines at any time in the 15 years before baseline (2005-2008): 47% of these users took anxiolytics, 20% took sedative-hypnotics, and 34% took both; 13% were still using the drugs at the baseline assessment.

During an average follow-up of 11 years, 13% of participants developed dementia. 

Overall, use of benzodiazepines was not associated with dementia risk, compared with never-use (hazard ratio [HR], 1.06), irrespective of cumulative dose. 

The risk for dementia was somewhat higher with any use of anxiolytics than with sedative-hypnotics (HR, 1.17 vs HR, 0.92), although neither was statistically significant. The highest risk estimates were observed for high cumulative dose of anxiolytics (HR, 1.33). 

Sensitivity analyses of the two most commonly used anxiolytics found no difference in risk between the short half-life oxazepam and the long half-life diazepam (for ever-use vs never-use: HR, 1.01 for oxazepam and HR, 1.06 for diazepam).
 

Brain Atrophy

The researchers investigated potential associations between benzodiazepine use and brain volumes using brain MRI data from 4836 participants.

They found that current use of a benzodiazepine at baseline was significantly associated with lower total brain volume — as well as lower hippocampus, amygdala, and thalamus volume cross-sectionally — and with accelerated volume loss of the hippocampus and, to a lesser extent, the amygdala longitudinally.

Imaging findings did not differ by type of benzodiazepine used or cumulative dose. 

“Given the availability of effective alternative pharmacological and nonpharmacological treatments for anxiety and sleep problems, it is important to carefully consider the necessity of prolonged benzodiazepine use in light of potential detrimental effects on brain health,” the authors wrote. 
 

Risks Go Beyond the Brain

Commenting on the study, Shaheen Lakhan, MD, PhD, a neurologist and researcher based in Miami, Florida, noted that “chronic benzodiazepine use may reduce neuroplasticity, potentially interfering with the brain’s ability to form new connections and adapt.

“Long-term use can lead to down-regulation of GABA receptors, altering the brain’s natural inhibitory mechanisms and potentially contributing to tolerance and withdrawal symptoms. Prolonged use can also disrupt the balance of various neurotransmitter systems beyond just GABA, potentially affecting mood, cognition, and overall brain function,” said Dr. Lakhan, who was not involved in the study. 

“While the literature is mixed on chronic benzodiazepine use and dementia risk, prolonged use has consistently been associated with accelerated volume loss in certain brain regions, particularly the hippocampus and amygdala,” which are responsible for memory, learning, and emotional regulation, he noted. 

“Beyond cognitive impairments and brain volume loss, chronic benzodiazepine use is associated with tolerance and dependence, potential for abuse, interactions with other drugs, and increased fall risk, especially in older adults,” Dr. Lakhan added.

Current guidelines discourage long-term use of benzodiazepines because of the risk for psychological and physical dependence, falls, and cognitive impairment, especially in older adults. Nevertheless, research shows that 30%-40% of older benzodiazepine users stay on the medication beyond the recommended period of several weeks.

Donovan T. Maust, MD, of the Department of Psychiatry, University of Michigan Medical School, Ann Arbor, said in an interview that these new findings are consistent with other recently published observational research suggesting benzodiazepine use is not linked to dementia risk.

“I realize that such meta-analyses that find a positive relationship between benzodiazepines and dementia are out there, but they include older, less rigorous studies,” said Dr. Maust, who was not part of the new study. “In my opinion, the jury is not still out on this topic. However, there are plenty of other reasons to avoid them — and in particular, starting them — in older adults, most notably the increased risk of fall injury as well as increased overdose risk when taken along with opioids.”

A version of this article first appeared on Medscape.com.

Cardiovascular Health Becoming a Major Risk Factor for Dementia

In the shifting landscape of dementia risk factors, cardiovascular health is now taking precedence.

That’s according to researchers from University College London (UCL) in the United Kingdom, who analyzed 27 papers about dementia with data collected over more than 70 years. They calculated what share of dementia cases was attributable to different risk factors. Their findings were recently published in the Lancet Public Health.

Top risk factors for dementia over the years have been hypertension, obesity, diabetes, education, and smoking, according to a news release on the findings. But the prevalence of risk factors has changed over the decades.

Researchers said smoking and education have become less important risk factors because of “population-level interventions,” such as stop-smoking campaigns and compulsory public education. On the other hand, obesity and diabetes rates have increased and become bigger risk factors.

Hypertension remains the greatest risk factor, even though doctors and public health groups are putting more emphasis on managing the condition, the study said.

“Cardiovascular risk factors may have contributed more to dementia risk over time, so these deserve more targeted action for future dementia prevention efforts,” said Naaheed Mukadam, PhD, an associate professor at UCL and the lead author of the study.

Eliminating modifiable risk factors could theoretically prevent 40% of dementia cases, the release said. 

The CDC says that an estimated 5.8 million people in the United States have Alzheimer’s disease and related dementias, including 5.6 million people ages 65 and older and about 200,000 under age 65. The UCL release said an estimated 944,000 people in the U.K. have dementia.

A version of this article first appeared on WebMD.com.

Revised Criteria for Alzheimer’s Diagnosis, Staging Released

A work group convened by the Alzheimer’s Association has released revised biology-based criteria for the diagnosis and staging of Alzheimer’s disease, including a new biomarker classification system that incorporates fluid and imaging biomarkers as well as an updated disease staging system. 

“Plasma markers are here now, and it’s very important to incorporate them into the criteria for diagnosis,” said senior author Maria C. Carrillo, PhD, Alzheimer’s Association chief science officer and medical affairs lead. 

The revised criteria are the first updates since 2018.

“Defining diseases biologically, rather than based on syndromic presentation, has long been standard in many areas of medicine — including cancer, heart disease, and diabetes — and is becoming a unifying concept common to all neurodegenerative diseases,” lead author Clifford Jack Jr, MD, with Mayo Clinic, Rochester, Minnesota, said in a news release from the Alzheimer’s Association. 

“These updates to the diagnostic criteria are needed now because we know more about the underlying biology of Alzheimer’s and we are able to measure those changes,” Dr. Jack added. 

The 2024 revised criteria for diagnosis and staging of Alzheimer’s disease were published online in Alzheimer’s & Dementia.
 

Core Biomarkers Defined

The revised criteria define Alzheimer’s disease as a biologic process that begins with the appearance of Alzheimer’s disease neuropathologic change (ADNPC) in the absence of symptoms. Progression of the neuropathologic burden leads to the later appearance and progression of clinical symptoms.

The work group organized Alzheimer’s disease biomarkers into three broad categories: (1) core biomarkers of ADNPC, (2) nonspecific biomarkers that are important in Alzheimer’s disease but are also involved in other brain diseases, and (3) biomarkers of diseases or conditions that commonly coexist with Alzheimer’s disease.

Core Alzheimer’s biomarkers are subdivided into Core 1 and Core 2. 

Core 1 biomarkers become abnormal early in the disease course and directly measure either amyloid plaques or phosphorylated tau (p-tau). They include amyloid PET; cerebrospinal fluid (CSF) amyloid beta 42/40 ratio, CSF p-tau181/amyloid beta 42 ratio, and CSF total (t)-tau/amyloid beta 42 ratio; and “accurate” plasma biomarkers, such as p-tau217. 

“An abnormal Core 1 biomarker result is sufficient to establish a diagnosis of Alzheimer’s disease and to inform clinical decision making [sic] throughout the disease continuum,” the work group wrote. 

Core 2 biomarkers become abnormal later in the disease process and are more closely linked with the onset of symptoms. Core 2 biomarkers include tau PET and certain soluble tau fragments associated with tau proteinopathy (eg, MTBR-tau243), as well as pT205 and nonphosphorylated mid-region tau fragments. 

Core 2 biomarkers, when combined with Core 1, may be used to stage biologic disease severity; abnormal Core 2 biomarkers “increase confidence that Alzheimer’s disease is contributing to symptoms,” the work group noted. 

The revised criteria give clinicians “the flexibility to use plasma or PET scans or CSF,” Dr. Carrillo said. “They will have several tools that they can choose from and offer this variety of tools to their patients. We need different tools for different individuals. There will be differences in coverage and access to these diagnostics.” 

The revised criteria also include an integrated biologic and clinical staging scheme that acknowledges the fact that common co-pathologies, cognitive reserve, and resistance may modify relationships between clinical and biologic Alzheimer’s disease stages.

Formal Guidelines to Come 

The work group noted that currently, the clinical use of Alzheimer’s disease biomarkers is intended for the evaluation of symptomatic patients, not cognitively unimpaired individuals.

Disease-targeted therapies have not yet been approved for cognitively unimpaired individuals. For this reason, the work group currently recommends against diagnostic testing in cognitively unimpaired individuals outside the context of observational or therapeutic research studies. 

This recommendation would change in the future if disease-targeted therapies that are currently being evaluated in trials demonstrate a benefit in preventing cognitive decline and are approved for use in preclinical Alzheimer’s disease, they wrote. 

They emphasize that the revised criteria are not intended to provide step-by-step clinical practice guidelines for clinicians. Rather, they provide general principles to inform diagnosis and staging of Alzheimer’s disease that reflect current science.

“This is just the beginning,” said Dr. Carrillo. “This is a gathering of the evidence to date and putting it in one place so we can have a consensus and actually a way to test it and make it better as we add new science.”

This also serves as a “springboard” for the Alzheimer’s Association to create formal clinical guidelines. “That will come, hopefully, over the next 12 months. We’ll be working on it, and we hope to have that in 2025,” Dr. Carrillo said. 

The revised criteria also emphasize the role of the clinician. 

“The biologically based diagnosis of Alzheimer’s disease is meant to assist, rather than supplant, the clinical evaluation of individuals with cognitive impairment,” the work group wrote in a related commentary published online in Nature Medicine.

Recent diagnostics and therapeutic developments “herald a virtuous cycle in which improvements in diagnostic methods enable more sophisticated treatment approaches, which in turn steer advances in diagnostic methods,” they continued. “An unchanging principle, however, is that effective treatment will always rely on the ability to diagnose and stage the biology driving the disease process.”

Funding for this research was provided by the National Institutes of Health, Alexander family professorship, GHR Foundation, Alzheimer’s Association, Veterans Administration, Life Molecular Imaging, Michael J. Fox Foundation for Parkinson’s Research, Avid Radiopharmaceuticals, Eli Lilly, Gates Foundation, Biogen, C2N Diagnostics, Eisai, Fujirebio, GE Healthcare, Roche, National Institute on Aging, Roche/Genentech, BrightFocus Foundation, Hoffmann-La Roche, Novo Nordisk, Toyama, National MS Society, Alzheimer Drug Discovery Foundation, and others. A complete list of donors and disclosures is included in the original article.

A version of this article appeared on Medscape.com.

Common Cognitive Test Falls Short for Concussion Diagnosis

A tool routinely used to evaluate concussion in college athletes fails to accurately diagnose the condition in many cases, a new study showed.

Investigators found that almost half of athletes diagnosed with a concussion tested normally on the Sports Concussion Assessment Tool 5 (SCAT5), the recommended tool for measuring cognitive skills in concussion evaluations. The most accurate measure of concussion was symptoms reported by the athletes.

“If you don’t do well on the cognitive exam, it suggests you have a concussion. But many people who are concussed do fine on the exam,” lead author Kimberly Harmon, MD, professor of family medicine and section head of sports medicine at the University of Washington School of Medicine, Seattle, said in a news release.

The study was published online in JAMA Network Open.

Introduced in 2004, the SCAT was created to standardize the collection of information clinicians use to diagnose concussion, including evaluation of symptoms, orientation, and balance. It also uses a 10-word list to assess immediate memory and delayed recall.

Dr. Harmon’s own experiences as a team physician led her to wonder about the accuracy of the cognitive screening portion of the SCAT. She saw that “some people were concussed, and they did well on the recall test. Some people weren’t concussed, and they didn’t do well. So I thought we should study it,” she said.

Investigators compared 92 National Collegiate Athletic Association (NCAA) Division 1 athletes who sustained a concussion between 2020 and 2022 and underwent a concussion evaluation within 48 hours with 92 matched nonconcussed teammates (overall cohort, 52% men). Most concussions occurred in those who played football, followed by volleyball.

All athletes had previously completed NCAA-required baseline concussion screenings. Participants completed the SCAT5 screening test within 2 weeks of the incident concussion.

No significant differences were found between the baseline scores of athletes with and without concussion. Moreover, responses on the word recall section of the SCAT5 held little predictive value for concussion.

Nearly half (45%) of athletes with concussion performed at or even above their baseline cognitive scores, which the authors said highlights the limitations of the cognitive components of the SCAT5.
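
As a rough illustration of what that 45% figure implies (this is not a calculation reported by the study), treating a below-baseline score as a “positive” cognitive screen means the concussed athletes who scored at or above baseline count as false negatives, capping the sensitivity of the cognitive component at roughly 55%. The sketch below assumes the reported group size of 92 concussed athletes and the 45% figure; the rounded counts are approximations.

```python
# Illustrative back-of-the-envelope calculation, not taken from the study's analysis.
# Assumption: a below-baseline score counts as a "positive" cognitive screen,
# so concussed athletes scoring at or above baseline are false negatives.

concussed_total = 92              # concussed athletes reported in the study
at_or_above_baseline = 0.45       # fraction scoring at or above their own baseline

false_negatives = round(concussed_total * at_or_above_baseline)   # ~41
true_positives = concussed_total - false_negatives                # ~51
sensitivity = true_positives / concussed_total

print(f"Estimated false negatives: {false_negatives} of {concussed_total}")
print(f"Implied sensitivity of the cognitive screen: {sensitivity:.0%}")   # ~55%
```

By contrast, symptom report was the most sensitive indicator in the study, which is why the authors emphasize it over the cognitive score.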

The most accurate predictor of concussion was participants’ responses to questions about their symptoms.

“If you get hit in the head and go to the sideline and say, ‘I have a headache, I’m dizzy, I don’t feel right,’ I can say with pretty good assurance that you have a concussion,” Dr. Harmon continued. “I don’t need to do any testing.”

Unfortunately, the problem is “that some athletes don’t want to come out. They don’t report their symptoms or may not recognize their symptoms. So then you need an objective, accurate test to tell you whether you can safely put the athlete back on the field. We don’t have that right now.”

The study did not control for concussion history, and the all–Division 1 cohort means the findings may not be generalizable to other athletes.

Nevertheless, investigators said the study “affirms that reported symptoms are the most sensitive indicator of concussion, and there are limitations to the objective cognitive testing included in the SCAT.” They concluded that concussion “remains a clinical diagnosis that should be based on a thorough review of signs, symptoms, and clinical findings.”

This study was funded in part by donations from University of Washington alumni Jack and Luellen Cherneski and the Chisholm Foundation. Dr. Harmon reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM JAMA NETWORK OPEN


Form of B12 Deficiency Affecting the Central Nervous System May Be New Autoimmune Disease

Article Type
Changed
Mon, 07/01/2024 - 13:53

Researchers have identified a form of B12 deficiency caused by autoantibodies that specifically affects the central nervous system.

The autoantibody, discovered while investigating the puzzling case of a patient with unexplained neurological symptoms, was also detected in a small percentage of healthy individuals and was nearly four times as prevalent in patients with neuropsychiatric systemic lupus erythematosus (SLE).

“I didn’t think this single investigation was going to yield a broader phenomenon with other patients,” lead author John V. Pluvinage, MD, PhD, a neurology resident at the University of California San Francisco, said in an interview. “It started as an N-of-one study just based on scientific curiosity.”

“It’s a beautifully done study,” added Betty Diamond, MD, director of the Institute of Molecular Medicine at the Feinstein Institutes for Medical Research in Manhasset, New York, commenting on the research. It uncovers “yet another example of a disease where antibodies getting into the brain are the problem.”

The research was published in Science Translational Medicine.
 

The Patient

The investigation began in 2014 with a 67-year-old woman presenting with difficulty speaking, ataxia, and tremor. Her blood tests showed no signs of B12 deficiency, and testing for known autoantibodies came back negative.

Solving this mystery required a more exhaustive approach. The patient enrolled in a research study focused on identifying novel autoantibodies in suspected neuroinflammatory disease, using a screening technology called phage immunoprecipitation sequencing.

“We adapted this technology to screen for autoantibodies in an unbiased manner by displaying every peptide across the human proteome and then mixing those peptides with patient antibodies in order to figure out what the antibodies are binding to,” explained Dr. Pluvinage.

Using this method, he and colleagues discovered that this woman had autoantibodies that target CD320 — a receptor important in the cellular uptake of B12. While her blood tests were normal, B12 in the patient’s cerebrospinal fluid (CSF) was “nearly undetectable,” Dr. Pluvinage said. Using an in vitro model of the blood-brain barrier (BBB), the researchers determined that anti-CD320 impaired the transport of B12 across the BBB by targeting receptors on the cell surface.

Treating the patient with a combination of immunosuppressant medication and high-dose B12 supplementation increased B12 levels in the patient’s CSF and improved clinical symptoms.
 

Identifying More Cases

Dr. Pluvinage and colleagues tested the 254 other individuals enrolled in the neuroinflammatory disease study and identified seven participants with CSF anti-CD320 autoantibodies — four of whom had low B12 in the CSF.

In a group of healthy controls, anti-CD320 seropositivity was 6%, similar to the positivity rate in 132 paired serum and CSF samples from a cohort of patients with multiple sclerosis (5.7%). In this group of patients with multiple sclerosis, anti-CD320 presence in the blood was highly predictive of high levels of CSF methylmalonic acid, a metabolic marker of B12 deficiency.

Researchers also screened for anti-CD320 seropositivity in 408 patients with non-neurologic SLE and 28 patients with neuropsychiatric SLE and found that the autoantibody was nearly four times as prevalent in patients with neurologic symptoms (21.4%) as in those with non-neurologic SLE (5.6%).
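
The “nearly four times” figure follows directly from those percentages. A minimal sketch of that arithmetic is below; the raw counts are reconstructed from the reported percentages and group sizes (6 of 28 and roughly 23 of 408) rather than quoted from the paper.

```python
# Quick check of the "nearly four times as prevalent" comparison.
# Counts are inferred from the reported percentages and group sizes,
# not quoted from the paper: 6/28 ≈ 21.4%, 23/408 ≈ 5.6%.

neuro_positive, neuro_total = 6, 28              # neuropsychiatric SLE
non_neuro_positive, non_neuro_total = 23, 408    # non-neurologic SLE

p_neuro = neuro_positive / neuro_total                # ≈ 0.214
p_non_neuro = non_neuro_positive / non_neuro_total    # ≈ 0.056

prevalence_ratio = p_neuro / p_non_neuro
print(f"Anti-CD320 seropositivity: {p_neuro:.1%} vs {p_non_neuro:.1%}")
print(f"Prevalence ratio: {prevalence_ratio:.1f}")    # ≈ 3.8, i.e., nearly fourfold
```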

“The clinical relevance of anti-CD320 in healthy controls remains uncertain,” the authors wrote. However, it is not uncommon for healthy individuals to carry known autoantibodies.

“There are always people who have autoantibodies who don’t get disease, and why that is we don’t know,” said Dr. Diamond. Some individuals may develop clinical symptoms later, or there may be other reasons why they are protected against disease.

Dr. Pluvinage is eager to follow some seropositive healthy individuals to track their neurologic health over time and see whether the presence of anti-CD320 “alters their neurologic trajectories.”
 

 

 

Alternative Pathways

Lastly, Dr. Pluvinage and colleagues set out to explain why patients with anti-CD320 in their blood did not show any signs of B12 deficiency. They hypothesized that another receptor may be compensating and still allowing blood cells to take up B12. Using CRISPR screening, the team identified the low-density lipoprotein receptor as an alternative pathway to B12 uptake.

“These findings suggest a model in which anti-CD320 impairs transport of B12 across the BBB, leading to autoimmune B12 central deficiency (ABCD) with varied neurologic manifestations but sparing peripheral manifestations of B12 deficiency,” the authors wrote.

The work was supported by the National Institute of Mental Health, National Center for Chronic Disease Prevention and Health Promotion, Department of Defense, UCSF Helen Diller Family Comprehensive Cancer Center Laboratory for Cell Analysis Shared Resource Facility, National Multiple Sclerosis Society, Valhalla Foundation, and the Westridge Foundation. Dr. Pluvinage is a co-inventor on a patent application related to this work. Dr. Diamond had no relevant disclosures.

A version of this article first appeared on Medscape.com.


FROM SCIENCE TRANSLATIONAL MEDICINE


New Clues on How Blast Exposure May Lead to Alzheimer’s Disease

Article Type
Changed
Mon, 06/24/2024 - 13:22

In October 2023, Robert Card — a grenade instructor in the Army Reserve — shot and killed 18 people in Maine, before turning the gun on himself. As reported by The New York Times, his family said that he had become increasingly erratic and violent during the months before the rampage.

A postmortem conducted by the Chronic Traumatic Encephalopathy (CTE) Center at Boston University found “significant evidence of traumatic brain injuries” [TBIs] and “significant degeneration, axonal and myelin loss, inflammation, and small blood vessel injury” in the white matter, the center’s director, Ann McKee, MD, said in a press release. “These findings align with our previous studies on the effects of blast injury in humans and experimental models.”

Members of the military, such as Mr. Card, are exposed to blasts from repeated firing of heavy weapons not only during combat but also during training.

New data suggest that repeated blast exposure may impair the brain’s waste clearance system, leading to biomarker changes indicative of preclinical Alzheimer’s disease 20 years earlier than typical. A higher index of suspicion for dementia or Alzheimer’s disease may be warranted in patients with a history of blast exposure or subconcussive brain injury who present with cognitive issues, according to experts interviewed.

In 2022, the US Department of Defense (DOD) launched its Warfighter Brain Health Initiative with the aim of “optimizing service member brain health and countering traumatic brain injuries.”

In April 2024, the Blast Overpressure Safety Act was introduced in the Senate to require the DOD to enact better blast screening, tracking, prevention, and treatment. The DOD initiated 26 blast overpressure studies.

Heather Snyder, PhD, Alzheimer’s Association vice president of Medical and Scientific Relations, said that an important component of that research involves “the need to study the difference between TBI-caused dementia and dementia caused independently” and “the need to study biomarkers to better understand the long-term consequences of TBI.”
 

What Is the Underlying Biology?

Dr. Snyder was the lead author of a white paper produced by the Alzheimer’s Association in 2018 on military-related risk factors for Alzheimer’s disease and related dementias. “There is a lot of work trying to understand the effect of pure blast waves on the brain, as opposed to the actual impact of the injury,” she said.

The white paper speculated that blast exposure may be analogous to subconcussive brain injury in athletes, in which there are no obvious immediate clinical symptoms or neurological dysfunction but cumulative injury and functional impairment can develop over time.

“We are also trying to understand the underlying biology around brain changes, such as accumulation of tau and amyloid and other specific markers related to brain changes in Alzheimer’s disease,” said Dr. Snyder, chair of the Peer Reviewed Alzheimer’s Research Program Programmatic Panel for Alzheimer’s Disease/Alzheimer’s Disease and Related Dementias and TBI.
 

Common Biomarker Signatures

A recent study in Neurology comparing 51 veterans with mild TBI (mTBI) with 85 veterans and civilians with no lifetime history of TBI is among the first to explore these biomarker changes in human beings.

“Our findings suggest that chronic neuropathologic processes associated with blast mTBI share properties in common with pathogenic processes that are precursors to Alzheimer’s disease onset,” said coauthor Elaine R. Peskind, MD, professor of psychiatry and behavioral sciences, University of Washington, Seattle.

The largely male participants had a mean age of 34 years and underwent standardized clinical and neuropsychological testing as well as lumbar puncture to collect cerebrospinal fluid (CSF). The mTBI group had experienced at least one war zone blast or combined blast/impact that met criteria for mTBI, and 91% had more than one blast mTBI; the study took place over 13 years.

The researchers found that the mTBI group “had biomarker signatures in common with the earliest stages of Alzheimer’s disease,” said Dr. Peskind.

For example, at age 50, they had lower mean levels of CSF amyloid beta 42 (Abeta42), the earliest marker of brain parenchymal Abeta deposition, compared with the control group (154 pg/mL and 1864 pg/mL lower, respectively).

High CSF phosphorylated tau181 (p-tau181) and total tau are established biomarkers for Alzheimer’s disease. However, levels of these biomarkers remained “relatively constant with age” in participants with mTBI but were higher in older ages for the non-TBI group.

The mTBI group also showed worse cognitive performance at older ages (P < .08). Poorer verbal memory and verbal fluency performance were associated with lower CSF Abeta42 in older participants (P ≤ .05).

In Alzheimer’s disease, a reduction in CSF Abeta42 may occur up to 20 years before the onset of clinical symptoms, according to Dr. Peskind. “But what we don’t know from this study is what this means, as total tau protein and p-tau181 in the CSF were also low, which isn’t entirely typical in the picture of preclinical Alzheimer’s disease,” she said. However, changes in total tau and p-tau181 lag behind changes in Abeta42.
 

 

 

Is Impaired Clearance the Culprit?

Coauthor Jeffrey Iliff, PhD, professor, University of Washington Department of Psychiatry and Behavioral Sciences and University of Washington Department of Neurology, Seattle, elaborated.

“In the setting of Alzheimer’s disease, a signature of the disease is reduced CSF Abeta42, which is thought to reflect that much of the amyloid gets ‘stuck’ in the brain in the form of amyloid plaques,” he said. “There are usually higher levels of phosphorylated tau and total tau, which are thought to reflect the presence of tau tangles and degeneration of neurons in the brain. But in this study, all of those were lowered, which is not exactly an Alzheimer’s disease profile.”

Dr. Iliff, associate director for research, VA Northwest Mental Illness Research, Education, and Clinical Center at VA Puget Sound Health Care System, Seattle, suggested that the culprit may be impairment in the brain’s glymphatic system. “Recently described biological research supports [the concept of] clearance of waste out of the brain during sleep via the glymphatic system, with amyloid and tau being cleared from the brain interstitium during sleep.”

A recent hypothesis is that blast TBI impairs that process. “This is why we see less of those proteins in the CSF. They’re not being cleared, which might contribute downstream to the clumping up of protein in the brain,” he suggested.

The evidence base corroborating that hypothesis is in its infancy; however, new research conducted by Dr. Iliff and his colleagues sheds light on this potential mechanism.

In blast TBI, energy from the explosion and resulting overpressure wave are “transmitted through the brain, which causes tissues of different densities — such as gray and white matter — to accelerate at different rates,” according to Dr. Iliff. This results in the shearing and stretching of brain tissue, leading to a “diffuse pattern of tissue damage.”

It is known that blast TBI has clinical overlap and associations with posttraumatic stress disorder (PTSD), depression, and persistent neurobehavioral symptoms; that veterans with a history of TBI are more than twice as likely to die by suicide than veterans with no TBI history; and that TBI may increase the risk for Alzheimer’s disease and related dementing disorders, as well as CTE.

The missing link may be the glymphatic system — a “brain-wide network of perivascular pathways, along which CSF and interstitial fluid (ISF) exchange, supporting the clearance of interstitial solutes, including amyloid-beta.”

Dr. Iliff and his group previously found that glymphatic function is “markedly and chronically impaired” following impact TBI in mice and that this impairment is associated with the mislocalization of astroglial aquaporin 4 (AQP4), a water channel that lines perivascular spaces and plays a role in healthy glymphatic exchange.

In their new study, the researchers examined both the expression and the localization of AQP4 in the human postmortem frontal cortex and found “distinct laminar differences” in AQP4 expression following blast exposure. They observed similar changes as well as impairment of glymphatic function, which emerged 28 days following blast injury in a mouse model of repetitive blast mTBI.

And in a cohort of veterans with blast mTBI, blast exposure was found to be associated with an increased burden of frontal cortical MRI-visible perivascular spaces — a “putative neuroimaging marker” of glymphatic perivascular dysfunction.

The earlier Neurology study “showed impairment of biomarkers in the CSF, but the new study showed ‘why’ or ‘how’ these biomarkers are impaired, which is via impairment of the glymphatic clearance process,” Dr. Iliff explained.
 

 

 

Veterans Especially Vulnerable

Dr. Peskind, co-director of the VA Northwest Mental Illness Research, Education and Clinical Center, VA Puget Sound Health Care System, noted that while the veterans in the earlier study had at least one TBI, the average number was 20, and it was more common to have more than 50 mTBIs than to have a single one.

“These were highly exposed combat vets,” she said. “And that number doesn’t even account for subconcussive exposure to blasts, which now appear to cause detectable brain damage, even in the absence of a diagnosable TBI.”

The Maine shooter, Mr. Card, had not seen combat and was not assessed for TBI during a psychiatric hospitalization, according to The New York Times.

Dr. Peskind added that this type of blast damage is likely specific to individuals in the military. “It isn’t the sound that causes the damage,” she explained. “It’s the blast wave, the pressure wave, and there aren’t a lot of other occupations that have those types of occupational exposures.”

Dr. Snyder added that the majority of blast TBIs have been studied in military personnel, and she is not aware of studies that have looked at blast injuries in other industries, such as demolition or mining, to see if they have the same type of biologic consequences.

Dr. Snyder hopes that the researchers will follow the participants in the Neurology study and continue looking at specific markers related to Alzheimer’s disease brain changes. What the research so far shows “is that, at an earlier age, we’re starting to see those markers changing, suggesting that the underlying biology in people with mild blast TBI is similar to the underlying biology in Alzheimer’s disease as well.”

Michael Alosco, PhD, associate professor and vice chair of research, department of neurology, Boston University Chobanian & Avedisian School of Medicine, called the issue of blast exposure and TBI “a very complex and nuanced topic,” especially because TBI is “considered a risk factor of Alzheimer’s disease” and “different types of TBIs could trigger distinct pathophysiologic processes; however, the long-term impact of repetitive blast TBIs on neurodegenerative disease changes remains unknown.”

He coauthored an editorial on the earlier Neurology study that noted its limitations, such as a small sample size and lack of consideration of lifestyle and health factors, but acknowledged that the “findings provide preliminary evidence that repetitive blast exposures might influence beta-amyloid accumulation.”
 

Clinical Implications

For Dr. Peskind, the “inflection point” was seeing lower CSF Abeta42 about 20 years earlier than ages 60 and 70, when it more typically appears in cognitively normal community volunteers.

But she described herself as “loath to say that veterans or service members have a 20-year acceleration of risk of Alzheimer’s disease,” adding, “I don’t want to scare the heck out of our service members or veterans.” Although “this is what we fear, we’re not ready to say it for sure yet because we need to do more work. Nevertheless, it does increase the index of suspicion.”

The clinical take-home messages are not unique to service members or veterans or people with a history of head injuries or a genetic predisposition to Alzheimer’s disease, she emphasized. “If anyone of any age or occupation comes in with cognitive issues, such as [impaired] memory or executive function, they deserve a workup for dementing disorders.” Frontotemporal dementia, for example, can present earlier than Alzheimer’s disease typically does.

Common comorbidities with TBI are PTSD and obstructive sleep apnea (OSA), which can also cause cognitive issues and are also risk factors for dementia.

Dr. Iliff agreed. “If you see a veteran with a history of PTSD, a history of blast TBI, and a history of OSA or some combination of those three, I recommend having a higher index of suspicion [for potential dementia] than for an average person without any of these, even at a younger age than one would ordinarily expect.”

Of all of these factors, the only truly directly modifiable one is sleep disruption, including that caused by OSA or sleep disorders related to PTSD, he added. “Epidemiologic data suggest a connection particularly between midlife sleep disruption and the risk of dementia and Alzheimer’s disease, and so it’s worth thinking about sleep as a modifiable risk factor even as early as the 40s and 50s, whether the patient is or isn’t a veteran.”

Dr. Peskind recommended asking patients, “Do they snore? Do they thrash about during sleep? Do they have trauma nightmares? This will inform the type of intervention required.”

Dr. Alosco added that there is no known “safe” threshold of exposure to blasts, and that thresholds are “unclear, particularly at the individual level.” In American football, there is a dose-response relationship between years of play and risk for later-life neurological disorder. “The best way to mitigate risk is to limit cumulative exposure,” he said.

The study by Li and colleagues was funded by grant funding from the Department of Veterans Affairs Rehabilitation Research and Development Service and the University of Washington Friends of Alzheimer’s Research. Other sources of funding to individual researchers are listed in the original paper. The study by Braun and colleagues was supported by the National Heart, Lung and Blood Institute; the Department of Veterans Affairs Rehabilitation Research and Development Service; and the National Institute on Aging. The white paper included studies that received funding from numerous sources, including the National Institutes of Health and the DOD. Dr. Iliff serves as the chair of the Scientific Advisory Board for Applied Cognition Inc., from which he receives compensation and in which he holds an equity stake. In the last year, he served as a paid consultant to Gryphon Biosciences. Dr. Peskind has served as a paid consultant to the companies Genentech, Roche, and Alpha Cognition. Dr. Alosco was supported by grant funding from the NIH; he received research support from Rainwater Charitable Foundation Inc., and Life Molecular Imaging Inc.; he has received a single honorarium from the Michael J. Fox Foundation for services unrelated to this editorial; and he received royalties from Oxford University Press Inc. The other authors’ disclosures are listed in the original papers.
 

A version of this article appeared on Medscape.com.


In October 2023, Robert Card — a grenade instructor in the Army Reserve — shot and killed 18 people in Maine, before turning the gun on himself. As reported by The New York Times, his family said that he had become increasingly erratic and violent during the months before the rampage.

A postmortem conducted by the Chronic Traumatic Encephalopathy (CTE) Center at Boston University found “significant evidence of traumatic brain injuries” [TBIs] and “significant degeneration, axonal and myelin loss, inflammation, and small blood vessel injury” in the white matter, the center’s director, Ann McKee, MD, said in a press release. “These findings align with our previous studies on the effects of blast injury in humans and experimental models.”

Members of the military, such as Mr. Card, are exposed to blasts from repeated firing of heavy weapons not only during combat but also during training.

New data suggest that repeated blast exposure may impair the brain’s waste clearance system, leading to biomarker changes indicative of preclinical Alzheimer’s disease 20 years earlier than typical. A higher index of suspicion for dementia or Alzheimer’s disease may be warranted in patients with a history of blast exposure or subconcussive brain injury who present with cognitive issues, according to experts interviewed.

In 2022, the US Department of Defense (DOD) launched its Warfighter Brain Health Initiative with the aim of “optimizing service member brain health and countering traumatic brain injuries.”

In April 2024, the Blast Overpressure Safety Act was introduced in the Senate to require the DOD to enact better blast screening, tracking, prevention, and treatment. The DOD initiated 26 blast overpressure studies.

Heather Snyder, PhD, Alzheimer’s Association vice president of Medical and Scientific Relations, said that an important component of that research involves “the need to study the difference between TBI-caused dementia and dementia caused independently” and “the need to study biomarkers to better understand the long-term consequences of TBI.”
 

What Is the Underlying Biology?

Dr. Snyder was the lead author of a white paper produced by the Alzheimer’s Association in 2018 on military-related risk factors for Alzheimer’s disease and related dementias. “There is a lot of work trying to understand the effect of pure blast waves on the brain, as opposed to the actual impact of the injury,” she said.

The white paper speculated that blast exposure may be analogous to subconcussive brain injury in athletes, in which there are no obvious immediate clinical symptoms or neurological dysfunction but cumulative injury and functional impairment can develop over time.

“We are also trying to understand the underlying biology around brain changes, such as accumulation of tau and amyloid and other specific markers related to brain changes in Alzheimer’s disease,” said Dr. Snyder, chair of the Peer Reviewed Alzheimer’s Research Program Programmatic Panel for Alzheimer’s Disease/Alzheimer’s Disease and Related Dementias and TBI.
 

Common Biomarker Signatures

A recent study in Neurology comparing 51 veterans with mild TBI (mTBI) with 85 veterans and civilians with no lifetime history of TBI is among the first to explore these biomarker changes in human beings.

“Our findings suggest that chronic neuropathologic processes associated with blast mTBI share properties in common with pathogenic processes that are precursors to Alzheimer’s disease onset,” said coauthor Elaine R. Peskind, MD, professor of psychiatry and behavioral sciences, University of Washington, Seattle.

The largely male participants had a mean age of 34 years and underwent standardized clinical and neuropsychological testing as well as lumbar puncture to collect cerebrospinal fluid (CSF). Each member of the mTBI group had experienced at least one war zone blast or combined blast/impact that met criteria for mTBI, and 91% had experienced more than one blast mTBI; the study took place over 13 years.

The researchers found that the mTBI group “had biomarker signatures in common with the earliest stages of Alzheimer’s disease,” said Dr. Peskind.

For example, at age 50, they had lower mean levels of CSF amyloid beta 42 (Abeta42), the earliest marker of brain parenchymal Abeta deposition, compared with the control group (154 pg/mL and 1864 pg/mL lower, respectively).

High CSF phosphorylated tau181 (p-tau181) and total tau are established biomarkers for Alzheimer’s disease. However, levels of these biomarkers remained “relatively constant with age” in participants with mTBI but were higher in older ages for the non-TBI group.

The mTBI group also showed worse cognitive performance at older ages (P < .08). Poorer verbal memory and verbal fluency performance were associated with lower CSF Abeta42 in older participants (P ≤ .05).

In Alzheimer’s disease, a reduction in CSF Abeta42 may occur up to 20 years before the onset of clinical symptoms, according to Dr. Peskind. “But what we don’t know from this study is what this means, as total tau protein and p-tau181 in the CSF were also low, which isn’t entirely typical in the picture of preclinical Alzheimer’s disease,” she said. However, changes in total tau and p-tau181 lag behind changes in Abeta42.
 

 

 

Is Impaired Clearance the Culprit?

Coauthor Jeffrey Iliff, PhD, professor, University of Washington Department of Psychiatry and Behavioral Sciences and University of Washington Department of Neurology, Seattle, elaborated.

“In the setting of Alzheimer’s disease, a signature of the disease is reduced CSF Abeta42, which is thought to reflect that much of the amyloid gets ‘stuck’ in the brain in the form of amyloid plaques,” he said. “There are usually higher levels of phosphorylated tau and total tau, which are thought to reflect the presence of tau tangles and degeneration of neurons in the brain. But in this study, all of those were lowered, which is not exactly an Alzheimer’s disease profile.”
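The contrast Dr. Iliff describes can be reduced to a simple pattern check. The sketch below is our own illustration of that logic; the inputs are qualitative flags rather than real cutoffs, which vary by assay and laboratory.

    # Illustrative only: a toy classifier for the CSF patterns described above.
    # The labels are ours, not the study's, and no clinical thresholds are implied.
    def csf_pattern(abeta42_low: bool, ptau_high: bool, ttau_high: bool) -> str:
        if abeta42_low and (ptau_high or ttau_high):
            return "typical Alzheimer's-like profile (amyloid held in plaques, tau elevated)"
        if abeta42_low and not ptau_high and not ttau_high:
            return "pattern reported here for blast mTBI (all markers low, possibly impaired clearance)"
        return "not suggestive of either pattern"

    print(csf_pattern(abeta42_low=True, ptau_high=True, ttau_high=True))
    print(csf_pattern(abeta42_low=True, ptau_high=False, ttau_high=False))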

Dr. Iliff, associate director for research, VA Northwest Mental Illness Research, Education, and Clinical Center at VA Puget Sound Health Care System, Seattle, suggested that the culprit may be impairment in the brain’s glymphatic system. “Recently described biological research supports [the concept of] clearance of waste out of the brain during sleep via the glymphatic system, with amyloid and tau being cleared from the brain interstitium during sleep.”

A recent hypothesis is that blast TBI impairs that process. “This is why we see less of those proteins in the CSF. They’re not being cleared, which might contribute downstream to the clumping up of protein in the brain,” he suggested.

The evidence base corroborating that hypothesis is in its infancy; however, new research conducted by Dr. Iliff and his colleagues sheds light on this potential mechanism.

In blast TBI, energy from the explosion and resulting overpressure wave are “transmitted through the brain, which causes tissues of different densities — such as gray and white matter — to accelerate at different rates,” according to Dr. Iliff. This results in the shearing and stretching of brain tissue, leading to a “diffuse pattern of tissue damage.”

It is known that blast TBI has clinical overlap and associations with posttraumatic stress disorder (PTSD), depression, and persistent neurobehavioral symptoms; that veterans with a history of TBI are more than twice as likely to die by suicide as veterans with no TBI history; and that TBI may increase the risk for Alzheimer’s disease and related dementing disorders, as well as CTE.

The missing link may be the glymphatic system — a “brain-wide network of perivascular pathways, along which CSF and interstitial fluid (ISF) exchange, supporting the clearance of interstitial solutes, including amyloid-beta.”

Dr. Iliff and his group previously found that glymphatic function is “markedly and chronically impaired” following impact TBI in mice and that this impairment is associated with the mislocalization of astroglial aquaporin 4 (AQP4), a water channel that lines perivascular spaces and plays a role in healthy glymphatic exchange.

In their new study, the researchers examined both the expression and the localization of AQP4 in the human postmortem frontal cortex and found “distinct laminar differences” in AQP4 expression following blast exposure. They observed similar changes as well as impairment of glymphatic function, which emerged 28 days following blast injury in a mouse model of repetitive blast mTBI.

And in a cohort of veterans with blast mTBI, blast exposure was found to be associated with an increased burden of frontal cortical MRI-visible perivascular spaces — a “putative neuroimaging marker” of glymphatic perivascular dysfunction.

The earlier Neurology study “showed impairment of biomarkers in the CSF, but the new study showed ‘why’ or ‘how’ these biomarkers are impaired, which is via impairment of the glymphatic clearance process,” Dr. Iliff explained.
 

 

 

Veterans Especially Vulnerable

Dr. Peskind, co-director of the VA Northwest Mental Illness Research, Education and Clinical Center, VA Puget Sound Health Care System, noted that while the veterans in the earlier study had at least one TBI, the average number was 20, and it was more common to have more than 50 mTBIs than to have a single one.

“These were highly exposed combat vets,” she said. “And that number doesn’t even account for subconcussive exposure to blasts, which now appear to cause detectable brain damage, even in the absence of a diagnosable TBI.”

The Maine shooter, Mr. Card, had not seen combat and was not assessed for TBI during a psychiatric hospitalization, according to The New York Times.

Dr. Peskind added that this type of blast damage is likely specific to individuals in the military. “It isn’t the sound that causes the damage,” she explained. “It’s the blast wave, the pressure wave, and there aren’t a lot of other occupations that have those types of occupational exposures.”

Dr. Snyder added that the majority of blast TBIs have been studied in military personnel, and she is not aware of studies that have looked at blast injuries in other industries, such as demolition or mining, to see if they have the same type of biologic consequences.

Dr. Snyder hopes that the researchers will follow the participants in the Neurology study and continue looking at specific markers related to Alzheimer’s disease brain changes. What the research so far shows “is that, at an earlier age, we’re starting to see those markers changing, suggesting that the underlying biology in people with mild blast TBI is similar to the underlying biology in Alzheimer’s disease as well.”

Michael Alosco, PhD, associate professor and vice chair of research, department of neurology, Boston University Chobanian & Avedisian School of Medicine, called the issue of blast exposure and TBI “a very complex and nuanced topic,” especially because TBI is “considered a risk factor of Alzheimer’s disease” and “different types of TBIs could trigger distinct pathophysiologic processes; however, the long-term impact of repetitive blast TBIs on neurodegenerative disease changes remains unknown.”

He coauthored an editorial on the earlier Neurology study that noted its limitations, such as a small sample size and a lack of consideration of lifestyle and health factors, but acknowledged that the “findings provide preliminary evidence that repetitive blast exposures might influence beta-amyloid accumulation.”
 

Clinical Implications

For Dr. Peskind, the “inflection point” was seeing lower CSF Abeta42 roughly 20 years earlier than expected; in cognitively normal community volunteers, that change more typically appears at ages 60 and 70.

But she described herself as “loath to say that veterans or service members have a 20-year acceleration of risk of Alzheimer’s disease,” adding, “I don’t want to scare the heck out of our service members or veterans.” Although “this is what we fear, we’re not ready to say it for sure yet because we need to do more work. Nevertheless, it does increase the index of suspicion.”

The clinical take-home messages are not unique to service members or veterans or people with a history of head injuries or a genetic predisposition to Alzheimer’s disease, she emphasized. “If anyone of any age or occupation comes in with cognitive issues, such as [impaired] memory or executive function, they deserve a workup for dementing disorders.” Frontotemporal dementia, for example, can present earlier than Alzheimer’s disease typically does.

Common comorbidities with TBI are PTSD and obstructive sleep apnea (OSA), which can also cause cognitive issues and are also risk factors for dementia.

Dr. Iliff agreed. “If you see a veteran with a history of PTSD, a history of blast TBI, and a history of OSA or some combination of those three, I recommend having a higher index of suspicion [for potential dementia] than for an average person without any of these, even at a younger age than one would ordinarily expect.”

Of all of these factors, the only truly directly modifiable one is sleep disruption, including that caused by OSA or sleep disorders related to PTSD, he added. “Epidemiologic data suggest a connection particularly between midlife sleep disruption and the risk of dementia and Alzheimer’s disease, and so it’s worth thinking about sleep as a modifiable risk factor even as early as the 40s and 50s, whether the patient is or isn’t a veteran.”

Dr. Peskind recommended asking patients, “Do they snore? Do they thrash about during sleep? Do they have trauma nightmares? This will inform the type of intervention required.”

Dr. Alosco added that there is no known “safe” threshold of exposure to blasts, and that thresholds are “unclear, particularly at the individual level.” In American football, there is a dose-response relationship between years of play and risk for later-life neurological disorder. “The best way to mitigate risk is to limit cumulative exposure,” he said.

The study by Li and colleagues was funded by grant funding from the Department of Veterans Affairs Rehabilitation Research and Development Service and the University of Washington Friends of Alzheimer’s Research. Other sources of funding to individual researchers are listed in the original paper. The study by Braun and colleagues was supported by the National Heart, Lung and Blood Institute; the Department of Veterans Affairs Rehabilitation Research and Development Service; and the National Institute on Aging. The white paper included studies that received funding from numerous sources, including the National Institutes of Health and the DOD. Dr. Iliff serves as the chair of the Scientific Advisory Board for Applied Cognition Inc., from which he receives compensation and in which he holds an equity stake. In the last year, he served as a paid consultant to Gryphon Biosciences. Dr. Peskind has served as a paid consultant to the companies Genentech, Roche, and Alpha Cognition. Dr. Alosco was supported by grant funding from the NIH; he received research support from Rainwater Charitable Foundation Inc., and Life Molecular Imaging Inc.; he has received a single honorarium from the Michael J. Fox Foundation for services unrelated to this editorial; and he received royalties from Oxford University Press Inc. The other authors’ disclosures are listed in the original papers.
 

A version of this article appeared on Medscape.com.


Vision Impairment Tied to Higher Dementia Risk in Older Adults

Article Type
Changed
Mon, 06/24/2024 - 11:40

 

TOPLINE:

Poor vision and contrast sensitivity at baseline increase the likelihood of incident dementia in older adults; a decline in contrast sensitivity over time also correlates with the risk of developing dementia.

METHODOLOGY:

  • Researchers conducted a longitudinal study to analyze the association of visual function with the risk for dementia in 2159 men and women (mean age, 77.9 years; 54% women) drawn from the National Health and Aging Trends Study between 2021 and 2022.
  • All participants were free from dementia at baseline and underwent visual assessment while wearing their usual glasses or contact lenses.
  • Distance and near visual acuity were measured in log minimum angle of resolution (logMAR) units, where higher values indicated worse visual acuity; contrast sensitivity was measured in log contrast sensitivity (logCS) units, where lower values indicated worse performance (a brief worked example of these units follows this list).
  • Dementia status was determined by a medical diagnosis, a dementia score of 2 or more, or poor performance on cognitive testing.
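
For readers less familiar with these units, here is a brief worked example (ours, not the study’s): logMAR is the base-10 logarithm of the minimum angle of resolution, so Snellen 20/20 corresponds to logMAR 0.0, 20/40 to roughly 0.30, and each 0.1-logMAR step is about one line of acuity on a standard chart; logCS is the base-10 logarithm of the reciprocal of the lowest contrast a person can detect, so higher values are better.

    import math

    # Illustrative conversions only; function names are ours, not from the study.
    def snellen_to_logmar(numerator: float, denominator: float) -> float:
        # logMAR = log10(minimum angle of resolution), where MAR = denominator / numerator
        return math.log10(denominator / numerator)

    def log_contrast_sensitivity(lowest_contrast_detected: float) -> float:
        # logCS = log10(1 / contrast threshold); higher values mean better contrast sensitivity
        return math.log10(1.0 / lowest_contrast_detected)

    print(round(snellen_to_logmar(20, 20), 2))       # 0.0 -> 20/20 acuity
    print(round(snellen_to_logmar(20, 40), 2))       # 0.3 -> 20/40 acuity, three 0.1-logMAR steps worse
    print(round(log_contrast_sensitivity(0.01), 2))  # 2.0 -> can detect letters at 1% contrast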

TAKEAWAY:

  • Over the 1-year follow-up period, 192 adults (6.6%) developed dementia.
  • Worsening of distant and near vision by 0.1 logMAR increased the risk for dementia by 8% (P = .01) and 7% (P = .02), respectively (see the compounding example after this list).
  • Each 0.1 logCS decline in baseline contrast sensitivity increased the risk for dementia by 9% (P = .003).
  • A yearly decline in contrast sensitivity by 0.1 logCS increased the likelihood of dementia by 14% (P = .007).
  • Changes in distant and near vision over time did not show a significant association with risk for dementia (P = .58 and P = .79, respectively).
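
One way to read the per-0.1-unit figures above is cumulatively. Assuming the estimates scale multiplicatively across 0.1-unit steps, as they would in a proportional hazards model (our assumption; the study reports only the per-unit figures), a 0.3-logMAR loss of distance acuity corresponds to roughly a 26% higher risk, and a 0.2-logCS loss of contrast sensitivity to roughly a 19% higher risk.

    # Rough compounding illustration, not an analysis from the paper.
    def compounded_risk(per_step_increase: float, steps: int) -> float:
        # e.g., per_step_increase=0.08 for an 8% increase per 0.1 logMAR
        return (1.0 + per_step_increase) ** steps

    print(round(compounded_risk(0.08, 3), 2))  # 1.26 -> ~26% higher risk at 0.3 logMAR worse distance acuity
    print(round(compounded_risk(0.09, 2), 2))  # 1.19 -> ~19% higher risk at 0.2 logCS worse contrast sensitivity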

IN PRACTICE:

“Visual function, especially contrast sensitivity, might be a risk factor for developing dementia,” the authors wrote. “Early vision screening may help identify adults at higher risk of dementia, allowing for timely interventions.”

SOURCE:

The study was led by Louay Almidani, MD, MSc, of the Wilmer Eye Institute at the Johns Hopkins University School of Medicine, in Baltimore, and was published online in the American Journal of Ophthalmology.

LIMITATIONS:

The study had a limited follow-up period of 1 year and may not have captured the long-term association between visual impairment and the risk for dementia. Moreover, the researchers did not consider other visual function measures such as depth perception and visual field, which might have affected the results.

DISCLOSURES:

The study did not have any funding source. The authors declared no conflicts of interest.
 

A version of this article appeared on Medscape.com.


A New Psychotherapeutic ‘Gold Standard’ for Chronic Pain?

Article Type
Changed
Thu, 06/20/2024 - 11:39

A single course of treatment with emotional awareness and expression therapy (EAET) was associated with a significantly greater reduction in chronic pain severity than cognitive-behavioral therapy (CBT), the current psychotherapeutic gold standard, a new study suggested.

Two thirds of the patients who received EAET reported at least a 30% reduction in pain compared with 17% of those who received CBT. The randomized clinical trial also showed that individuals with depression and anxiety responded more favorably to EAET, a novel finding.

The study is one of only a few to directly compare EAET with CBT.

“Most people with chronic pain don’t consider psychotherapy at all,” said study investigator Brandon C. Yarns, MD, a staff psychiatrist at the VA Greater Los Angeles Healthcare System, and clinical professor of health sciences at the Department of Psychiatry and Biobehavioral Sciences, UCLA Health.

Although patients were allowed to continue medication for pain and other comorbidities during the study, those who received EAET “had larger improvements in pain, depression, and anxiety,” Dr. Yarns said. “That suggests that the effect was due to the EAET.”

The findings were published online in JAMA Network Open.
 

‘Gold Standard’

EAET was first used in the early 2010s. In the therapy, patients are asked to recall a difficult or traumatic memory, fully experience how the related emotions feel in the body, express those feelings in words, and then release or let them go. They are taught that the brain’s perception of pain is strongly influenced by the evasion of grief, fear, rage, or guilt, Dr. Yarns said.

This contrasts with CBT — considered the current gold standard for chronic pain — which teaches patients to improve their ability to tolerate pain through guided imagery, muscle relaxation, and other exercises and to change how they think about pain.

Although prior studies suggested EAET is effective in reducing pain in fibromyalgia and chronic musculoskeletal, pelvic, and head pain, most included primarily younger, female patients.

The research is the “first full-scale evaluation of EAET, to our knowledge, in a medically or psychiatrically complex, racially and ethnically diverse, older sample comprising predominantly men,” investigators wrote.

The trial enrolled 126 veterans (92% men; 55% Black or African American) aged 60-95 years with at least 3 months of musculoskeletal pain. More than two thirds of patients had a psychiatric diagnosis, with about one third having posttraumatic stress disorder (PTSD). Almost all had back pain, and many had pain in multiple locations.

All services were delivered in-person at the US Department of Veterans Affairs Greater Los Angeles Healthcare System, Los Angeles. Half underwent CBT, while the other half received EAET.

Each patient had one 90-minute individual session and eight additional 90-minute group sessions.

Patients were asked to rate their pain using a 0-10 scale in the Brief Pain Inventory (BPI) before starting treatment, at the end of the nine sessions (at week 10), and 6 months after the sessions ended. Baseline BPI score for both groups was a mean of around 6.

After treatment, the EAET group had a mean 2-point reduction on the BPI scale versus a 0.6-point reduction in the CBT group. A clinically significant reduction in pain — defined as a ≥ 30% decrease — was reported in 63% of EAET patients versus 17% of CBT patients (odds ratio [OR], 21.54; P < .001).

At 6 months, the mean reduction was 1.2 for the EAET group compared with 0.25 for the CBT group, and 40% of the EAET group reported a clinically significant reduction in pain.

A little more than a third (35%) of veterans receiving EAET reported at least a 50% reduction in pain at 10 weeks compared with 7% of those receiving CBT. At 6 months, 16% of the EAET arm reported a halving of their pain.
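
To make these responder definitions concrete (our arithmetic, not the study’s analysis): on the 0-10 BPI, a 30% reduction from the groups’ mean baseline of about 6 is a drop of roughly 1.8 points, and a 50% reduction is a drop of 3 points. The crude odds ratio implied by the 63% versus 17% responder rates is about 8.3; the published OR of 21.54 comes from the study’s statistical model, so it does not match this unadjusted figure.

    # Illustrative arithmetic derived from the figures quoted above, not from the paper's model.
    def responder_score(baseline: float, fraction: float) -> float:
        # Pain score at or below which a patient has improved by the given fraction from baseline
        return baseline * (1.0 - fraction)

    def crude_odds_ratio(p1: float, p2: float) -> float:
        # Unadjusted odds ratio comparing two response proportions
        return (p1 / (1.0 - p1)) / (p2 / (1.0 - p2))

    print(round(responder_score(6.0, 0.30), 1))    # 4.2 -> at or below this, a >=30% responder
    print(round(responder_score(6.0, 0.50), 1))    # 3.0 -> at or below this, a >=50% responder
    print(round(crude_odds_ratio(0.63, 0.17), 1))  # 8.3, versus the model-based OR of 21.54 reported in the study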

EAET was also superior to CBT in reducing anxiety, depression, and PTSD symptoms at the 10-week mark.
 

 

 

More Work Needed

In an accompanying editorial, Matthias Karst, MD, PhD, a clinician with the Pain Clinic, Hannover Medical School, in Hannover, Germany, noted that EAET’s effects “are significantly superior to those of CBT in almost all dimensions, even after 6 months.”

EAET “assigns a special place to the integration of the body into the emotional experience,” he wrote.

The study demonstrated that “the evocation and expression of emotions is superior to the mere cognitive discussion of these emotions in therapy of patients with chronic pain.”

Commenting on the findings, Traci J. Speed, MD, PhD, assistant professor of psychiatry and behavioral sciences and an attending psychiatrist of the Johns Hopkins Pain Treatment Program at Johns Hopkins University, Baltimore, called the study “ground-breaking” because it showed effectiveness in people with high rates of PTSD, anxiety, and depression.

“It is a little bit surprising how impressive the study outcomes are in terms of maintaining the effects at the end of the treatment and sustaining some of the effects on pain sensitivity even at the 6-month follow-up,” said Dr. Speed, who was not part of the study.

However, she continued, “I don’t think it changes the current standard of practice yet. CBT has decades of research and evidence that it is effective for chronic pain and that will I think continue to be the standard of care.”

Although EAET is in its infancy, chronic pain experts are interested in learning more about the therapy, Dr. Speed added.

“It blends well with the current techniques and extends the current gold standard treatment approaches,” she said. “We are starting to really appreciate the role that emotions play in pain sensitivity.”

Both Dr. Karst and Dr. Speed noted that more study is needed to determine the sustainability of treatment effects.

Dr. Yarns agreed. “We need more research on what the appropriate dose is and perhaps how one might go about personalizing that for the patient,” he said.

The study was funded by a career development award to Dr. Yarns from the VA Clinical Science Research and Development Service. Dr. Yarns reported receiving grants from the US Department of Veterans Affairs during the study. Other authors’ disclosures are in the original article. Dr. Speed reported no conflicts.
 

A version of this article appeared on Medscape.com.


Intensive Lifestyle Changes May Counter Early Alzheimer’s Symptoms

Article Type
Changed
Wed, 06/19/2024 - 13:37

An intensive lifestyle intervention significantly improved cognition and function in many patients with mild cognitive impairment (MCI) or early dementia due to Alzheimer’s disease, in what the authors said is the first randomized controlled trial of intensive lifestyle modification for patients diagnosed with Alzheimer’s disease. Results could help physicians address patients at risk of Alzheimer’s disease who reject relevant testing because they believe nothing can forestall development of the disease, the authors added. The study was published online in Alzheimer’s Research & Therapy.

Although technology allows probable Alzheimer’s disease diagnosis years before clinical symptoms appear, wrote investigators led by Dean Ornish, MD, of the Preventive Medicine Research Institute in Sausalito, California, “many people do not want to know if they are likely to get Alzheimer’s disease if they do not believe they can do anything about it. If intensive lifestyle changes may cause improvement in cognition and function in MCI or early dementia due to Alzheimer’s disease, then it is reasonable to think that these lifestyle changes may also help to prevent MCI or early dementia due to Alzheimer’s disease.” As with cardiovascular disease, the authors added, preventing Alzheimer’s disease might require less intensive lifestyle modifications than treating it.
 

Study Methodology

Investigators randomized 26 patients with Montréal Cognitive Assessment scores of 18 or higher to an intensive intervention involving nutrition, exercise, and stress management techniques. To improve adherence, the protocol included participants’ spouses or caregivers.
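
For orientation only (these bands are conventional reference points, not criteria stated by the investigators): the Montréal Cognitive Assessment (MoCA) is scored from 0 to 30, with 26 or above generally considered normal, so the trial’s floor of 18 targets mild impairment rather than moderate-to-severe dementia.

    # Conventional MoCA orientation bands; the trial's only stated criterion was a score of 18 or higher.
    def moca_band(score: int) -> str:
        if not 0 <= score <= 30:
            raise ValueError("MoCA scores range from 0 to 30")
        if score >= 26:
            return "generally considered normal"
        if score >= 18:
            return "mild impairment range (eligible for this trial)"
        return "below the trial's inclusion threshold"

    print(moca_band(27))  # generally considered normal
    print(moca_band(20))  # mild impairment range (eligible for this trial)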

Two patients, both in the treatment group, withdrew over logistical concerns.

After 20 weeks, treated patients exhibited statistically significant differences in several key measures versus a 25-patient usual-care control group. Scores that improved in the intervention group and worsened among controls included the following:

  • Clinical Global Impression of Change (CGIC, P = .001)
  • Clinical Dementia Rating-Global (CDR-Global, -0.04, P = .037)
  • Clinical Dementia Rating Sum of Boxes (CDR-SB, +0.08, P = .032)
  • Alzheimer’s Disease Assessment Scale (ADAS-Cog, -1.01, P = .053)

The validity of these changes in cognition and function, and possible biological mechanisms of improvement, were supported by statistically significant improvements in several clinically relevant biomarkers versus controls, the investigators wrote. These biomarkers included Abeta42/40 ratio, HbA1c, insulin, and glycoprotein acetylation. “This information may also help in predicting which patients are more likely to show improvements in cognition and function by making these intensive lifestyle changes,” the authors added.

In the primary analysis, the degree of lifestyle change required to stop progression of MCI ranged from 71.4% (ADAS-Cog) to 120.6% (CDR-SB). “This helps to explain why other studies of less intensive lifestyle interventions may not have been sufficient to stop deterioration or improve cognition and function,” the authors wrote. Moreover, they added, variable adherence might explain why, in the intervention group, 10 patients improved their CGIC scores while the rest remained static or worsened.
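
These percentages can be read as the adherence level at which predicted decline reaches zero. Below is a minimal sketch of that calculation, assuming a simple linear fit of outcome change against percent lifestyle change (our simplification; the study’s exact modeling may differ, and the numbers used are hypothetical).

    # Sketch of the 'adherence needed to halt decline' idea under a simple linear model.
    # predicted_change = intercept + slope * adherence; decline stops where predicted_change = 0.
    def adherence_to_halt_decline(intercept: float, slope: float) -> float:
        # Assumes positive change = worsening and slope < 0 (more adherence, less worsening)
        if slope >= 0:
            raise ValueError("expects a negative slope")
        return -intercept / slope

    # Hypothetical coefficients, chosen only to show the calculation:
    print(adherence_to_halt_decline(intercept=1.2, slope=-0.012))  # 100.0 -> ~100% adherence needed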
 

Caveats

Alzheimer’s Association Vice President of Medical and Scientific Relations Heather M. Snyder, PhD, said, “This is an interesting paper in an important area of research and adds to the growing body of literature on how behavior or lifestyle may be related to cognitive decline. However, because this is a small phase 2 study, it is important for this or similar work to be done in larger, more diverse populations and over a longer duration of the intervention.” She was not involved with the study but was asked to comment.

Investigators chose the 20-week duration, they explained, because control-group patients likely would not refrain from trying the lifestyle intervention beyond that timeframe. Perhaps more importantly, challenges created by the COVID-19 pandemic required researchers to cut planned enrollment in half, eliminate planned MRI and amyloid PET scans, and reduce the number of cognition and function tests.

Such shortcomings limit what neurologists can glean and generalize from the study, said Dr. Snyder. “That said,” she added, “it does demonstrate the potential of an intensive behavior/lifestyle intervention, and the importance of this sort of research in Alzheimer’s and dementia.” Although the complexity of the interventions makes these studies challenging, she added, “it is important that we continue to advance larger, longer studies in more representative study populations to develop specific recommendations.”
 

Further Study

The Alzheimer’s Association’s U.S. POINTER study is the first large-scale study in the United States to explore the impact of comprehensive lifestyle changes on cognitive health. About 2000 older adults at risk for cognitive decline are participating, from diverse locations across the country. More than 25% of participants come from groups typically underrepresented in dementia research, said Dr. Snyder. Initial results are expected in summer 2025.

Future research also should explore reasons (beyond adherence) why some patients respond to lifestyle interventions better than others, and the potential synergy of lifestyle changes with drug therapies, wrote Dr. Ornish and colleagues.

“For now,” said Dr. Snyder, “there is an opportunity for providers to incorporate or expand messaging with their patients and families about the habits that they can incorporate into their daily lives. The Alzheimer’s Association offers 10 Healthy Habits for Your Brain — everyday actions that can make a difference for your brain health.”

Investigators received study funding from more than two dozen charitable foundations and other organizations. Dr. Snyder is a full-time employee of the Alzheimer’s Association and in this role, serves on the leadership team of the U.S. POINTER study. Her partner works for Abbott in an unrelated field. 


An intensive lifestyle intervention significantly improved cognition and function in many patients with mild cognitive impairment (MCI) or early dementia due to Alzheimer’s disease, in what authors said is the first randomized controlled trial of intensive lifestyle modification for patients diagnosed with Alzheimer’s disease. Results could help physicians address patients at risk of Alzheimer’s disease who reject relevant testing because they believe nothing can forestall development of the disease, the authors added. The study was published online in Alzheimer’s Research & Therapy.

Although technology allows probable Alzheimer’s disease diagnosis years before clinical symptoms appear, wrote investigators led by Dean Ornish, MD, of the Preventive Medicine Research Institute in Sausalito, California, “many people do not want to know if they are likely to get Alzheimer’s disease if they do not believe they can do anything about it. If intensive lifestyle changes may cause improvement in cognition and function in MCI or early dementia due to Alzheimer’s disease, then it is reasonable to think that these lifestyle changes may also help to prevent MCI or early dementia due to Alzheimer’s disease.” As with cardiovascular disease, the authors added, preventing Alzheimer’s disease might require less intensive lifestyle modifications than treating it.
 

Study Methodology

Investigators randomized 26 patients with Montreal Cognitive Assessment (MoCA) scores of 18 or higher to an intensive intervention involving nutrition, exercise, and stress management techniques, and 25 to usual care. To improve adherence, the protocol included participants’ spouses or caregivers.

Two patients, both in the treatment group, withdrew over logistical concerns.

After 20 weeks, treated patients exhibited statistically significant differences in several key measures versus the usual-care control group. Scores that improved in the intervention group and worsened among controls included the following:

  • Clinical Global Impression of Change (CGIC, P = .001)
  • Clinical Dementia Rating-Global (CDR-Global, -0.04, P = .037)
  • Clinical Dementia Rating Sum of Boxes (CDR-SB, +0.08, P = .032)
  • Alzheimer’s Disease Assessment Scale (ADAS-Cog, -1.01, P = .053)

The validity of these changes in cognition and function, and possible biological mechanisms of improvement, were supported by statistically significant improvements in several clinically relevant biomarkers versus controls, the investigators wrote. These biomarkers included Abeta42/40 ratio, HbA1c, insulin, and glycoprotein acetylation. “This information may also help in predicting which patients are more likely to show improvements in cognition and function by making these intensive lifestyle changes,” the authors added.

In the primary analysis, the degree of lifestyle change required to stop progression of MCI ranged from 71.4% (ADAS-Cog) to 120.6% (CDR-SB). “This helps to explain why other studies of less intensive lifestyle interventions may not have been sufficient to stop deterioration or improve cognition and function,” the authors wrote. Moreover, they added, variable adherence might explain why, in the intervention group, 10 patients improved their CGIC scores while the rest remained stable or worsened.
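The threshold arithmetic is not spelled out in this article, but one common way to derive such a figure is to regress each outcome’s change score on participants’ percent adherence and solve for the adherence level at which the predicted change is zero. The sketch below is a minimal illustration of that calculation only; the adherence values, outcome changes, and the linear model itself are assumptions for illustration, not the study’s data or method.

```python
# Illustrative sketch only -- not the authors' code or data.
# Assumption: the "degree of lifestyle change required to stop progression"
# comes from a simple linear fit of outcome change vs. percent adherence,
# solved for the adherence level at which the predicted change equals zero.
import numpy as np

# Hypothetical values: percent adherence and corresponding ADAS-Cog change
# (positive change = worsening).
adherence = np.array([40.0, 55.0, 65.0, 80.0, 95.0])
outcome_change = np.array([1.2, 0.6, 0.1, -0.4, -0.9])

# Least-squares line: outcome_change ~ slope * adherence + intercept
slope, intercept = np.polyfit(adherence, outcome_change, 1)

# Adherence at which the fitted change is zero, i.e., progression halts
required_adherence = -intercept / slope
print(f"Estimated adherence needed to halt progression: {required_adherence:.1f}%")
```

With these made-up numbers the threshold comes out to roughly 70%; the study’s reported 71.4%-120.6% range would reflect the same kind of calculation applied to its actual adherence and outcome data, if this is indeed how the authors derived it.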
 

Caveats

Alzheimer’s Association Vice President of Medical and Scientific Relations Heather M. Snyder, PhD, said, “This is an interesting paper in an important area of research and adds to the growing body of literature on how behavior or lifestyle may be related to cognitive decline. However, because this is a small phase 2 study, it is important for this or similar work to be done in larger, more diverse populations and over a longer duration of the intervention.” She was not involved with the study but was asked to comment.

Investigators chose the 20-week duration, they explained, because control-group patients likely would not refrain from trying the lifestyle intervention beyond that timeframe. Perhaps more importantly, challenges created by the COVID-19 pandemic required researchers to cut planned enrollment in half, eliminate planned MRI and amyloid PET scans, and reduce the number of cognition and function tests.

Such shortcomings limit what neurologists can glean and generalize from the study, said Dr. Snyder. “That said,” she added, “it does demonstrate the potential of an intensive behavior/lifestyle intervention, and the importance of this sort of research in Alzheimer’s and dementia.” Although the complexity of the interventions makes these studies challenging, she added, “it is important that we continue to advance larger, longer studies in more representative study populations to develop specific recommendations.”
 

Further Study

The Alzheimer’s Association’s U.S. POINTER study is the first large-scale study in the United States to explore the impact of comprehensive lifestyle changes on cognitive health. About 2000 older adults at risk for cognitive decline are participating at diverse locations across the country, and more than 25% of participants come from groups typically underrepresented in dementia research, said Dr. Snyder. Initial results are expected in summer 2025.

Future research also should explore reasons (beyond adherence) why some patients respond to lifestyle interventions better than others, and the potential synergy of lifestyle changes with drug therapies, wrote Dr. Ornish and colleagues.

“For now,” said Dr. Snyder, “there is an opportunity for providers to incorporate or expand messaging with their patients and families about the habits that they can incorporate into their daily lives. The Alzheimer’s Association offers 10 Healthy Habits for Your Brain — everyday actions that can make a difference for your brain health.”

Investigators received study funding from more than two dozen charitable foundations and other organizations. Dr. Snyder is a full-time employee of the Alzheimer’s Association and in this role, serves on the leadership team of the U.S. POINTER study. Her partner works for Abbott in an unrelated field. 

FROM ALZHEIMER’S RESEARCH & THERAPY