Living in a disadvantaged neighborhood is associated with accelerated brain aging and a higher risk for early dementia, regardless of income level or education, new research suggested.
“If you want to prevent dementia and you’re not asking someone about their neighborhood, you’re missing information that’s important to know,” lead author Aaron Reuben, PhD, postdoctoral scholar in neuropsychology and environmental health at Duke University, Durham, North Carolina, said in a news release.
The study was published online in Alzheimer’s & Dementia.
Higher Risk in Men
Few interventions exist to halt or delay the progression of Alzheimer’s disease and related dementias (ADRD), a gap that has increasingly shifted the field’s focus toward primary prevention.
Although previous research pointed to a link between socioeconomically disadvantaged neighborhoods and a greater risk for cognitive deficits, mild cognitive impairment, dementia, and poor brain health, the timeline for the emergence of that risk was unknown.
To fill in the gaps, investigators analyzed data on 1.4 million New Zealand residents, dividing neighborhoods into quintiles based on level of disadvantage (assessed with the New Zealand Index of Deprivation) to determine whether dementia diagnoses followed neighborhood socioeconomic gradients.
After adjusting for covariates, they found that overall, those living in disadvantaged areas were slightly more likely to develop dementia across the 20-year study period (adjusted hazard ratio [HR], 1.09; 95% CI, 1.08-1.10).
The more disadvantaged the neighborhood, the higher the dementia risk: those in the most disadvantaged quintile had a 43% higher risk for ADRD than those in the least disadvantaged quintile (HR, 1.43; 95% CI, 1.36-1.49).
The effect was larger in men than in women and in younger vs older individuals: compared with the oldest age group, the youngest showed 21% greater risk in women and 26% greater risk in men.
Dementia Prevention Starts Early
Researchers then turned to the Dunedin Study, a cohort of 938 New Zealanders (50% female) followed from birth to age 45 to track their psychological, social, and physiological health with brain scans, memory tests, and cognitive self-assessments.
The analysis suggested that by age 45, those living in more disadvantaged neighborhoods across adulthood had accumulated a significantly greater number of midlife risk factors for later ADRD.
They also had worse structural brain integrity: each standard deviation increase in neighborhood disadvantage was associated with a thinner cortex, greater white matter hyperintensity volume, and older brain age.
Those living in poorer areas had lower cognitive test scores, reported more issues with everyday cognitive function, and showed a greater decline in IQ from childhood to midlife. Brain scans also showed that residents of the most disadvantaged areas had a mean brain age 2.98 years older than that of those living in the least disadvantaged areas (P = .001).
Limitations included the study’s observational design, which could not establish causation, and the fact that the researchers did not have access to individual-level socioeconomic information for the entire population. Additionally, brain-integrity measures in the Dunedin Study were largely cross-sectional.
“If you want to truly prevent dementia, you’ve got to start early because 20 years before anyone will get a diagnosis, we’re seeing dementia’s emergence,” Dr. Reuben said. “And it could be even earlier.”
Funding for the study was provided by the National Institutes of Health; UK Medical Research Council; Health Research Council of New Zealand; Brain Research New Zealand; New Zealand Ministry of Business, Innovation, & Employment; and the Duke University and University of North Carolina Alzheimer’s Disease Research Center. The authors declared no relevant financial relationships.
A version of this article appeared on Medscape.com.