Disadvantaged Neighborhoods Tied to Higher Dementia Risk, Brain Aging

Living in a disadvantaged neighborhood is associated with accelerated brain aging and a higher risk for early dementia, regardless of income level or education, new research suggested.

Analysis of two datasets revealed that people living in the most disadvantaged neighborhoods had a more than 20% higher risk for dementia than those in other areas and measurably poorer brain health as early as age 45, regardless of their own personal income and education.

“If you want to prevent dementia and you’re not asking someone about their neighborhood, you’re missing information that’s important to know,” lead author Aaron Reuben, PhD, postdoctoral scholar in neuropsychology and environmental health at Duke University, Durham, North Carolina, said in a news release.

The study was published online in Alzheimer’s & Dementia.

Higher Risk in Men

Few interventions exist to halt or delay the progression of Alzheimer’s disease and related dementias (ADRD), which has increasingly led to a focus on primary prevention.

Although previous research pointed to a link between socioeconomically disadvantaged neighborhoods and a greater risk for cognitive deficits, mild cognitive impairment, dementia, and poor brain health, the timeline for the emergence of that risk was unknown.

To fill in the gaps, investigators studied data on 1.4 million New Zealand residents, dividing neighborhoods into quintiles based on level of disadvantage (assessed by the New Zealand Index of Deprivation) to see whether dementia diagnoses followed neighborhood socioeconomic gradients.

After adjusting for covariates, they found that overall, those living in disadvantaged areas were slightly more likely to develop dementia across the 20-year study period (adjusted hazard ratio [HR], 1.09; 95% CI, 1.08-1.10).

The more disadvantaged the neighborhood, the higher the dementia risk, with a 43% higher risk for ADRD among those in the highest quintile than among those in the lowest quintile (HR, 1.43; 95% CI, 1.36-1.49).

The effect was larger in men than in women and in younger than in older individuals; in the youngest age group, risk was 21% greater among women and 26% greater among men compared with the oldest age group.

Dementia Prevention Starts Early

Researchers then turned to the Dunedin Study, a cohort of 938 New Zealanders (50% female) followed from birth to age 45 to track their psychological, social, and physiological health with brain scans, memory tests, and cognitive self-assessments.

The analysis suggested that by age 45, those living in more disadvantaged neighborhoods across adulthood had accumulated a significantly greater number of midlife risk factors for later ADRD.

They also had worse structural brain integrity; each standard deviation increase in neighborhood disadvantage was associated with a thinner cortex, a greater volume of white matter hyperintensities, and an older brain age.

Those living in poorer areas had lower cognitive test scores, reported more issues with everyday cognitive function, and showed a greater reduction in IQ from childhood to midlife. Analysis of brain scans also revealed that their mean brain age was 2.98 years older than that of those living in the least disadvantaged areas (P = .001).

Limitations included the study’s observational design, which could not establish causation, and the fact that the researchers did not have access to individual-level socioeconomic information for the entire population. Additionally, brain-integrity measures in the Dunedin Study were largely cross-sectional.

“If you want to truly prevent dementia, you’ve got to start early because 20 years before anyone will get a diagnosis, we’re seeing dementia’s emergence,” Dr. Reuben said. “And it could be even earlier.”

Funding for the study was provided by the National Institutes of Health; UK Medical Research Council; Health Research Council of New Zealand; Brain Research New Zealand; New Zealand Ministry of Business, Innovation, & Employment; and the Duke University and the University of North Carolina Alzheimer’s Disease Research Center. The authors declared no relevant financial relationships.

A version of this article appeared on Medscape.com.

Upfront Low-Dose Radiation Improves Advanced SCLC Outcomes

Adding low-dose radiation to the current standard first-line treatment, durvalumab plus etoposide-platinum chemotherapy, appears to improve survival outcomes in patients with extensive-stage small cell lung cancer (SCLC), suggested new findings from a small, single-arm study.

The analysis, presented at the 2024 European Lung Cancer Congress, revealed that adding low-dose radiation improved patients’ median progression-free and overall survival compared with the standard first-line treatment outcomes reported in a 2019 trial, said lead author Yan Zhang, MD.

The standard first-line treatment results came from the 2019 CASPIAN trial, which found that patients receiving the first-line regimen had a median progression-free survival of 5 months and a median overall survival of 13 months, with 54% of patients alive at 1 year.

The latest data, which included a small cohort of 30 patients, revealed that adding low-dose radiation to the standard first-line therapy led to a higher median progression-free survival of 8.3 months and extended median overall survival beyond the study follow-up period of 17.3 months. Overall, 66% of patients were alive at 1 year.

These are “promising” improvements over CASPIAN, Dr. Zhang, a lung cancer medical oncologist at Sichuan University, Chengdu, China, said at the Congress, which was organized by the European Society for Medical Oncology.

Study discussant Gerry Hanna, PhD, MBBS, a radiation oncologist at Belfast City Hospital, Belfast, Northern Ireland, agreed. Although there were just 30 patients, “you cannot deny these are [strong] results in terms of extensive-stage small cell cancer,” Dr. Hanna said.

Although standard first-line treatment of extensive-stage SCLC is durvalumab plus etoposide-platinum chemotherapy, the benefits aren’t durable for many patients.

This problem led Dr. Zhang and his colleagues to look for ways to improve outcomes. Because the CASPIAN trial did not include radiation to the primary tumor, it seemed a logical strategy to explore.

In the current single-arm study, Dr. Zhang and his team added 15 Gy radiation in five fractions to the primary lung tumors of 30 patients during the first cycle of durvalumab plus etoposide-platinum.

Subjects received 1500 mg of durvalumab plus etoposide-platinum every 3 weeks for four cycles. Low-dose radiation to the primary tumor was delivered over 5 days at the start of treatment. Patients then continued with durvalumab maintenance every 4 weeks until progression or intolerable toxicity.

Six patients (20%) had liver metastases at baseline, and three (10%) had brain metastases. Over half received prophylactic cranial radiation. Performance scores were 0-1, and all but one of the participants were men.

Six- and 12-month progression-free survival rates were 57% and 40%, respectively. Overall survival was 90% at 6 months and 66% at 12 months. Median overall survival was 13 months in the CASPIAN trial but not reached in Dr. Zhang’s trial after a median follow-up of 17.3 months, with the earliest deaths occurring at 10.8 months.

Grade 3 treatment-related adverse events occurred in 80% of patients, most frequently hematologic toxicities. Five patients (16.7%) had severe adverse reactions to radiation. Although the overall dose of radiation was low, the individual fractions, at 3 Gy each, were on the large side.

Dr. Hanna wanted more information on the radiotoxicity issue, but even so, he said that adding low-dose radiation to the durvalumab-chemotherapy regimen warrants further investigation.

Both Dr. Hanna and Dr. Zhang suggested that the greatest benefit of upfront radiation, and of the peritumoral inflammation it causes, lies not in killing cancer cells directly but in augmenting durvalumab’s effect.

Overall, Dr. Hanna stressed that results like these have not been seen before in an SCLC study, particularly for novel agents, let alone radiation.

The study was funded by AstraZeneca, maker of durvalumab. Dr. Zhang and Dr. Hanna didn’t have any relevant disclosures.

A version of this article appeared on Medscape.com.

Vitamin D Deficiency May Be Linked to Peripheral Neuropathy

TOPLINE:

Vitamin D deficiency is independently linked to the risk for diabetic peripheral neuropathy (DPN) by potentially affecting large nerve fibers in older patients with type 2 diabetes (T2D).

METHODOLOGY:

  • Although previous research has shown that vitamin D deficiency is common in patients with diabetes and may increase the risk for peripheral neuropathy, its effects on large and small nerve fiber lesions have not been well explored yet.
  • Researchers conducted a cross-sectional study to understand the association between vitamin D deficiency and DPN development in 230 older patients (mean age, 67 years) with T2D for about 15 years who were recruited from Beijing Hospital between 2020 and 2023.
  • All patients were evaluated for DPN on the basis of poor blood sugar control or symptoms such as pain and sensory abnormalities; the 175 patients diagnosed with DPN were propensity-matched with the 55 patients without DPN.
  • Vitamin D deficiency, defined as serum 25-hydroxyvitamin D circulating levels below 20 ng/mL, was reported in 169 patients.
  • Large nerve fiber lesions were evaluated using electromyography, and small nerve fiber lesions were assessed by measuring skin conductance.

TAKEAWAY:

  • Vitamin D deficiency was more likely to affect large nerve fibers, as suggested by longer median sensory nerve latency, minimum F-wave latency, and median nerve motor evoked potential latency in the vitamin D–deficient group than in the vitamin D–sufficient group.
  • Furthermore, vitamin D deficiency was linked to large fiber neuropathy with increased odds of prolongation of motor nerve latency (odds ratio, 1.362; P = .038).
  • The electrochemical skin conductance, which indicates damage to small nerve fibers, was comparable between patients with and without vitamin D deficiency.

IN PRACTICE:

This study is too preliminary to have practice application.

SOURCE:

This study was led by Sijia Fei, Department of Endocrinology, Beijing Hospital, Beijing, People’s Republic of China, and was published online in Diabetes Research and Clinical Practice.

LIMITATIONS:

Skin biopsy, the “gold standard” for quantifying intraepidermal nerve fiber density, was not used to assess small nerve fiber lesions. Additionally, a causal link between vitamin D deficiency and diabetic nerve damage was not established owing to the cross-sectional nature of the study. Some patients with T2D may have been receiving insulin therapy, which may have affected vitamin D levels.

DISCLOSURES:

The study was supported by grants from the National Natural Science Foundation of China and China National Key R&D Program. The authors declared no conflicts of interest.

A version of this article appeared on Medscape.com.

New Guidelines: Start PSA Screening Earlier in Black Men

Lowering the recommended age for baseline prostate-specific antigen (PSA) testing would reduce prostate cancer deaths by about 30% in Black men without significantly increasing the rate of overdiagnosis, according to new screening guidelines from the Prostate Cancer Foundation.

Specifically, baseline PSA testing in Black men should begin at age 40-45, sooner than current guidelines recommend, and should be followed by regular screening intervals, preferably annually, at least until age 70, a multidisciplinary panel of experts and patient advocates determined based on a comprehensive literature review.

The panel’s findings were presented in a poster at the ASCO Genitourinary Cancers Symposium.

“Black men in the United States are considered a high-risk population for being diagnosed with and dying from prostate cancer,” wrote lead author Isla Garraway, MD, PhD, of the University of California, Los Angeles, and colleagues. Specifically, Black men are about two times more likely to be diagnosed with and die from prostate cancer than White men. But, the authors continued, “few guidelines have outlined specific recommendations for PSA-based prostate cancer screening among Black men.”

The US Preventive Services Task Force recommendations, which are currently being updated, set the PSA screening start age at 55. The task force recommendations, which dictate insurance coverage in the United States, acknowledged “a potential mortality benefit for African American men when beginning screening before age 55 years” but did not explicitly recommend screening earlier.

Current guidelines from the American Cancer Society call for discussions about screening in average-risk men to begin at age 50-55. The recommendations do specify lowering the age to 45 for those at a high risk for prostate cancer, which includes Black men as well as those with a first-degree relative diagnosed with prostate cancer before age 65. In some cases, screening can begin at age 40 in the highest risk men — those with more than one first-degree relative who had prostate cancer at a young age.

The Prostate Cancer Foundation “wanted to address the confusion around different guideline statements and the lack of clarity around screening recommendations for Black men,” said William K. Oh, MD, of The Tisch Cancer Institute, Icahn School of Medicine at Mount Sinai, New York City, who chaired the panel for the new guidelines. “We thus convened a distinguished panel of experts from diverse backgrounds and expertise to create six guideline statements to help Black men, their families, and their healthcare providers to consider options for prostate cancer screening based on the best available evidence.”

After reviewing 287 studies, the expert panel developed six new guideline statements addressing screening for Black men, reaching at least 80% consensus among panel members:

  • Because Black men are at a high risk for prostate cancer, the benefits of screening generally outweigh the risks.
  • PSA testing should be considered first line for prostate cancer screening, although some providers may recommend an optional digital rectal exam in addition to the PSA test.
  • Black men should engage in shared decision-making with their healthcare providers and other trusted sources of information to learn about the pros and cons of screening.
  • For Black men who elect screening, a baseline PSA test should be done between ages 40 and 45, and annual PSA screening should be strongly considered based on the PSA value and the individual’s health status.
  • Black men over age 70 who have been undergoing prostate cancer screening should talk with their healthcare provider about whether to continue PSA testing and make an informed decision based on their age, life expectancy, health status, family history, and prior PSA levels.
  • Black men who are at even higher risk due to a strong family history and/or known carriers of high-risk genetic variants should consider initiating annual PSA screening as early as age 40.

These statements are based on “the best available evidence, which overwhelmingly supports the conclusion that Black men in the US could benefit from a risk-adapted PSA screening,” the investigators concluded, noting that the latest evidence “warrants revisiting current recommendations for early [prostate cancer] detection in Black men from other national guideline groups.”

“We believe that the outcome of these more directed guidelines will be to give clarity to these men,” added Dr. Oh, who is also chief medical officer for the Prostate Cancer Foundation.

This research was funded by the Prostate Cancer Foundation, National Cancer Institute, Veterans Affairs, Jean Perkins Foundation, and Department of Defense. Dr. Garraway reported having no disclosures.

A version of this article appeared on Medscape.com.

New Transparent AI Predicts Breast Cancer 5 Years Out

A new way of using artificial intelligence (AI) can predict breast cancer 5 years in advance with impressive accuracy — and unlike previous AI models, we know how this one works.

The new AI system, called AsymMirai, simplifies previous models by solely comparing differences between right and left breasts to predict risk. It could potentially save lives, prevent unnecessary testing, and save the healthcare system money, its creators say.

“With traditional AI, you ask it a question and it spits out an answer, but no one really knows how it makes its decisions. It’s a black box,” said Jon Donnelly, a PhD student in the department of computer science at Duke University, Durham, North Carolina, and first author on a new paper in Radiology describing the model.

“With our approach, people know how the algorithm comes up with its output so they can fact-check it and trust it,” he said.

One in eight women will develop invasive breast cancer, and 1 in 39 will die from it. Mammograms miss about 20% of breast cancers. (The shortcomings of genetic screening and mammograms received extra attention recently when actress Olivia Munn disclosed that she’d been treated for an aggressive form of breast cancer despite a normal mammogram and a negative genetic test.)

The model could help doctors bring the often-abstract idea of AI to the bedside in a meaningful way, said radiologist Vivianne Freitas, MD, assistant professor of medical imaging at the University of Toronto.

“This marks a new chapter in the field of AI,” said Dr. Freitas, who authored an editorial lauding the new paper. “It makes AI more tangible and understandable, thereby improving its potential for acceptance.”
 

AI as a Second Set of Eyes

Mr. Donnelly described AsymMirai as a simpler, more transparent, and easier-to-use version of Mirai, a breakthrough AI model which made headlines in 2021 with its promise to determine with unprecedented accuracy whether a patient is likely to get breast cancer within the next 5 years.

Mirai identified up to twice as many future cancer diagnoses as the conventional risk calculator Tyrer-Cuzick. It also maintained accuracy across a diverse set of patients — a notable plus for two fields (AI and healthcare) notorious for delivering poorer results for minorities.

Tyrer-Cuzick and other lower-tech risk calculators use personal and family history to statistically calculate risk. Mirai, on the other hand, analyzes countless bits of raw data embedded in a mammogram to decipher patterns a radiologist’s eyes may not catch. Four images, including two angles from each breast, are fed into the model, which produces a score between 0 and 1 to indicate the person’s risk of getting breast cancer in 1, 3, or 5 years.

But even Mirai’s creators have conceded they didn’t know exactly how it arrives at that score — a fact that has fueled hesitancy among clinicians.

Study coauthor Fides Schwartz, MD, a radiologist at Brigham and Women’s Hospital, Boston, said researchers were able to crack the code on Mirai’s “black box,” finding that its scores were largely determined by assessing subtle differences between right breast tissue and left breast tissue.

Knowing this, the research team simplified the model to predict risk based solely on “local bilateral dissimilarity.” AsymMirai was born.

The team then used AsymMirai to look back at more than 200,000 mammograms from nearly 82,000 patients. They found it worked nearly as well as its predecessor, assigning a higher risk to those who would go on to develop cancer 66% of the time (vs Mirai’s 71%). In patients in whom it noticed the same asymmetry multiple years in a row, it worked even better, with an 88% chance of assigning a higher score to people who would later develop cancer than to those who would not.

“We found that we can, with surprisingly high accuracy, predict whether a woman will develop cancer in the next 1-5 years based solely on localized differences between her left and right breast tissue,” said Mr. Donnelly.

Dr. Schwartz imagines a day when radiologists could use the model to help develop personalized screening strategies for patients. Doctors might advise those with higher scores to get screened more often than guidelines suggest, supplement mammograms with an MRI, and keep a close watch on trouble spots identified by AI.

“For people with really low risk, on the other hand, maybe we can save them an annual exam that’s not super pleasant and might not be necessary,” said Dr. Schwartz.
 

Cautious Optimism

Robert Smith, PhD, senior vice president of early cancer detection science at the American Cancer Society, noted that AI has been used for decades to try to reduce radiologists’ workload and improve diagnoses.

“But AI just never really lived up to its fullest potential,” Dr. Smith said, “quite often because it was being used as a crutch by inexperienced radiologists who, instead of interpreting the mammogram and then seeing what AI had to say, ended up letting AI do most of the work, which, frankly, just wasn’t that accurate.”

He’s hopeful that newer, more sophisticated iterations of AI medical imaging platforms (roughly 18-20 models are in development) can ultimately save women’s lives, particularly in areas where radiologists are in short supply.

But he believes it will be a long time before doctors, or their patients, are willing to risk postponing a mammogram based on an algorithm.
 

A version of this article appeared on Medscape.com.


New CRC Risk Prediction Model Outperforms Polyp-Based Model

Article Type
Changed
Fri, 03/22/2024 - 13:05

 

TOPLINE:

A comprehensive model considering patient age, diabetes, colonoscopy indications, and polyp findings can predict colorectal cancer (CRC) risk more accurately than the solely polyp-based model in patients with a first diagnosis of adenoma on colonoscopy.

METHODOLOGY:

  • Because colonoscopy surveillance guidelines relying solely on previous polyp findings to assess CRC risk are imprecise, researchers developed and tested a comprehensive risk prediction model from a list of CRC-related predictors that included patient characteristics and clinical factors in addition to polyp findings.
  • The comprehensive model included baseline colonoscopy indication, age group, diabetes diagnosis, and polyp findings (adenoma with advanced histology, polyp size ≥ 10 mm, and sessile serrated or traditional serrated adenoma).
  • They randomly assigned 95,001 patients (mean age, 61.9 years; 45.5% women) who underwent colonoscopy with polypectomy to remove a conventional adenoma into two cohorts: model development (66,500) and internal validation (28,501).
  • In both cohorts, researchers compared the performance of the polyp findings-only method against the comprehensive model in predicting CRC, defined as an adenocarcinoma of the colon or rectum diagnosed at least 1 year after the baseline colonoscopy.

TAKEAWAY:

  • During the follow-up period starting 1 year after colonoscopy, 495 patients were diagnosed with CRC; 354 were in the development cohort and 141 were in the validation cohort.
  • The comprehensive model demonstrated better predictive performance than the traditional polyp-based model in the development cohort (area under the curve [AUC], 0.71 vs 0.61) and in the validation cohort (AUC, 0.70 vs 0.62).
  • The difference in Akaike Information Criterion (AIC) values between the comprehensive and polyp-only models was 45.7, well above the conventional threshold of 10, strongly favoring the comprehensive model (a brief sketch of this kind of AUC and AIC comparison follows this list).
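
The AUC and AIC comparisons above can be reproduced in outline with standard tools. The sketch below is only an illustration under stated assumptions: the predictors, coefficients, and cohort are simulated, the model form (logistic regression fit with statsmodels, evaluated with scikit-learn) is an assumption rather than the study’s actual specification, and the 70/30 split merely mimics the development/validation design. AIC equals 2k minus twice the log-likelihood, and a gap above 10 is conventionally read as strong evidence for the richer model.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000  # hypothetical cohort; the study itself analyzed 95,001 patients

# Hypothetical predictors: polyp findings plus patient-level factors.
advanced_histology = rng.binomial(1, 0.15, n)
large_polyp = rng.binomial(1, 0.20, n)
age_65_plus = rng.binomial(1, 0.40, n)
diabetes = rng.binomial(1, 0.18, n)

# Hypothetical outcome: CRC diagnosed at least 1 year after colonoscopy.
linpred = (-5.0 + 0.9 * advanced_histology + 0.7 * large_polyp
           + 0.8 * age_65_plus + 0.5 * diabetes)
crc = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X_polyp = sm.add_constant(np.column_stack([advanced_histology, large_polyp]))
X_full = sm.add_constant(np.column_stack(
    [advanced_histology, large_polyp, age_65_plus, diabetes]))

# Split into development and internal-validation cohorts, as in the study design.
idx_dev, idx_val = train_test_split(np.arange(n), test_size=0.3, random_state=0)

polyp_model = sm.Logit(crc[idx_dev], X_polyp[idx_dev]).fit(disp=0)
full_model = sm.Logit(crc[idx_dev], X_full[idx_dev]).fit(disp=0)

for name, model, X in [("polyp-only", polyp_model, X_polyp),
                       ("comprehensive", full_model, X_full)]:
    auc = roc_auc_score(crc[idx_val], model.predict(X[idx_val]))
    print(f"{name:>13}: AIC = {model.aic:8.1f}   validation AUC = {auc:.2f}")

print(f"AIC difference = {polyp_model.aic - full_model.aic:.1f} "
      "(a gap above 10 strongly favors the comprehensive model)")
```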

IN PRACTICE:

“Improving the ability to accurately predict the patients at highest risk for CRC after polypectomy is critically important, given the considerable costs and resources associated with treating CRC and the better prognosis associated with early cancer detection. The current findings provide proof of concept that inclusion of CRC risk factors beyond prior polyp findings has the potential to improve post-colonoscopy risk stratification,” the authors wrote.

SOURCE:

The study, led by Jeffrey K. Lee, MD, MPH, Division of Research, Kaiser Permanente Northern California, Oakland, California, was published online in The American Journal of Gastroenterology.

LIMITATIONS:

External validation of the model’s performance is needed in different practice settings. The generalizability of the findings is limited because the study population did not include individuals without a prior adenoma or those with an isolated serrated polyp. Moreover, the examination of polyp size > 20 mm as a potential predictor of CRC was precluded due to incomplete data.

DISCLOSURES:

The study was conducted within the National Cancer Institute–funded Population-Based Research to Optimize the Screening Process II consortium and funded by a career development grant from the National Cancer Institute to Lee. The authors declared no conflicts of interest.
 

A version of this article appeared on Medscape.com.


New CRC stool test beats FIT for sensitivity but not specificity

Article Type
Changed
Mon, 03/25/2024 - 09:15

A next-generation stool DNA test for colorectal cancer (CRC) screening had higher sensitivity for all screening-relevant lesions but lower specificity than a currently available fecal immunochemical test (FIT), according to the large prospective BLUE-C study.

The multi-target assay by Exact Sciences Corporation, the makers of Cologuard, includes new biomarkers designed to increase specificity without decreasing sensitivity. It showed a sensitivity for CRC of almost 94%, with more than 43% sensitivity for advanced precancerous lesions and nearly 91% specificity for advanced neoplasia, according to the study results, which were published in The New England Journal of Medicine.

Adherence to CRC screening in the United States is well below the 80% national target, and the quest continues for noninvasive screening assays that might improve adherence, noted lead author Thomas F. Imperiale, MD, AGAF, a professor of medicine at Indiana University School of Medicine in Indianapolis, and colleagues.

“The test’s manufacturer developed a new version of its existing Cologuard FIT/DNA test because it took to heart the feedback from primary care providers and gastroenterologists about the test’s low specificity,” Dr. Imperiale said in an interview. “The goal of the new test was to improve specificity without losing, and perhaps even gaining, some sensitivity — a goal that is not easily accomplished when you’re trying to improve on a sensitivity for colorectal cancer that was already 92.3% in the current version of Cologuard.”

Compared with the earlier version of Cologuard, he added, the new generation retained sensitivity for CRC and advanced precancerous lesions or polyps while improving specificity for advanced neoplasia (a combination of CRC and advanced precancerous lesions) from 86.6% to 90.6%, which amounts to roughly a 30% relative reduction in false positive results. “This with the caveat, however, that the two versions were not compared head-to-head in this new study,” Dr. Imperiale said.

The higher specificity for advanced lesions is expected to translate to a lower false positive rate. Lowering false positive rates is crucial because that reduces the need for costly, invasive, and unnecessary colonoscopies, said Aasma Shaukat, MD, MPH, AGAF, director of outcomes research in NYU Langone Health’s division of gastroenterology and hepatology in New York City.

“Many physicians felt there were too many false positives with the existing version, and that is anxiety-provoking in patients and providers,” said Dr. Shaukat, who was not involved in the study.

In her view, however, the test’s moderate improvements in detecting certain lesions do not make it demonstrably superior to its predecessor, and there is always the possibility of higher cost to consider.

While acknowledging that a higher sensitivity for all advanced precancerous lesions would have been welcome, Dr. Imperiale said the test detected 75% of the most worrisome of such lesions — “the ones containing high-grade dysplastic cells and suggesting near-term conversion to cancer. And its ability to detect other advanced lesions improved as the size of the lesions increased.”
 

 

 

Testing details

Almost 21,000 asymptomatic participants aged 40 years and older undergoing screening colonoscopy were evaluated at 186 US sites from 2019 to 2023. Of the cohort, 98 had CRC, 2144 had advanced precancerous lesions, 6973 had nonadvanced adenomas, and 10,961 had nonneoplastic findings or negative colonoscopy.

Advanced precancerous lesions included one or more adenomas or sessile serrated lesions measuring at least 1 cm in the longest dimension, lesions with villous histologic features, and high-grade dysplasia. The new DNA test identified 92 of 98 participants with CRC and 76 of 82 participants with screening-relevant cancers. Among the findings for the new assay (a short sketch after this list shows how such estimates and their confidence intervals follow from the raw counts):

  • Sensitivity for any-stage CRC was 93.9% (95% confidence interval [CI], 87.1-97.7)
  • Sensitivity for advanced precancerous lesions was 43.4% (95% CI, 41.3-45.6)
  • Sensitivity for high-grade dysplasia was 74.6% (95% CI, 65.6-82.3)
  • Specificity for advanced neoplasia was 90.6% (95% CI, 90.1-91.0)
  • Specificity for nonneoplastic findings or negative colonoscopy was 92.7% (95% CI, 92.2-93.1)
  • Specificity for negative colonoscopy was 93.3% (95% CI, 92.8-93.9)
  • No adverse events occurred.
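
The point estimates above follow directly from the underlying counts; for example, 92 of 98 cancers detected gives the 93.9% sensitivity, and the accompanying intervals are standard binomial confidence intervals. The sketch below shows the arithmetic using statsmodels; the exact interval method used in the study is an assumption (an exact Clopper-Pearson interval for 92/98 lands in the same neighborhood as the reported 87.1-97.7), and the specificity counts in the second call are hypothetical placeholders.

```python
from statsmodels.stats.proportion import proportion_confint

def summarize(label, hits, total):
    """Print a proportion with a 95% exact (Clopper-Pearson) binomial CI."""
    estimate = hits / total
    lo, hi = proportion_confint(hits, total, alpha=0.05, method="beta")
    print(f"{label}: {estimate:.1%} (95% CI, {lo:.1%}-{hi:.1%})")

# Sensitivity for any-stage CRC: the assay flagged 92 of the 98 cancers.
summarize("Sensitivity, any-stage CRC", 92, 98)

# Specificity is the same calculation applied to disease-free participants
# (true negatives over all negatives); these counts are hypothetical.
summarize("Specificity, hypothetical counts", 9_060, 10_000)
```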

In the comparator assay, OC-AUTO FIT by Polymedco, sensitivity was 67.3% (95% CI, 57.1-76.5) for CRC, 23.3% (95% CI, 21.5-25.2) for advanced precancerous lesions, and 47.4% (95% CI, 37.9-56.9) for high-grade dysplasia. The FIT’s specificity, however, was better across all age groups — 94.8% (95% CI, 94.4-95.1) for advanced neoplasia, 95.7% (95% CI, 95.3-96.1) for nonneoplastic findings, and 96.0% (95% CI, 95.5-96.4) for negative colonoscopy.

In another article in the same issue of NEJM, Guardant Health’s cell-free DNA blood-based test had 83% sensitivity for CRC, 90% specificity for advanced neoplasia, and 13% sensitivity for advanced precancerous lesions in an average-risk population.

An age-related decrease in specificity was observed with the new Cologuard test, but that did not concern Dr. Imperiale because the same observation was made with the current version. “In fact, the next-gen version appears to have less of an age-related decrease in specificity than the current version, although, again, the two versions were not tested head-to-head,” he noted.

The effect of age-related background methylation of DNA is well known, he explained. “Clinicians and older patients in the screening age range do need to be aware of this effect on specificity before ordering or agreeing to do the test. I do not see this as a stumbling block to implementation, but it does require discussion between patient and ordering provider.”

The new version of the DNA test is expected to be available in about a year.

According to Dr. Imperiale, further research is needed to ascertain the test’s acceptability and adherence rates and to quantify its yield in population-based screening. Determining its cost-effectiveness and making it easier to use are other goals. “And most importantly, the degree of reduction in the incidence and mortality from colorectal cancer,” he said.

Cost-effectiveness and the selection of the testing interval may play roles in adherence, particularly in populations with lower rates of screening adherence than the general population, John M. Carethers, MD, AGAF, of the University of California, San Diego, noted in a related editorial.

“Adherence to screening varies according to age group, including persons in the 45- to 49-year age group who are now eligible for average-risk screening,” he wrote. “It is hoped that these newer tests will increase use and adherence and elevate the percentage of the population undergoing screening in order to reduce deaths from colorectal cancer.”

This study was sponsored by Exact Sciences Corporation, which conducted the stool testing at its laboratories.

Dr. Imperiale had no competing interests to disclose. Several study co-authors reported employment with Exact Sciences, or stock and intellectual property ownership. Dr. Shaukat disclosed consulting for Freenome. Dr. Carethers reported ties to Avantor Inc. and Geneoscopy.


Diagnosis Denial: How Doctors Help Patients Accept Their Condition

Article Type
Changed
Thu, 03/21/2024 - 09:32

Informing patients of a dire diagnosis — or even one that will require significant lifestyle changes — is never easy. But what’s even more challenging is when patients don’t accept their medical condition or a future that might include a difficult treatment protocol or even new medications or surgery.

“This is a challenging space to be in because this isn’t an exact science,” said Jack Jacoub, MD, medical director of MemorialCare Cancer Institute at Orange Coast Memorial in Fountain Valley, California. “There’s no formal training to deal with this — experience is your best teacher.”

Ultimately, helping a person reconceptualize what their future looks like is at the heart of every one of these conversations, said Sourav Sengupta, MD, MPH, associate professor of psychiatry and pediatrics at the Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo, Buffalo, New York. “As physicians, we’re charged with helping our patients navigate a difficult and challenging time in their life,” he told this news organization.

“It’s not infrequent that patients are struggling to rethink what it will be like to be a person with an illness that might be chronic and how this will change their life,” he said.

And because denial is commonly the initial way a patient might cope with absorbing news that’s hard to hear, you’ll need to be extremely patient and empathetic.

“The goal is to build trust with this person, including trust in you, the hospital itself, and the entire team treating the patient,” Dr. Jacoub said.

“A diagnosis, especially in my field of oncology, can be scary. Spending time explaining their prognosis is very important. This can’t be a rushed scenario.”

More advice on helping patients who are in denial about their medical condition:
 

Make Sure They Understand What’s Going on

In cardiology, it’s common for patients to be hospitalized when they first learn that they have a disease they must manage for the rest of their life, said Stephanie Saucier, MD, a noninvasive cardiologist and codirector of the Women’s Heart Wellness Program at Hartford Healthcare’s Heart and Vascular Institute.

“Especially after someone has had a heart attack, a stroke, or they had bypass or stents placed, I like to see what their understanding of the disease is,” Dr. Saucier said. “I ask them, ‘What do you understand about what happened to you?’ It can get confusing when you’re in the hospital and are told a lot of information in a short period of time.”
 

Share the Data

If a patient remains resistant to the news of a diagnosis, sharing test results can be beneficial. “I’ll often say, ‘here are the scans; this is the path report; this is the bloodwork; this is your biopsy report; these are the things we have’,” Dr. Jacoub said.

“Yes, this is clinical, but it helps to communicate the information you have and do it with data. For example, I might add, ‘Would you like to see some of the things [results, scans, tests] we’re talking about today?’ This also helps establish trust.”
 

 

 

Help Them Wrap Their Mind Around a Lifelong Condition

It’s often challenging for patients to accept that what they think is a one-time health issue will affect them for a lifetime. “I use juvenile diabetes as a way to explain this,” Dr. Saucier said. “I ask them what they would do if, say, their child was diagnosed with juvenile diabetes.”

Of course, patients agree that they wouldn’t give a child insulin for only a brief period. They understand that the condition must be treated in the long term. This kind of analogy can help patients understand that they, too, have a disorder requiring lifelong treatment.
 

Be Ready to Respond

Dr. Sengupta says that it’s important to be prepared with an answer if your patient is challenging or suggests that the diagnosis is fake or that you don’t have their best interests in mind.

“It’s understandable that patients might feel frustrated and upset,” he said. “It’s challenging when somehow a patient doesn’t assume my best intent.”

They might say something like, “You’re trying to make more money” or “you’re a shill for a pharma company.” In that case, you must listen. Patiently explain, “I’m your doctor; I work for you; I’m most interested in you feeling healthy and well.”

Occasionally, you’ll need a thick skin when it comes to inaccurate, controversial, or conspiratorial conversations with patients.
 

Acknowledge Differences

News of an illness may clash with a person’s take on the world. “A cancer diagnosis, for example, may clash with religious beliefs or faith-based ideology about the healthcare system,” said Aaron Fletcher, MD, a board-certified otolaryngologist specializing in head and neck surgery at the Georgia Center for Ear, Nose, Throat, and Facial Plastic Surgery in Atlanta, Georgia.

“If you have a patient who is coming to you with these beliefs, you need to have a lot of empathy, patience, and good communication skills. It’s up to you to break through the initial doubt and do your best to explain things in layman’s terms.”
 

Find Mutual Ground

If your patient still denies their health issues, try to find one thing you can agree on regarding a long-term game plan. “I’ll say, ‘Can we at least agree to discuss this with other family members or people who care about you?’” Dr. Jacoub said.

“I always tell patients that loved ones are welcome to call me so long as they [the patient] give permission. Sometimes, this is all that it takes to get them to accept their health situation.”
 

Seven Ways to Cope With Diagnosis Denial

This news organization asked David Cutler, MD, a board-certified family medicine physician at Providence Saint John’s Health Center in Santa Monica, California, for tips on helping patients who are having a challenging time accepting their condition:

  • Listen Actively. Allow the patient to express their feelings and concerns without judgment. Active listening can help them feel heard and understood, which may open the door to discussing their condition more openly.
  • Provide Information. Offer factual information about their medical condition, treatment options, and the potential consequences of denial. Provide resources such as pamphlets, websites, or books that they can review at their own pace.
  • Encourage Professional Help. You may want to suggest that your patient seek professional help from a therapist, counselor, or support group. A mental health professional can assist patients in processing their emotions and addressing their denial constructively.
  • Involve Trusted Individuals. Enlist the support of trusted friends, family members, or healthcare professionals who can help reinforce the importance of facing their medical condition.
  • Respect Autonomy. While it’s essential to encourage the person to accept their diagnosis, ultimately, the decision to get treatment lies with them. Respect their autonomy and avoid pushing them too hard, which could lead to resistance or further denial.
  • Be Patient and Persistent. Overcoming denial is often a gradual process. Be patient and persistent in supporting the person, even if progress seems slow.
  • Set Boundaries. It’s essential to set boundaries to protect your well-being. While you can offer support and encouragement, you cannot force someone to accept their medical condition. Recognize when your efforts are not being productive and take care of yourself in the process.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Informing patients of a dire diagnosis — or even one that will require significant lifestyle changes — is never easy. But what’s even more challenging is when patients don’t accept their medical condition or a future that might include a difficult treatment protocol or even new medications or surgery.

“This is a challenging space to be in because this isn’t an exact science,” said Jack Jacoub, MD, medical director of MemorialCare Cancer Institute at Orange Coast Memorial in Fountain Valley, California. “There’s no formal training to deal with this — experience is your best teacher.”

Ultimately, helping a person reconceptualize what their future looks like is at the heart of every one of these conversations, said Sourav Sengupta, MD, MPH, associate professor of psychiatry and pediatrics at the Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo, Buffalo, New York. “As physicians, we’re charged with helping our patients navigate a difficult and challenging time in their life,” he told this news organization.

“It’s not infrequent that patients are struggling to rethink what it will be like to be a person with an illness that might be chronic and how this will change their life,” he said.

And because denial is commonly the initial way a patient might cope with absorbing news that’s hard to hear, you’ll need to be extremely patient and empathetic.

“The goal is to build trust with this person, including trust in you, the hospital itself, and the entire team treating the patient,” Dr. Jacoub said.

“A diagnosis, especially in my field of oncology, can be scary. Spending time explaining their prognosis is very important. This can’t be a rushed scenario.”

More advice on helping patients who are in denial about their medical condition:
 

Make Sure They Understand What’s Going on

In cardiology, it’s common for patients to be hospitalized when they first learn that they have a disease they must manage for the rest of their life, said Stephanie Saucier, MD, a noninvasive cardiologist and codirector of the Women’s Heart Wellness Program at Hartford Healthcare’s Heart and Vascular Institute.

“Especially after someone has had a heart attack, a stroke, or they had bypass or stents placed, I like to see what their understanding of the disease is,” Dr. Saucier said. “I ask them, ‘What do you understand about what happened to you’. It can get confusing when you’re in the hospital and are told a lot of information in a short period of time.”
 

Share the Data

If a patient remains resistant to the news of a diagnosis, sharing test results can be beneficial. “I’ll often say, ‘here are the scans; this is the path report; this is the bloodwork; this is your biopsy report; these are the things we have’,” Dr. Jacoub said.

“Yes, this is clinical, but it helps to communicate the information you have and do it with data. For example, I might add, ‘Would you like to see some of the things [results, scans, tests] we’re talking about today?’ This also helps establish trust.”
 

 

 

Help Them Wrap Their Mind Around a Lifelong Condition

It’s often challenging for patients to accept that what they think is a one-time health issue will affect them for a lifetime. “I use juvenile diabetes as a way to explain this,” Dr. Saucier said. “I ask them what they would do if, say, their child was diagnosed with juvenile diabetes.”

Of course, patients agree that they wouldn’t give a child insulin for only a brief period. They understand that the condition must be treated in the long term. This kind of analogy can help patients understand that they, too, have a disorder requiring lifelong treatment.
 

Be Ready to Respond

Dr. Sengupta says that it’s important to be prepared with an answer if your patient is challenging or suggests that the diagnosis is fake or that you don’t have their best interests in mind.

“It’s understandable that patients might feel frustrated and upset,” he said. “It’s challenging when somehow a patient doesn’t assume my best intent.”

They might say something like, “You’re trying to make more money” or “you’re a shill for a pharma company.” In that case, you must listen. Patiently explain, “I’m your doctor; I work for you; I’m most interested in you feeling healthy and well.”

Occasionally, you’ll need a thick skin when it comes to inaccurate, controversial, or conspiratorial conversations with patients.
 

Acknowledge Differences

News of an illness may clash with a person’s take on the world. “A cancer diagnosis, for example, may clash with religious beliefs or faith-based ideology about the healthcare system,” said Aaron Fletcher, MD, a board-certified otolaryngologist specializing in head and neck surgery at the Georgia Center for Ear, Nose, Throat, and Facial Plastic Surgery in Atlanta, Georgia.

“If you have a patient who is coming to you with these beliefs, you need to have a lot of empathy, patience, and good communication skills. It’s up to you to break through the initial doubt and do your best to explain things in layman’s terms.”
 

Find Mutual Ground

If your patient still denies their health issues, try to find one thing you can agree on regarding a long-term game plan. “I’ll say, ‘Can we at least agree to discuss this with other family members or people who care about you’?” Dr. Jacoub said.

“I always tell patients that loved ones are welcome to call me so long as they [the patient] give permission. Sometimes, this is all that it takes to get them to accept their health situation.”
 

Seven Ways to Cope With Diagnosis Denial

This news organization asked David Cutler, MD, a board-certified family medicine physician at Providence Saint John’s Health Center in Santa Monica, California, for tips on helping patients who are having a challenging time accepting their condition:

  • Listen Actively. Allow the patient to express their feelings and concerns without judgment. Active listening can help them feel heard and understood, which may open the door to discussing their condition more openly.
  • Provide Information. Offer factual information about their medical condition, treatment options, and the potential consequences of denial. Provide resources such as pamphlets, websites, or books that they can review at their own pace.
  • Encourage Professional Help. You may want to suggest that your patient seek professional help from a therapist, counselor, or support group. A mental health professional can assist patients in processing their emotions and addressing their denial constructively.
  • Involve Trusted Individuals. Enlist the support of trusted friends, family members, or healthcare professionals who can help reinforce the importance of facing their medical condition.
  • Respect Autonomy. While it’s essential to encourage the person to accept their diagnosis, ultimately, the decision to get treatment lies with them. Respect their autonomy and avoid pushing them too hard, which could lead to resistance or further denial.
  • Be Patient and Persistent. Overcoming denial is often a gradual process. Be patient and persistent in supporting the person, even if progress seems slow.
  • Set Boundaries. It’s essential to set boundaries to protect your well-being. While you can offer support and encouragement, you cannot force someone to accept their medical condition. Recognize when your efforts are not being productive and take care of yourself in the process.

A version of this article first appeared on Medscape.com.



Niacin and CV Risk: Should Advice on Intake Change?

Article Type
Changed
Mon, 03/25/2024 - 15:58

A recent study linking a niacin derivative to an increased risk for cardiovascular events has raised questions about the safety of this B vitamin, which is added to many food staples in the Western diet and taken in the form of supplements.

The findings, which were published in Nature Medicine, may also help explain why taking niacin, which lowers low-density lipoprotein cholesterol and raises high-density lipoprotein cholesterol, did not lead to a reduction in cardiovascular events in major clinical trials.

But could this essential micronutrient really have an adverse effect on cardiovascular risk, and what are the implications for niacin intake?

The senior author of the new study, Stanley Hazen, MD, believes some prudence regarding excessive niacin intake may be justified.

“I’m not suggesting we should completely avoid niacin — it is an essential nutrient, but our results suggest that too much may be harmful,” Dr. Hazen said.

Niacin supplements are also sold with claims of antiaging effects, arthritis relief, and boosting brain function, although none of these claims have been proven. And the related compound, nicotinamide, is recommended to prevent skin cancer in high-risk patients; however, a recent study questioned that guidance.

“I would say to the general public that avoiding supplements containing niacin or related compounds could be a sensible approach at present, while these findings are investigated further.”

Other experts are unsure if such action is justified on the basis of this single study.
 

Residual Cardiovascular Risk

Dr. Hazen, who is chair of the Department of Cardiovascular & Metabolic Sciences at the Lerner Research Institute, Cleveland Clinic, Cleveland, Ohio, explained to this news organization that he and his team did not set out to study niacin.

“It began as a study to look for novel pathways involved in residual cardiovascular disease risk — the risk for cardiovascular events after adjusting for traditional risk factors such as cholesterol, blood pressure, and diabetes.”

The researchers began looking for compounds in plasma that predicted future adverse cardiovascular events in individuals undergoing elective diagnostic cardiac evaluation. Two of the leading candidates identified were niacin derivatives — 2PY and 4PY — that are only formed in the presence of excess niacin.

They then developed assays to measure 2PY and 4PY and conducted further studies in two validation cohorts — 2331 US individuals and a European cohort of 832 individuals. In both cohorts, elevated plasma levels of 2PY and 4PY predicted future adverse cardiovascular events, with a doubling in cardiovascular risk seen in those with levels in the highest vs the lowest quartile.
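For readers curious about the mechanics of a quartile-based comparison like the one described, the short sketch below shows how such a hazard ratio could be estimated with a Cox proportional hazards model. It is an illustration only, not the study's code: the lifelines package is one open-source option, the column names (py4, years, mace) and the synthetic numbers are invented for the example, and the unadjusted model omits the covariate adjustment the investigators performed.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
py4 = rng.lognormal(mean=0.0, sigma=0.5, size=n)              # synthetic plasma 4PY levels
years = rng.uniform(0.5, 10.0, size=n)                        # synthetic follow-up time in years
risk = np.clip(0.05 + 0.10 * (py4 - py4.mean()), 0.01, 0.60)  # event probability rises with 4PY
mace = rng.binomial(1, risk)                                   # 1 = cardiovascular event, 0 = censored

df = pd.DataFrame({"py4": py4, "years": years, "mace": mace})
df["quartile"] = pd.qcut(df["py4"], 4, labels=[1, 2, 3, 4])

# Contrast the highest against the lowest quartile, mirroring the Q4-vs-Q1 comparison described above
sub = df[df["quartile"].isin([1, 4])].copy()
sub["top_quartile"] = (sub["quartile"] == 4).astype(int)

cph = CoxPHFitter()
cph.fit(sub[["years", "mace", "top_quartile"]], duration_col="years", event_col="mace")
print(cph.hazard_ratios_["top_quartile"])   # unadjusted hazard ratio, top vs bottom quartile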

To move beyond these observational studies and to explore a potentially causal relationship, Dr. Hazen’s team went on to perform genome-wide association studies and found that genetic variants that tracked with higher levels of 4PY were also linked to levels of the inflammatory marker vascular cell adhesion molecule 1 (VCAM-1).

And in cell culture and animal studies, they found that 4PY was a driver of inflammation, upregulating VCAM-1 and eliciting vascular inflammation responses.

“So, we have shown in several different ways that the niacin derivative, 4PY, is linked to increased cardiovascular risk,” Dr. Hazen commented.
 

Significant Health Implications?

Dr. Hazen believes these findings could have significant health implications.

He noted that Western populations have been consuming large amounts of niacin ever since World War II, when many foods began to be fortified with essential vitamins to prevent deficiency diseases. Niacin was added to foods to prevent pellagra, a disease characterized by inflamed skin, diarrhea, and dementia that was often fatal.

“While we may have eliminated pellagra, have we, as a consequence, increased the prevalence of cardiovascular disease many years later?” Dr. Hazen asked.

This may be a clue to why niacin does not lower cardiovascular risk as much as would be expected from the degree of cholesterol lowering it brings about. “This is the niacin paradox and has led to the thought that there could be some kind of adverse effect that niacin is promoting. I think we may have found something that contributes to the niacin paradox,” he said.

However, the niacin pathway is complicated. Niacin is the major source of nicotinamide adenine dinucleotide (NAD), an integral molecule that allows cells to create energy. “Because it is so important, our bodies are designed to salvage and retain NADs, but once storage capacity is exceeded, then these 4PY and 2PY derivatives are generated,” Dr. Hazen explained. “But you have to really eat a lot of niacin-rich foods for this to happen.”

He is not claiming that niacin causes cardiovascular disease. “It is 4PY that appears to be the driver of vascular inflammation. And 4PY is a breakdown product of niacin. But there is more than one pathway that could lead to 4PY generation. There is a whole interconnecting network of compounds that interchange with each other — known as the niacin pool — any one or more of these compounds can be ingested and raise pool levels and ultimately 4PY levels. However, by far and away, niacin is one of the major sources,” Dr. Hazen commented.
 

Are High-Protein Diets Also Implicated?

Other precursors of NAD include tryptophan, an amino acid present in dietary protein. And one of the genetic variants linked to changes in 4PY levels is connected to how dietary protein is directed into the niacin pool, raising the possibility that a high-protein diet may also raise cardiovascular risk in some people, Dr. Hazen noted.

Dr. Hazen estimated that about 3% of the niacin pool in a normal diet comes from protein intake, but that the percentage could increase substantially in very high–protein diets.

“Our data support the concept that if we lower our 4PY level long-term, then that would result in a reduction in cardiovascular disease. But this is still just a hypothesis. If we lower niacin intake, we will lower 4PY,” Dr. Hazen stated.

He said that this research is at too early a stage to give firm recommendations about what this means for the consumer.

“Based on these findings, I would advise people to avoid taking niacin or nicotinic acid or nicotinamide supplements and to eat a sensible balanced diet — maybe not to overdo the high protein–type diets. That’s all we can really say at the moment.”

Noting that niacin can also be one of the major components in energy drinks, he suggested it may be prudent to limit consumption of these products.
 

What Is the Optimum Niacin Intake?

Dr. Hazen noted that the recommended dietary allowance (RDA) for niacin is between 14 and 18 mg/day, but he said the average American ingests about four times that amount, and some people have substantially higher intakes, up to 50 times the RDA if they take supplements.
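As a quick back-of-the-envelope check of those figures (an illustration only, using the RDA range quoted above), the intakes he describes work out roughly as follows:

rda_mg_per_day = (14, 18)                  # RDA range quoted above
print([m * 4 for m in rda_mg_per_day])     # about 56-72 mg/day at four times the RDA
print([m * 50 for m in rda_mg_per_day])    # about 700-900 mg/day at 50 times the RDA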

While food fortification with niacin may have been useful in the past, Dr. Hazen questioned whether it should still be mandated.

“In the US, you cannot buy flour or cereal or rice that is not fortified. And if you look closely, some products have much higher levels than those that are mandated. The food companies advertise this as a benefit, but there is no good data in support of that. What if several decades of eating excessive amounts of niacin has led to an increase in cardiovascular disease?”

He does not propose stopping all niacin fortification, “but maybe, we could have the choice of selecting an unfortified option,” he said.
 

Causal Link Not Proven

Commenting for this news organization, John Guyton, MD, Professor Emeritus of Medicine, Duke University Medical Center, Durham, North Carolina, who has been involved in niacin research for many years, said the Nature Medicine study showed “interesting and important results,” but they do not at this point prove a causal link between niacin intake and risk for cardiovascular disease.

“These findings need to be investigated further, and more studies are certainly justified, but I don’t think that this study alone makes an adequate case for restricting niacin intake, or thinking about stopping niacin fortification of foodstuffs,” Dr. Guyton said.

Noting that niacin is present in large quantities in many fast foods, he suggested the researchers may have just picked up the consequences of eating an unhealthy diet.

“If you look at foods that contain high quantities of niacin, red meat is at the top of the list. And if you think of a hamburger, niacin is present in relatively large quantities in both the burger and the bun. So, these findings may just be a reflection of an overall unhealthy diet,” he commented.

Dr. Guyton also pointed out that major clinical trials with niacin have shown mixed results, and its effect on cardiovascular risk is still not completely understood. While the HPS2-THRIVE and AIM-HIGH trials did not show benefits in reducing cardiovascular events, an earlier study, the Coronary Drug Project, in which the agent was given with food, did show substantial reductions in myocardial infarction and stroke. There was also a suggestion of a reduction in long-term mortality in the niacin group several years after the trial had ended.
 

Nicotinamide in Skin Cancer Prevention

What about the use of nicotinamide in skin cancer prevention?

Addressing this question, Kristin Bibee, MD, assistant professor of dermatology at Johns Hopkins University School of Medicine, Baltimore, pointed out that nicotinamide, although closely related to niacin, may have different effects. “This study does not specifically address nicotinamide supplementation and 4PY levels,” she said.

Diona Damian, MD, professor of dermatology at the University of Sydney, Camperdown, Australia, told this news organization that it was hard to extrapolate these findings on basal levels of niacin in a cardiac cohort to the administration of supra-physiological doses of nicotinamide for skin cancer prevention.

There may be different effects of supplemental niacin compared to nicotinamide, which lacks the vasodilatory effects seen with niacin, Dr. Damian said, adding that it would be interesting to see the results from higher, therapeutic nicotinamide doses in patients with and without cardiac disease.

She pointed out that high vs low levels of nicotinamide supplementation can have different and even opposite effects on cellular processes, such as upregulating or inhibiting DNA repair enzymes. At high doses, nicotinamide is anti-inflammatory in skin.

Dr. Damian noted that two phase 3 studies (ONTRAC and ONTRANS) of nicotinamide 500 mg twice daily for skin cancer prevention did not find a significant increase in cardiovascular events compared to placebo over 12 months.

“Oral nicotinamide has been shown to reduce nonmelanoma skin cancer by about a quarter in patients with normal immunity and multiple skin cancers. The doses used for skin cancer prevention are well above daily dietary levels, and treatment needs to be ongoing for the protective effects to continue. Nicotinamide should not be recommended as a preventive agent for people who have not had multiple skin cancers but should be reserved for those with a heavy burden of skin cancers,” she commented.

“For now, it would be reasonable to balance the benefits of skin cancer reduction against possible effects on inflammatory markers in patients with cardiac risk factors, when helping patients to decide whether or not nicotinamide therapy is appropriate for them,” she added.

Meanwhile, Dr. Hazen said the most exciting part of this new research is the discovery of a new pathway that contributes to cardiovascular disease and potentially a new target to treat residual cardiovascular risk.

“I believe our results show that we should be measuring 4PY levels, and individuals with high levels need to be extra vigilant about lowering their cardiovascular risk.”

The next step will be to confirm these results in other populations and then to develop a diagnostic test to identify people with a high 4PY level, he said.

A version of this article appeared on Medscape.com.


Intermittent Fasting Linked to Higher CVD Death Risk

Article Type
Changed
Mon, 03/25/2024 - 15:58

A new study raises a cautionary note on time-restricted eating (TRE), a type of intermittent fasting that is gaining popularity.

The observational analysis of over 20,000 US adults showed that those who limited their eating to a period of less than 8 hours per day had a higher risk for cardiovascular mortality compared with peers who ate across the typical 12-16 hours per day. This was the case in the overall sample and in those with cardiovascular disease (CVD) or cancer.

Lead author Victor Wenze Zhong, PhD, cautioned that the findings “require replication and we cannot demonstrate 8-hour TRE causes cardiovascular death in this observational study.

“However, it’s important for patients, particularly those with existing heart conditions or cancer, to be aware of the positive association between an 8-hour eating window and cardiovascular death,” Dr. Zhong, professor and chair, Department of Epidemiology and Biostatistics, School of Public Health, Shanghai Jiao Tong University School of Medicine, Shanghai, China, told this news organization. 

The results (Abstract P192) were presented March 18 at the American Heart Association (AHA) Epidemiology and Prevention/Lifestyle and Cardiometabolic Health Scientific Sessions 2024.
 

‘Provocative’ Results 

Short-term randomized controlled trials have suggested that 8-hour TRE may improve cardiometabolic risk profiles, but the potential long-term effects of this eating pattern are unknown. 

The observation that TRE may have short-term benefits but long-term adverse effects is “interesting and provocative” and needs further study, Christopher D. Gardner, PhD, professor of medicine at Stanford University in California, who wasn’t involved in the study, said in a conference statement, agreeing that much more research is needed.

The researchers analyzed data on dietary patterns for 20,078 adults (mean age, 48 years; 50% men; 73% non-Hispanic White) who participated in the 2003-2018 National Health and Nutrition Examination Surveys (NHANES). All of them completed two 24-hour dietary recall questionnaires within the first year of enrollment. Deaths through the end of 2019 were determined via the National Death Index.
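To make the exposure definition more concrete, here is a minimal Python sketch of how a daily eating window might be derived from recall data of this kind and bucketed into categories like those reported. It is not the study's code: the column names (first_eat_hr, last_eat_hr) and the intermediate 10 to <12 hour cut point are assumptions made for illustration.

import pandas as pd

def classify_window(hours: float) -> str:
    """Bucket a daily eating window into categories like those described above."""
    if hours < 8:
        return "<8 h (8-hour TRE)"
    elif hours < 10:
        return "8 to <10 h"
    elif hours < 12:
        return "10 to <12 h"   # intermediate cut point assumed for illustration
    elif hours <= 16:
        return "12-16 h (reference)"
    return ">16 h"

# Two hypothetical 24-hour recalls per participant; times are clock hours of the
# first and last eating occasions reported on each recall day.
recalls = pd.DataFrame({
    "participant":  [1, 1, 2, 2],
    "first_eat_hr": [9.0, 10.0, 7.5, 8.0],
    "last_eat_hr":  [16.5, 17.0, 21.0, 20.5],
})
recalls["window_hr"] = recalls["last_eat_hr"] - recalls["first_eat_hr"]

# Average the two days per participant as a rough proxy for usual eating duration,
# then assign each person to an exposure category.
usual = recalls.groupby("participant")["window_hr"].mean()
print(usual.map(classify_window))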

During a median follow-up of 8 years, there were 2797 deaths due to any cause, including 840 CV deaths and 643 cancer deaths. 

In the overall sample, compared with an eating duration of 12-16 hours, 8-hour TRE was significantly associated with an increased risk for CV mortality (hazard ratio [HR], 1.91; 95% CI, 1.20-3.03).

This association was also observed in adults with CVD (HR, 2.07; 95% CI, 1.14-3.78) and adults with cancer (HR, 3.04; 95% CI, 1.44-6.41). 

Other eating durations were not associated with CV mortality, except for eating duration of 8 to less than 10 hours in people with CVD (HR, 1.66; 95% CI, 1.03-2.67). 

No significant associations were found between eating duration and all-cause or cancer mortality in the overall sample and CVD/cancer subsamples, except that eating duration of more than 16 hours was associated with a lower risk for cancer mortality in people with cancer (HR, 0.47; 95% CI, 0.23-0.95).
 

Quality More Important Than Timing 

Dr. Zhong noted that the study doesn’t address the underlying mechanisms driving the observed association between 8-hour TRE and CV death. 

“However, we did observe that people who restricted eating to a period less than 8 hours per day had less lean muscle mass compared with those with typical eating duration of 12-16 hours. Loss of lean body mass has been linked to higher risk of cardiovascular mortality,” Dr. Zhong said. 

“Based on the evidence as of now, focusing on what people eat appears to be more important than focusing on the time when they eat. There are certain dietary approaches with compelling health benefits to choose, such as DASH diet and Mediterranean diet,” Dr. Zhong said.

Intermittent fasting is “certainly an interesting concept and one on which the potential mechanisms underlying the improvements in short outcome studies and preclinical studies in animals are strongly being pursued,” Sean P. Heffron, MD, cardiologist at the Center for the Prevention of Cardiovascular Disease at NYU Langone Heart, New York, who wasn’t involved in the study, told this news organization. 

Dr. Heffron expressed skepticism about the study results, calling them “far from complete,” and noted that the dietary data were based on only 2 days of diet records without correction for confounding variables.

Dr. Heffron also noted that the restricted-eating group included more smokers and more men. “I would strongly anticipate that once appropriate corrections are made, the findings will no longer persist in statistical significance,” he said.

He emphasized the need for more rigorous research before making clinical recommendations. When patients ask about intermittent fasting, Dr. Heffron said he tells them, “If it works for you, that’s fine,” but he doesn’t provide a recommendation for or against it. 

Funding for the study was provided by the National Key Research and Development Program of China and the National Science Foundation of China. Dr. Zhong, Dr. Heffron, and Dr. Gardner have no relevant disclosures.
 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

A new study raises a cautionary note on time-restricted eating (TRE), a type of intermittent fasting that is gaining popularity.

The observational analysis of over 20,000 US adults showed that those who limited their eating to a period of less than 8 hours per day had a higher risk for cardiovascular mortality compared with peers who ate across the typical 12-16 hours per day. This was the case in the overall sample and in those with cardiovascular disease (CVD) or cancer.

Lead author Victor Wenze Zhong, PhD, cautioned that the findings “require replication and we cannot demonstrate 8-hour TRE causes cardiovascular death in this observational study.

“However, it’s important for patients, particularly those with existing heart conditions or cancer, to be aware of the positive association between an 8-hour eating window and cardiovascular death,” Dr. Zhong, professor and chair, Department of Epidemiology and Biostatistics, School of Public Health, Shanghai Jiao Tong University School of Medicine, Shanghai, China, told this news organization. 

The results (Abstract P192) were presented March 18 at the American Heart Association (AHA) Epidemiology and Prevention/Lifestyle and Cardiometabolic Health Scientific Sessions 2024.
 

‘Provocative’ Results 

Short-term randomized controlled trials have suggested that 8-hour TRE may improve cardiometabolic risk profiles, but the potential long-term effects of this eating pattern are unknown. 

The observation that TRE may have short-term benefits but long-term adverse effects is “interesting and provocative” and needs further study, Christopher D. Gardner, PhD, professor of medicine at Stanford University in California, who wasn’t involved in the study, said in a conference statement, and he agreed that much more research is needed. 

The researchers analyzed data on dietary patterns for 20,078 adults (mean age, 48 years; 50% men; 73% non-Hispanic White) who participated in the 2003-2018 National Health and Nutrition Examination Surveys (NHANES). All of them completed two 24-hour dietary recall questionnaires within the first year of enrollment. Deaths through the end of 2019 were determined via the National Death Index.

During a median follow-up of 8 years, there were 2797 deaths due to any cause, including 840 CV deaths and 643 cancer deaths. 

In the overall sample, compared with an eating duration of 12-16 hours, 8-hour TRE was significantly associated with an increased risk for CV mortality (hazard ratio [HR], 1.91; 95% CI, 1.20-3.03).

This association was also observed in adults with CVD (HR, 2.07; 95% CI, 1.14-3.78) and adults with cancer (HR, 3.04; 95% CI, 1.44-6.41). 

Other eating durations were not associated with CV mortality, except for eating duration of 8 to less than 10 hours in people with CVD (HR, 1.66; 95% CI, 1.03-2.67). 

No significant associations were found between eating duration and all-cause or cancer mortality in the overall sample and CVD/cancer subsamples, except that eating duration of more than 16 hours was associated with a lower risk for cancer mortality in people with cancer (HR, 0.47; 95% CI, 0.23-0.95).
 

Quality More Important Than Timing 

Dr. Zhong noted that the study doesn’t address the underlying mechanisms driving the observed association between 8-hour TRE and CV death. 

“However, we did observe that people who restricted eating to a period less than 8 hours per day had less lean muscle mass compared with those with typical eating duration of 12-16 hours. Loss of lean body mass has been linked to higher risk of cardiovascular mortality,” Dr. Zhong said. 

“Based on the evidence as of now, focusing on what people eat appears to be more important than focusing on the time when they eat. There are certain dietary approaches with compelling health benefits to choose, such as DASH diet and Mediterranean diet,” Dr. Zhong said.

Intermittent fasting is “certainly an interesting concept and one on which the potential mechanisms underlying the improvements in short outcome studies and preclinical studies in animals are strongly being pursued,” Sean P. Heffron, MD, cardiologist at the Center for the Prevention of Cardiovascular Disease at NYU Langone Heart, New York, who wasn’t involved in the study, told this news organization. 

Dr. Heffron expressed skepticism about the study results calling them “far from complete” and noted that data on diet was based on only 2-day diet records without correction for confounding variables. 

Dr. Heffron also noted that the restricted-eating group included more smokers and more men. “I would strongly anticipate that once appropriate corrections are made, the findings will no longer persist in statistical significance,” Dr. Heffron said.

He emphasized the need for more rigorous research before making clinical recommendations. When patients ask about intermittent fasting, Dr. Heffron said he tells them, “If it works for you, that’s fine,” but he doesn’t provide a recommendation for or against it. 

Funding for the study was provided by the National Key Research and Development Program of China and the National Science Foundation of China. Dr. Zhong, Dr. Heffron, and Dr. Gardner have no relevant disclosures.
 

A version of this article appeared on Medscape.com.

