Specialty pharmacists may speed time to MS treatment

Specialty pharmacists play a key and growing role in navigating the complexities of initiating disease-modifying therapies (DMTs) for multiple sclerosis (MS), resulting in earlier treatment, new data suggest.

“As DMT management and treatment options for MS symptoms become more complex, clinical pharmacists can be utilized for medication education and management,” Jenelle Hall Montgomery, PharmD, a clinical pharmacist practitioner at the Multiple Sclerosis and Neuroimmunology Division, department of neurology, Duke University Hospital, Durham, N.C., told delegates attending the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).

Since 2018, more than half a dozen DMTs have been approved for MS by the U.S. Food and Drug Administration. However, there is currently no established DMT selection algorithm, which creates a need for specialty pharmacists, she added.

“DMT approvals by the FDA have outpaced MS guideline recommendations. This can be overwhelming for patients, especially now that they have so many options to choose from,” she said.

Key services provided by specialty pharmacists include coordinating pretreatment requirements, as well as help with dosing, side effects, safety monitoring, and treatment adherence. In addition, pharmacists help with switching therapies, dispensing, and cost and authorization problems.

In reporting on improvements associated with specialty pharmacists, researchers from prominent MS centers around the country described specific outcomes.
 

Aids early intervention

A report on the Kaiser Permanente Washington (KPWA) MS Pharmacy Program detailed significant reductions in the time to address patients’ needs through the use of specialty pharmacists. In an assessment of 391 referrals to the program from 2019 to 2020, the average total time spent per patient per year dropped from 145 minutes in 2019 to 109 minutes in 2020.

Services included assessment of medication adherence, adverse drug reaction consultation, lab monitoring, patient counseling on initiation of a DMT, shared decision making, and follow-up visits.

“The KPWA MS Pharmacy Program plays an integral role in the care of patients with MS. The MS clinical pharmacists ensure patients are well informed about their DMT options and are fully educated about selected treatment,” the investigators noted.

A report on an outpatient MS clinic at Emory Healthcare, Atlanta, described how use of specialty pharmacist services resulted in a 49% reduction in time to treatment initiation with fingolimod. The time decreased from 83.9 days to 42.9 days following the introduction of specialty pharmacist services.

“Integration of a clinical pharmacy specialist in the therapeutic management of MS patients is crucial to early intervention with disease-modifying therapy,” the investigators noted.

A report on the specialty pharmacy services provided at Johns Hopkins MS Precision Medicine Center of Excellence, Baltimore, described an evaluation of 708 assessments between July 2019 and June 2020. Results showed that the vast majority (98%) of patients reported no missed days from work or school due to MS-related symptoms and that 99.3% reported no hospitalizations due to MS relapses, which are both key measures of MS treatment adherence.

High patient satisfaction

Patients reported high satisfaction with the in-house pharmacy on the National Association of Specialty Pharmacy’s patient satisfaction survey, giving it an average score of 82, compared with 79 for external specialty pharmacies.

“Moreover, patients were highly satisfied with the services provided at the pharmacy and were likely to continue receiving their comprehensive pharmacy care at our institution,” the researchers reported.

The study “highlights the value of pharmacists’ involvement in patient care and supports the need for continuation of integrated clinical services in health system specialty pharmacy,” the investigators noted.

CMSC President Scott D. Newsome, DO, director of the Neurosciences Consultation and Infusion Center at Green Spring Station, Lutherville, Maryland, and associate professor of neurology at Johns Hopkins University School of Medicine, said that as a clinician, he is highly satisfied with the specialty pharmacy services for MS at Johns Hopkins.

“Our pharmacists are fantastic in communicating with the prescriber if something comes up related to medication safety or they are concerned that the patient isn’t adhering to the medication,” Dr. Newsome said.

He noted that in addition to helping to alleviate the burden of a myriad of tasks associated with prescribing for patients with MS, specialty pharmacists may have an important impact on outcomes, although more data are needed.

“Having a specialty pharmacy involved in the care of our patients can help navigate the challenges associated with the process of obtaining approval for DMTs,” he said. “We know how important it is to expedite and shorten the time frame from writing the prescription to getting the person on their DMT.”
 

Telemedicine, other models

Although integrated specialty pharmacist services may seem out of reach for smaller MS clinics, the use of telemedicine and other models may help achieve similar results.

“A model I have seen is having pharmacists split their time between a specialty pharmacy and the MS clinic,” said Dr. Montgomery.

“A telemedicine model can also be utilized, in which a pharmacist can reach out to patients by telephone or through video visits. This would allow a pharmacist to be utilized for multiple clinics or as an MS specialist within a specialty pharmacy,” she added.

Whether provided in house or through telemedicine, a key benefit for clinicians is freeing up valuable time, which has a domino effect that improves quality all around.

“In addition to improving safety outcomes, specialty pharmacists help with the allocation of clinic staff to other clinic responsibilities, and the utilization of services by patients results in more resources allocated for their care,” Dr. Montgomery said.

Dr. Montgomery is a nonpromotional speaker for Novartis and is on its advisory board.

A version of this article first appeared on Medscape.com.

Cannabis use common for MS-related spasticity

Use of cannabis is common in patients with multiple sclerosis (MS), especially for the treatment of MS-related spasticity, new research suggests. Findings from a survey conducted through a large registry in 2020 showed that 31% of patients with MS reported trying cannabis to treat their symptoms – and 20% reported regular use.

Spasticity was cited by 80% of users as the reason for cannabis use, while pain was cited by 69% and sleep problems/insomnia by 61%.

Investigators noted that the new data reflect the latest patterns of use amid sweeping changes in recreational and medical marijuana laws.

“Interest in the use of cannabis for managing MS symptoms continues to increase as more data become available and access becomes easier,” co-investigator Amber Salter, PhD, associate professor, UT Southwestern Medical Center, Dallas, told attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
 

Administration routes vary

The survey was conducted through the longitudinal North American Research Committee on Multiple Sclerosis (NARCOMS) Registry, a voluntary, self-report registry for patients with MS. Of 6,934 registry participants invited to participate, 3,249 (47%) responded. The majority of responders were women (79%) and the mean age was 61 years. About 63% were being treated with disease-modifying therapies.

Overall, 31% of respondents reported having used cannabis to treat their MS symptoms. In addition, 20% reported regular current cannabis use, with an average use of 20 days in the past month. As many as 40% of the current users reported using cannabis daily.

“In general we saw some small differences in current users, who tended to include more males; have higher spasticity, pain, and sleep symptoms; and [were] more likely to be unemployed and younger,” Dr. Salter said.

The most common forms of cannabis administration were smoking (33%) and eating (20%). In addition, 12% reported vaporizing cannabis with a highly concentrated material, 11% administered cannabis sublingually, and 11% reported swallowing it.

Further, 8% reported vaporizing cannabis as a dried flower, 5% used it topically, and 1% reported drinking it.

Of note, the definition of “cannabis/marijuana” in the study excluded hemp cannabidiol (CBD) or products marketed as CBD only.
 

Consistent use

The most common reason for use by far was spasticity (80%), followed by pain (69%) and sleep/insomnia problems (61%). Among users, 37% reported using cannabis to treat all three of those problems.

Regarding other symptoms, 36% used cannabis for anxiety, 24% for depression, 18% for overactive bladder, 17% for nausea or gastrointestinal problems, 16% for migraine or headaches, 14% for tremors, and 6% for other purposes.

The vast majority (95%) reported cannabis to be very or somewhat helpful for their symptoms.

Among the 69% of respondents who reported not using cannabis for their MS symptoms, the most commonly cited reasons were a lack of evidence on efficacy (40%) or safety (27%), concerns about legality (25%), lack of insurance coverage (22%), prohibitive cost (18%), and adverse side effects.

Surprisingly, the dramatic shift in the legalization of cannabis use in many states does not appear to be reflected in changes in cannabis use for MS, Dr. Salter said.

“We conducted an anonymous NARCOMS survey a couple of years prior to this survey, and our results are generally consistent. There’s been a small increase in the use and an acceptance or willingness to consider cannabis, but it’s relatively consistent,” she said.

“Despite the changes in access, the landscape hasn’t really changed very much in terms of evidence of the effects on MS symptoms, so that could be why,” Dr. Salter added.

Most patients appear to feel comfortable discussing their cannabis use with their physician, with 75% reporting doing so. However, the most common primary source of medical guidance for treating MS with cannabis was “nobody/self”; for 20%, the source for medical guidance was a dispensary professional.

As many as 62% of respondents reported obtaining their cannabis products from dispensaries, while other sources included family/friend (18%) or an acquaintance (13%). About 31% reported their most preferred type of cannabis to be equal parts THC and cannabidiol, while 30% preferred high THC/low cannabidiol.

Mirrors clinical practice findings

Commenting on the study, Laura T. Safar, MD, vice chair of Psychiatry at Lahey Hospital and Medical Center and assistant professor of psychiatry at Harvard Medical School, Boston, said the findings generally fall in line with cannabis use among patients with MS in her practice.

“This is [consistent] with my general experience: A high percentage of my patients with MS are using cannabis with the goal of addressing their MS symptoms that way,” said Dr. Safar, who was not involved with the research.

One notable recent change in patients’ inquiries about cannabis is their apparent confidence in the information they’re getting, she noted. This reflects the ever-expanding sources of information, which may or may not have an understanding of cannabis effects in MS, she added.

“What seems new is a certain level of specificity in the information patients state – regardless of its accuracy. There is more technical information widely available about cannabis online and in the dispensaries,” said Dr. Safar.

“A lot of that information may not have been tested scientifically, but it is presented with an aura of truth,” she said.

While misconceptions about cannabis use in MS may not be new, “the conviction with which they are stated and believed seems stronger,” even though they have been validated only by sources of questionable expertise, Dr. Safar noted.

She pointed out that psychiatric effects are among her patients’ notable concerns about cannabis use in MS.

“Cannabis use, especially daily use in moderate to large amounts, can have negative cognitive side effects,” she said. “In addition, it can have other psychiatric side effects: worsening of mood and anxiety, apathy, and anhedonia, a lack of pleasure or enjoyment, and a flattening of the emotional experience.”
 

Countering misinformation

Dr. Safar said she works to counter misinformation and provide more reliable, evidence-based recommendations.

“I educate my patients about what we know from scientific trials about the potential benefits, including possible help with pain (excluding central pain) and with spasticity,” she said. Dr. Safar added that she also discusses possible risks, such as worsening of cognition, mood, and anxiety.

On the basis of an individual’s presentation, and working in collaboration with their neurologist as appropriate, Dr. Safar said she discusses the following issues with the patient:

  • Does cannabis make sense for the symptoms being presented?
  • Has the patient received benefit so far?
  • Are there side effects they may be experiencing?
  • Would it be appropriate to lower the cannabis dose/frequency of its use?
  • If a patient is using cannabis with an objective that is not backed up by the literature, such as depression, are they open to information about other treatment options?

The study was sponsored by GW Research. Dr. Salter has conducted research for GW Pharmaceuticals companies. Dr. Safar has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

New consensus guideline on clinical MRI use in MS

An updated consensus guideline on routine clinical use of magnetic resonance imaging (MRI) in multiple sclerosis (MS) has been released collaboratively by three international expert groups.

The guideline represents a collaboration between the Consortium of Multiple Sclerosis Centers (CMSC), the European-based Magnetic Resonance Imaging in Multiple Sclerosis (MAGNIMS) network, and North American Imaging in Multiple Sclerosis (NAIMS).

Among its recommendations for improving diagnosis and management of MS is the establishment of much-needed ways to boost protocol adherence. “The key part of these recommendations that we want to emphasize is how important it is for them to be used,” said David Li, MD, University of British Columbia, Vancouver, and cochair of the MRI guideline committee.

Dr. Li noted that there was widespread noncompliance among MRI centers with the 2018 CMSC guidelines for imaging in MS. This potentially compromised clinicians’ ability to identify lesions that allow for earlier and more confident diagnoses and to monitor for disease changes that may necessitate initiating or changing therapy, he said.

“The key to being able to know that brain changes have occurred in patients over time is to have scans that have been performed using standardized protocols – to be certain that the change is truly the result of a change in disease activity and progression and not erroneously due to differences resulting from different MRI scanning procedures,” he said to attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).

The guideline was also published this summer as a position paper in Lancet Neurology.

Key recommendations

The new guideline covers a broad range of imaging topics, with key areas of focus including the use of three-dimensional imaging, when and when not to use gadolinium contrast, and spinal cord imaging.

For example, a 3 Tesla (3 T) magnet strength is preferred when imaging the brain with MRI because of its increased sensitivity for detecting lesions, but a magnet strength of at least 1.5 T can also be used. For the spinal cord, there is no advantage of 3 T over 1.5 T, the guideline notes.

Other recommendations include:

  • Core sequences for the brain should include sagittal and axial T2-weighted 3D fluid-attenuated inversion recovery (FLAIR), along with axial T2-weighted and diffusion-weighted sequences.
  • 3D acquisition, which is now available on most scanners, is preferable to 2D acquisitions.
  • Use of the subcallosal plane for consistent and reproducible alignment of axial scans is again emphasized, as it allows for easier and more confident comparison of follow-up studies to detect changes over time.
  • At least two of three sagittal sequences are recommended for spinal cord MRI.
  • The judicious use of macrocyclic gadolinium-based contrast agents (GBCA) is reemphasized because of its invaluable role in specific circumstances.
  • However, for routine follow-up monitoring for subclinical disease activity, high-quality nonenhanced scans will allow for identification of new or enlarging T2 lesions without the need for GBCA.
  • A new baseline brain MRI scan without gadolinium is recommended at least 3 months after treatment initiation, with annual follow-up scans without gadolinium.

For the diagnosis of MS, imaging of the entire spinal cord, as opposed to only the cervical segments, is recommended for the detection of lesions in the lower thoracic spinal segments and conus. However, 1.5-T scans are acceptable in that imaging, as 3-T scans provide no advantage. For routine follow-up monitoring, spinal cord MRI is optional.

“The current guidelines do not recommend routine follow-up spinal cord MRI, as it remains technically challenging and would disproportionately increase the scanning time, however experienced centers have the option to do so as a small number of asymptomatic spinal cord lesions do develop on follow-up,” the authors noted.

“However, follow up spinal cord MRI is recommended in special circumstances, including unexpected disease worsening and the possibility of a diagnosis other than multiple sclerosis,” they added.

Although the central vein sign has gained significant interest as a potential biomarker of inflammatory demyelination to help distinguish between MS and non-MS lesions, the 2021 protocol does not currently recommend imaging for the feature. However, those recommendations may change in future guidelines, the authors noted.

Low protocol adherence

The ongoing lack of adherence to guidelines, which has resulted in frustrating inconsistencies in imaging, was documented in no fewer than four studies presented at the meeting. They showed compliance with standard protocols to be strikingly poor.

Among the studies was one presented by Anthony Traboulsee, MD, professor and MS Society of Canada research chair at the University of British Columbia, Vancouver. Findings showed that only about half of scans acquired in a real-world dataset satisfied the 2018 CMSC Standardized Brain MRI recommendations.

“Of note was that all the scans that were compliant were acquired in 3D while none of the 2D-acquired sequences were adherent,” Dr. Li commented.

Another study assessed use of standardized MRI protocols in a pragmatic, multisite MS clinical trial, the Traditional vs. Early Aggressive Therapy in Multiple Sclerosis (TREAT-MS) trial. Results showed that, upon enrollment, only 10% of scans followed CMSC guidelines for all three structural contrasts.

In that study, when the images provided by Johns Hopkins University School of Medicine were excluded, only 2.75% of the remaining scans met the criteria.

“Despite the importance of standardization of high-quality MRIs for the monitoring of people with MS, adoption of recommended imaging remains low,” the investigators wrote.

Resistance to change?

Commenting on the research and new guideline, Blake E. Dewey, a PhD student in the department of electrical and computer engineering at Johns Hopkins University, Baltimore, speculated that the noncompliance is often simply a matter of resistance to change.

“There are a number of reasons that are given for the retention of older, noncompliant MRI scans at different institutions, such as timing and patient throughput; but in my mind the issue is institutional inertia,” he said.

“It is difficult in many instances to get the clinician [radiologist] and institutional buy-in to make these kinds of changes across the board,” Mr. Dewey noted.

“The most common protocol that we see acquired is a set of 2D, low-resolution images with gaps between slices. These are simply not sufficient given modern MRI technology and the needs of MS clinicians,” he added.

Importantly, Mr. Dewey noted that, through direct communication with imaging staff and practitioners in the trial, compliance increased substantially – nearly 20-fold, “indicating a real possibility for outreach, including to commonly used outpatient radiology facilities.”

The updated MAGNIMS-CMSC-NAIMS MRI protocol is beneficial in providing “simple, reasonable guidelines that can be easily acquired at almost any imaging location in the U.S., and much of the rest of the world,” he said.

“As imaging researchers, we often reach for more than is needed clinically to properly diagnose and monitor a patient’s disease,” Mr. Dewey added. “This updated protocol has ‘trimmed the fat’ and left some discretion to institutions, which should help with compliance.”

Mr. Dewey said he also encourages imaging professionals to consider performing the sequences described as “optional” as well.

“Some of these are useful in measuring potential biomarkers currently under extensive validation, such as brain volumetrics and the central vein sign, that may help patient populations that are currently underserved by more traditional imaging, such as progressive patients and patients that could be potentially misdiagnosed,” he said.


Spreading the word

In the meantime, as part of its own outreach efforts, the CMSC is providing laminated cards that detail the 2021 updated MRI protocol in simplified tables. This makes it easy for centers to access the information and for patients to help improve awareness of the protocol.

“We are urging clinicians to provide the cards to their MS patients and have them present the cards to their imaging center,” Dr. Li said. “This effort could make such an important difference in helping to encourage more to follow the protocol.”

Clinicians and patients alike can download the MRI protocol card from the CMSC website.

A version of this article first appeared on Medscape.com.

Meeting/Event
Issue
Neurology Reviews - 29(12)
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

An updated consensus guideline on routine clinical use of magnetic resonance imaging in multiple sclerosis (MS) has been released collaboratively by three international expert groups.

The guideline represents a collaboration between the Consortium of Multiple Sclerosis Centers, the European-based Magnetic Resonance Imaging in Multiple Sclerosis, and North American Imaging in Multiple Sclerosis.

Among its recommendations for improving diagnosis and management of MS is the establishment of much-needed ways to boost protocol adherence. “The key part of these recommendations that we want to emphasize is how important it is for them to be used,” said David Li, MD, University of British Columbia, Vancouver, and cochair of the MRI guideline committee.

Dr. Li noted that there was a widespread lack of adherence among MRI centers to compliance with the 2018 CMSC guidelines in imaging for MS. This potentially compromised clinicians’ ability to identify lesions that allow for earlier and confident diagnoses and to monitor for disease changes that may necessitate the initiation or change of therapy, he said.

“The key to being able to know that brain changes have occurred in patients over time is to have scans that have been performed using standardized protocols – to be certain that the change is truly the result of a change in disease activity and progression and not erroneously due to differences resulting from different MRI scanning procedures,” he said to attendees at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).

The guideline was also published this summer as a position paper in Lancet Neurology.

Key recommendations

The new guideline covers a broad range of imaging topics, with key areas of focus including the use of three-dimensional imaging, when and when not to use gadolinium contrast, and spinal cord imaging.

For example, a 3 Tesla magnet strength is preferred when imaging the brain with MRI because of its increased sensitivity for detecting lesions – but a minimum magnet strength of at least 1.5 T can also be used. For the spinal cord, there is no advantage of 3 T over 1.5 T, the guideline notes.

Other recommendations include:

  • Core sequences for the brain should include sagittal and axial T2-weighted 3D fluid-attenuated inversion recovery (FLAIR), along with axial T2-weighted and diffusion-weighted sequences.
  • 3D acquisition, which is now available on most scanners, is preferable to 2D acquisitions.
  • Use of the subcallosal plane for consistent and reproducible alignment of axial scans is again emphasized, as it allows for easier and more confident comparison of follow-up studies to detect changes over time.
  • At least two of three sagittal sequences are recommended for spinal cord MRI.
  • The judicious use of macrocyclic gadolinium-based contrast agents (GBCA) is reemphasized because of its invaluable role in specific circumstances.
  • However, for routine follow-up monitoring for subclinical disease activity, high-quality nonenhanced scans will allow for identification of new or enlarging T2 lesions without the need for GBCA.
  • A new baseline brain MRI scan without gadolinium is recommended at least 3 months after treatment initiation, with annual follow-up scans without gadolinium.
 

 

For the diagnosis of MS, imaging of the entire spinal cord, as opposed to only the cervical segments, is recommended for the detection of lesions in the lower thoracic spinal segments and conus. However, 1.5-T scans are acceptable in that imaging, as 3-T scans provide no advantage. For routine follow-up monitoring, spinal cord MRI is optional.

“The current guidelines do not recommend routine follow-up spinal cord MRI, as it remains technically challenging and would disproportionately increase the scanning time, however experienced centers have the option to do so as a small number of asymptomatic spinal cord lesions do develop on follow-up,” the authors noted.

“However, follow up spinal cord MRI is recommended in special circumstances, including unexpected disease worsening and the possibility of a diagnosis other than multiple sclerosis,” they added.

Although the central vein sign has gained significant interest as a potential biomarker of inflammatory demyelination to help distinguish between MS and non-MS lesions, the 2021 protocol does not currently recommend imaging for the feature. However, those recommendations may change in future guidelines, the authors noted.

Low protocol adherence

The ongoing lack of adherence to guidelines, which has resulted in frustrating inconsistencies in imaging, was documented in no fewer than four studies presented at the meeting. They showed compliance with standard protocols to be strikingly poor.

Among the studies was one presented by Anthony Traboulsee, MD, professor at the University of British Columbia in Vancouver and research chair of the MS Society of Canada. Findings showed that only about half of scans acquired in a real-world dataset satisfied the 2018 CMSC Standardized Brain MRI recommendations.

“Of note was that all the scans that were compliant were acquired in 3D while none of the 2D-acquired sequences were adherent,” Dr. Li commented.

Another study assessed use of standardized MRI protocols in a pragmatic, multisite MS clinical trial, the Traditional vs. Early Aggressive Therapy in Multiple Sclerosis (TREAT-MS) trial. Results showed that, upon enrollment, only 10% of scans followed CMSC guidelines for all three structural contrasts.

In that study, when the images provided by Johns Hopkins University Medical School were excluded, only 2.75% of the remaining scans met the criteria.

“Despite the importance of standardization of high-quality MRIs for the monitoring of people with MS, adoption of recommended imaging remains low,” the investigators wrote.

Resistance to change?

Commenting on the research and new guideline, Blake E. Dewey, a PhD student in the department of electrical and computer engineering at Johns Hopkins University, Baltimore, speculated that the noncompliance is often simply a matter of resistance to change.

“There are a number of reasons that are given for the retention of older, noncompliant MRI scans at different institutions, such as timing and patient throughput; but in my mind the issue is institutional inertia,” he said.

“It is difficult in many instances to get the clinician [radiologist] and institutional buy-in to make these kinds of changes across the board,” Mr. Dewey noted.

“The most common protocol that we see acquired is a set of 2D, low-resolution images with gaps between slices. These are simply not sufficient given modern MRI technology and the needs of MS clinicians,” he added.

Importantly, Mr. Dewey noted that, through direct communication with imaging staff and practitioners in the trial, compliance increased substantially – nearly 20-fold, “indicating a real possibility for outreach, including to commonly used outpatient radiology facilities.”

The updated MAGNIMS-CMSC-NAIMS MRI protocol is beneficial in providing “simple, reasonable guidelines that can be easily acquired at almost any imaging location in the U.S., and much of the rest of the world,” he said.

“As imaging researchers, we often reach for more than is needed clinically to properly diagnose and monitor a patient’s disease,” Mr. Dewey added. “This updated protocol has ‘trimmed the fat’ and left some discretion to institutions, which should help with compliance.”

Mr. Dewey said he also encourages imaging professionals to consider performing the sequences described as “optional.”

“Some of these are useful in measuring potential biomarkers currently under extensive validation, such as brain volumetrics and the central vein sign, that may help patient populations that are currently underserved by more traditional imaging, such as progressive patients and patients that could be potentially misdiagnosed,” he said.

 

 

Spreading the word

In the meantime, as part of its own outreach efforts, the CMSC is providing laminated cards that detail the 2021 updated MRI protocol in simplified tables. The cards make it easy for centers to access the information and help patients become more aware of the protocol.

“We are urging clinicians to provide the cards to their MS patients and have them present the cards to their imaging center,” Dr. Li said. “This effort could make such an important difference in helping to encourage more to follow the protocol.”

Clinicians and patients alike can download the MRI protocol card from the CMSC website.

A version of this article first appeared on Medscape.com.

 

Two diets linked to improved cognition, fatigue in MS

Article Type
Changed
Mon, 11/29/2021 - 11:04

Both a Paleolithic elimination diet (Wahls diet) and a low-saturated-fat diet (Swank diet) are associated with improved cognition, among other clinical outcomes, in relapsing-remitting multiple sclerosis (RRMS), new research suggests.

In a randomized study of patients with RRMS, the group that followed a Wahls diet and the group that followed a Swank diet both showed significant, unique improvement in measures of cognitive dysfunction, fatigue, and quality of life.

“Several dietary intervention studies have demonstrated favorable results on MS-related fatigue and quality of life. However, these results are among the first to show favorable reductions in cognitive dysfunction,” said co-investigator Tyler Titcomb, PhD, department of internal medicine, University of Iowa, Iowa City.

The results were presented at the 2021 Annual Meeting of the Consortium of Multiple Sclerosis Centers (CMSC).
 

Similar diets

The CMSC findings came from a secondary analysis of a randomized trial published online in July in the Multiple Sclerosis Journal Experimental, Translational, and Clinical (MSJ-ETC).

The primary analysis of the single-blind, parallel group, randomized trial showed the Wahls and Swank diets were linked to significant improvement in outcomes on the Fatigue Severity Scale (FSS), the Modified Fatigue Impact Scale (MFIS), and other measures among participants with RRMS. There were no significant differences between the two dietary regimens.

The Swank diet restricts saturated fat to a maximum of 15 g per day while providing 20 g to 50 g (4 to 10 teaspoons) of unsaturated fat per day, with four servings each of whole grains, fruits, and vegetables.

The Wahls diet recommends six to nine servings of fruits and vegetables per day, in addition to 6 to 12 ounces of meat per day, according to gender. Grains, legumes, eggs, and dairy, with the exception of clarified butter or ghee, are not permitted on this diet. Both diets eschew processed foods.
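
As a rough illustration of how the two regimens differ, the sketch below encodes the headline daily rules described above and flags a hypothetical day’s intake that breaks them. The thresholds come from this article; the field names, the sample intake, and the helper function are invented for demonstration and are not part of either published protocol.

```python
# Hedged sketch: the headline daily rules of the two diets, as described in
# this article, encoded as data. Field names and the sample day's intake are
# invented for illustration; the published protocols contain more guidance.
SWANK = {
    "max_saturated_fat_g": 15,
    "unsaturated_fat_g_range": (20, 50),
    "servings_each_grains_fruit_veg": 4,
}
WAHLS = {
    "fruit_veg_servings_range": (6, 9),
    "meat_oz_range": (6, 12),
    "excluded_foods": {"grains", "legumes", "eggs", "dairy (except ghee)"},
}

def check_swank_day(day: dict) -> list[str]:
    """Flag departures from the Swank fat rules for one day (illustrative)."""
    issues = []
    if day["saturated_fat_g"] > SWANK["max_saturated_fat_g"]:
        issues.append("saturated fat exceeds 15 g")
    low, high = SWANK["unsaturated_fat_g_range"]
    if not low <= day["unsaturated_fat_g"] <= high:
        issues.append("unsaturated fat outside the 20-50 g range")
    return issues or ["within the checked fat limits"]

print(check_swank_day({"saturated_fat_g": 18, "unsaturated_fat_g": 35}))
# ['saturated fat exceeds 15 g']
```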

To further evaluate the diets’ effects on perceived fatigue and cognitive dysfunction, the researchers returned to the trial, which enrolled 95 adults with stable RRMS at the University of Iowa Prevention Intervention Center between August 2016 and May 2019.

After a 12-week run-in period with support and education from registered dietitians, participants were randomly assigned to either the Swank or Wahls diets in a 24-week intervention that did not include dietitian support.

Inclusion criteria included moderate to severe fatigue, as shown by an FSS score of at least 4.0, and the absence of severe mental impairment, an eating disorder, or liver or kidney disease. There were no significant differences in baseline demographic or clinical characteristics between the groups.

Of the patients, 77 completed the 12-week run-in (38 in the Swank diet group and 39 in the Wahls group). A total of 72 participants completed the 24-week follow-up (37 and 35, respectively).
 

Reduction in fatigue, cognitive dysfunction

After the researchers controlled for smoking, alcohol consumption, age, sex, baseline 6-minute walk test distance, body mass index, serum vitamin D, and years with MS, results at 12 and 24 weeks showed significant improvements from baseline in the key outcomes of fatigue and cognitive function, as measured by the Fatigue Scale for Motor and Cognitive Function (FSMC).

FSMC score changes from baseline at 12 and 24 weeks were −5.7 and −9.0, respectively, for the Swank diet group and −9.3 and −14.9 for the Wahls group (P ≤ .001 for all comparisons).

In addition, there was a significant reduction in both groups on the total Perceived Deficits Questionnaire (PDQ) at 12 and 24 weeks (Swank, −7.4 and −6.3, respectively; Wahls, −6.8 and −10.8; P ≤ .001 for all).

There were similar improvements for both diets in an analysis of the mental and physical scores on FSMC and on the subscales on PDQ of attention, retrospective memory, prospective memory, and planning.

As observed in the primary analysis, there were no significant differences between the two groups in absolute mean scores on FSMC, PDQ, or their subscales at any timepoint.

“Both diets led to significant reductions in fatigue and cognitive dysfunction,” Dr. Titcomb said.

Of note, the primary analysis further showed a statistically and clinically significant 6% increase in the 6-minute walk test distance at 24 weeks in the Wahls group (P = .007). After removal of nonadherent participants, the improvement remained significant at 24 weeks in the Wahls group (P = .02), as well as in the Swank group (P = .001).

Dr. Titcomb noted that the majority of study participants were taking disease-modifying therapies (DMTs). However, there were no interactions between any specific DMTs and dietary benefits.
 

Potential mechanisms

Although the similar outcomes between the diets point to a common mechanism, there are also various other possibilities, said Dr. Titcomb. These include modulation of the microbiome, inflammation, immune system, or micronutrient optimization, he said.

Previous research has shown reduced mass and diversity in the gut microbiota among patients with MS compared with those without MS, potentially promoting inflammation. Other research has shown improvements in those factors with dietary modification.

While there is no evidence of gut microbiota changes with the Wahls and Swank diets, each is rich in fiber and plant-derived phytochemicals, which are known to be associated with improvements in gut microbiota and neuroinflammation, the investigators noted.

Dr. Titcomb reported that research into the diets is continuing as they evaluate longer-term and other effects. “This trial was a short-term parallel arm trial that did not include MRI or a control group,” he said, adding that the investigators will soon start recruiting for a follow-up study that will include a control group, long-term follow-up, and MRIs.

That upcoming study “has the potential to answer several of the unknown questions regarding the effect of diet on MS,” Dr. Titcomb said.
 

Notable research with limitations

Commenting on the study, Rebecca Spain, MD, MSPH, associate professor of neurology at the Oregon Health & Science University, and associate director of clinical care at VA MS Center of Excellence West in Portland, said there were several notable findings.

Among these was the finding that most people with MS “were able to adhere to the protocols for significant lengths of time, even without the support of dietitians for the final 12 weeks of the study,” said Dr. Spain, who was not involved with the research.

A significant limitation was the lack of a control group. Without that, “it’s hard to know for sure if the improvements in fatigue and cognition were from the diets or were simply from the social support of participating in a research study,” she said.

Nevertheless, trials reporting on dietary effects in MS such as the current study are important, Dr. Spain noted. They demonstrate “that it is feasible and safe to conduct dietary studies and suggest which key MS symptoms may benefit and should be evaluated in future studies.”

“Critically, diet studies address one of the most frequent concerns of people with MS, promoting self-management and empowerment,” Dr. Spain concluded.

General guidelines for common dietary elements with evidence of improving fatigue, cognition, and mood are available on the National MS Society’s website.

The study received no outside funding. Dr. Titcomb and Dr. Spain have disclosed no relevant financial relationships. Terry L. Wahls, MD, who developed the Wahls diet, was a senior author of the study.

A version of this article first appeared on Medscape.com.


Free vitamin D no better at predicting death in men than standard testing

Article Type
Changed
Thu, 10/28/2021 - 10:27

In the clinical assessment of vitamin D concentrations, free 25-hydroxyvitamin D shows little added benefit to the current standard of total 25(OH)D, with deficiencies in each associated with at least a twofold risk of all-cause mortality, new research shows.

Dr. Marian Dejaeger, of the Department of Public Health and Primary Care, KU Leuven (Belgium)
Dr. Marian Dejaeger

“In this prospective, population-based study of middle-aged and older European men, total 25(OH)D levels below 20 mcg/L were independently associated with a twofold increased all-cause mortality,” the researchers reported.

“Lower concentrations of free 25(OH)D were also predictive of mortality, but did not provide any additional information,” they noted. “The data do not support routine measurement of free 25(OH)D or 1,25(OH)2D [1,25-dihydroxyvitamin D] over total 25(OH)D levels.”

Despite vitamin D deficiency being well established as playing a role in a wide range of adverse health effects, including cardiovascular disease and mortality, there has been a lack of consensus on the optimal concentration of total 25(OH)D, with studies showing inconsistent levels to define insufficiency and deficiency.

One aspect of the debate has focused on precisely how to measure the concentrations, with some evidence supporting the “free hormone hypothesis,” which suggests that free 25(OH)D could be a better indicator of the functional availability of vitamin D than the standard total 25(OH)D and could have stronger clinical utility.

To investigate both issues, Marian Dejaeger, MD, PhD, and colleagues evaluated prospective data on 1,915 men recruited from eight centers around Europe in the European Male Aging Study, in a report published in the Journal of Clinical Endocrinology & Metabolism.

The men, who were aged between 40 and 79 years, had a mean follow-up of 12.3 years; during that time, about a quarter (23.5%) of them died.

In addition to other factors, including being older, having a higher body mass index, and having at least two comorbidities, men who died had significantly lower levels of total 25(OH)D, total 1,25(OH)2D, free 25(OH)D, and free 1,25(OH)2D, as well as higher parathyroid hormone and creatinine values.

After adjustment for key confounders, including body mass index, smoking, alcohol consumption, kidney function, number of comorbidities at baseline, and other factors, men with a total 25(OH)D below 20 mcg/L had a significantly increased risk of mortality, compared with those who had normal levels of vitamin D, defined as above 30 mcg/L (hazard ratio, 2.03; P < .001).

In terms of free 25(OH)D, men in the lowest three quintiles (under 4.43 ng/L) similarly had a significantly higher mortality risk, compared with those in the highest quintile (HR, 2.09; P < .01) after adjustment for the confounders.
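
Both hazard ratios come from covariate-adjusted survival models. The following is a generic sketch of how such an adjusted Cox proportional hazards analysis can be run with the Python lifelines package, using simulated data with invented variable names and effect sizes rather than the study’s actual data or code.

```python
# Generic sketch of a covariate-adjusted Cox model (not the study's code).
# The cohort, variable names, and effect sizes below are simulated/invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
deficient = rng.integers(0, 2, n)              # total 25(OH)D < 20 mcg/L (0/1)
age = rng.normal(60, 10, n)
bmi = rng.normal(27, 4, n)
hazard = 0.05 * np.exp(0.7 * deficient + 0.03 * (age - 60))
time = rng.exponential(1 / hazard)             # simulated survival times, years
event = (time < 12.3).astype(int)              # censor at roughly the mean follow-up
time = np.minimum(time, 12.3)

df = pd.DataFrame({"time": time, "event": event,
                   "deficient": deficient, "age": age, "bmi": bmi})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
# exp(coef) for "deficient" is the mortality hazard ratio adjusted for age and BMI.
print(cph.summary[["exp(coef)", "p"]])
```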

Further observations of all quintiles of other measures of 1,25(OH)2D and vitamin D binding protein (DBP) showed no associations with mortality after adjusting for confounders.
 

Methods of measurement

An important caveat of the study is the type of method used to measure free 25(OH)D. The authors calculated free 25(OH)D using a formula, as opposed to the alternative of direct measurement with an enzyme-linked immunosorbent assay kit, and there can be important differences between the two approaches, said Daniel Bikle, MD, PhD, a professor of medicine and dermatology at the San Francisco Veterans Affairs Medical Center and University of California, San Francisco, in a comment on the research.

“The biggest problem is that calculating free 25(OH)D does not give an accurate estimate of the real free level, so making conclusions regarding its role in clinical situations is subject to error,” said Dr. Bikle, who recently authored a review of the free hormone hypothesis.

A calculation approach “depends heavily on the total 25(OH)D level, so in a population with reasonably normal DBP and albumin levels, the correlation with total 25(OH)D is very high, so I am not surprised by the results showing no additional value,” he said in an interview.

The authors addressed their use of the calculation over the direct measurement in the study, noting that there is a “high correlation between both methods.”

But they added that, “as no equilibrium analysis method is available for free 25(OH)D, nor for free 1,25(OH)2D, no method can be considered superior.”
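
For readers unfamiliar with the calculation under discussion, free 25(OH)D is commonly estimated from total 25(OH)D, albumin, and DBP with a mass-action formula of roughly the form sketched below. The binding constants shown are frequently cited literature values and are assumptions here; the exact formula the authors used is not reproduced in this article.

```python
# Hedged sketch of the mass-action style estimate of free 25(OH)D.
# The affinity constants are commonly cited literature values and are
# assumptions for illustration, not the study's exact parameters.
KA_ALB = 6e5   # assumed affinity of 25(OH)D for albumin, L/mol
KA_DBP = 7e8   # assumed affinity of 25(OH)D for DBP, L/mol

def free_25ohd(total_25ohd, albumin_g_per_l, dbp_mg_per_l):
    """Approximate free 25(OH)D in the same units as the total level,
    assuming the binding proteins are far from saturation."""
    alb_molar = albumin_g_per_l / 66_000           # albumin is roughly 66 kDa
    dbp_molar = (dbp_mg_per_l / 1_000) / 58_000    # DBP is roughly 58 kDa
    return total_25ohd / (1 + KA_ALB * alb_molar + KA_DBP * dbp_molar)

# With normal albumin and DBP the denominator is nearly constant, so the
# calculated free level tracks the total level almost linearly, which is
# the close correlation described above.
print(free_25ohd(20.0, albumin_g_per_l=43, dbp_mg_per_l=300))  # about 0.005 mcg/L, i.e. roughly 5 ng/L
```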

Dr. Dejaeger, of the department of public health and primary care, Katholieke Universiteit Leuven (Belgium), added that she agreed that high or low DBP could potentially shift some correlations, but noted that other research has shown calculated and direct measures to match relatively well.

“So we partly agree [with Dr. Bikle] not being surprised that we did not find an added value because we also found little variation in DBP, but we are not convinced that a different measurement method could make the difference here.”

Another caveat of the study is that, despite half of the measurements being taken in the summer, more than 90% of subjects in the study’s cohort had vitamin D insufficiency, defined in the study as total 25(OH)D levels below 30 mcg/L, and as many as 70% had deficiency, with levels below 20 mcg/L.

Therefore, “as the number of participants with high levels of total 25(OH)D in our study is small, a true threshold concentration for optimal vitamin D status cannot be defined on basis of our data,” the authors noted.

Under current recommendations, the Endocrine Society indicates that concentrations below 30 mcg/L are insufficient, while other groups, including the Institute of Medicine, suggest concentrations of 20 mcg/L or above are adequate.
 

Free hormone hypothesis

Under the free hormone hypothesis, which is observed with thyroid hormones and sex steroids, the very small fraction of free hormones that are not bound to protein carriers can enter cells and help facilitate biologic activity.

The hypothesis of a role of free 25(OH)D in mortality was supported by a recent study, in which free 25(OH)D levels – but not total 25(OH)D levels – were found to be independently associated with an increased risk of all-cause and cardiovascular mortality among patients with coronary artery disease.

However, two other studies are more consistent with the new findings, including one study showing no added value of free 25(OH)D as a marker for bone mineral density in older women, and another study showing no value as a marker of metabolic variables in healthy children.

“Currently, there are no hard data to support routine measurements of free 25(OH)D or 1,25(OH)2D over total 25(OH)D, the current standard of assessing vitamin D status, as stated in guidelines from different scientific bodies,” Dr. Dejaeger said in an interview.

The study received support from Versus Arthritis and the National Institute for Health Research Manchester Biomedical Research Centre. Dr. Dejaeger and Dr. Bikle had no disclosures to report.

Publications
Topics
Sections

In the clinical assessment of vitamin D concentrations, free 25-hydroxyvitamin D shows little added benefit to the current standard of total 25(OH)D, with deficiencies in each associated with at least a twofold risk of all-cause mortality, new research shows.

Dr. Marian Dejaeger, of the Department of Public Health and Primary Care, KU Leuven (Belgium)
Dr. Marian Dejaeger

“In this prospective, population-based study of middle-aged and older European men, total 25(OH)D levels below 20 mcg/L were independently associated with a twofold increased all-cause mortality,” the researchers reported.

“Lower concentrations of free 25(OH)D were also predictive of mortality, but did not provide any additional information,” they noted. “The data do not support routine measurement of free 25(OH)D or 1,25(OH)2D [1,25-dihydroxyvitamin D] over total 25(OH)D levels.”

Despite vitamin D deficiency being well established as playing a role in a wide range of adverse health effects, including cardiovascular disease and mortality, there has been a lack of consensus on the optimal concentration of total 25(OH)D, with studies showing inconsistent levels to define insufficiency and deficiency.

One aspect of the debate has focused on precisely how to measure the concentrations, with some evidence supporting the “free hormone hypothesis,” which suggests that free 25(OH)D could represent a better indicator than the standard total 25(OH)D of functional availability of vitamin D, and have stronger clinical utility.

To investigate both issues, Marian Dejaeger, MD, PhD, and colleagues evaluated prospective data on 1,915 men recruited from eight centers around Europe in the European Male Aging Study in a report published in the Journal of Clinical Endocrinology & Metabolism

The men, who were aged between 40 and 79 years, had a mean follow-up of 12.3 years; during that time, about a quarter (23.5%) of them died.

In addition to other factors, including being older, having a higher body mass index, and having at least two comorbidities, men who died had significantly lower levels of total 25(OH)D, total 1,25(OH)2D, free 25(OH)D, and free 1,25(OH)2D, as well as higher parathyroid hormone and creatinine values.

After adjustment for key confounders, including body mass index, smoking, alcohol consumption, kidney function, number of comorbidities at baseline and other factors, men with a total 25(OH)D below 20 mcg/L had a significantly increased risk of mortality, compared with those who had normal levels of vitamin D, defined as above 30 mcg/L (hazard ratio, 2.03; P < .001).

In terms of free 25(OH)D, the lowest three free 25(OH)D quintiles (under 4.43 ng/L) similarly had a significantly higher mortality risk, compared with the highest quintile (HR, 2.09; P < .01) after adjustment for the confounders.

Further observations of all quintiles of other measures of 1,25(OH)2D and vitamin D binding protein (DBP) showed no associations with mortality after adjusting for confounders.
 

Methods of measurement

An important caveat of the study is the type of method used to measure free 25(OH)D. The authors calculated free 25(OH)D using a formula, as opposed to the alternative of direct measurement with an enzyme-linked immunosorbent assay kit, and there can be important differences between the two approaches, said Daniel Bikle, MD, PhD, a professor of medicine and dermatology at the San Francisco Veterans Affairs Medical Center and University of California, San Francisco, in a comment on the research.

“The biggest problem is that calculating free 25(OH)D does not give an accurate estimate of the real free level, so making conclusions regarding its role in clinical situations is subject to error,” said Dr. Bikle, who recently authored a review of the free hormone hypothesis.

A calculation approach “depends heavily on the total 25(OH)D level, so in a population with reasonably normal DBP and albumin levels, the correlation with total 25(OH)D is very high, so I am not surprised by the results showing no additional value,” he said in an interview.

The authors addressed their use of the calculation over the direct measurement in the study, noting that there is a “high correlation between both methods.”

But they added that, “as no equilibrium analysis method is available for free 25(OH)D, nor for free 1,25(OH)2D, no method can be considered superior.”

Dr. Dejaeger, of the department of public health and primary care, Katholieke Universiteit Leuven (Belgium), added that she agreed that high or low DBP could potentially shift some correlations, but noted that other research has shown calculated and direct measures to match relatively well.

“So we partly agree [with Dr. Bikle] not being surprised that we did not find an added value because we also found little variation in DBP, but we are not convinced that a different measurement method could make the difference here.”

Another caveat of the study is that, despite half of the measurements being taken in the summer, more than 90% of subjects in the study’s cohort had vitamin D insufficiency, defined in the study as total 25(OH)D levels below 30 mcg/L, and as many as 70% had deficiency, with levels below 20 mcg/L.

Therefore, “as the number of participants with high levels of total 25(OH)D in our study is small, a true threshold concentration for optimal vitamin D status cannot be defined on basis of our data,” the authors noted.

Under current recommendations, the Endocrine Society indicates that concentrations below 30 mcg/L are insufficient, while other groups, including the Institute of Medicine, suggest concentrations of 20 mcg/L or above are adequate.
 

Free hormone hypothesis

Under the free hormone hypothesis, which is observed with thyroid hormones and sex steroids, the very small fraction of free hormones that are not bound to protein carriers can enter cells and help facilitate biologic activity.

The hypothesis of a role of free 25(OH)D in mortality was supported by a recent study, in which free 25(OH)D levels – but not total 25(OH)D levels, were found to be independently associated with an increased risk of all-cause and cardiovascular mortality among patients with coronary artery disease.

However, two other studies are more consistent with the new findings, including one study showing no added value of free 25(OH)D as a marker for bone mineral density in older women, and another study showing no value as a marker of metabolic variables in healthy children.

“Currently, there are no hard data to support routine measurements of free 25(OH)D or 1,25(OH)2D over total 25(OH)D, the current standard of assessing vitamin D status, as stated in guidelines from different scientific bodies,” Dr. Dejaeger said in an interview.

The study received support from Versus Arthritis and the National Institute for Health Research Manchester Biomedical Research Centre. Dr. Dejaeger and Dr. Bikle had no disclosures to report.

In the clinical assessment of vitamin D concentrations, free 25-hydroxyvitamin D shows little added benefit to the current standard of total 25(OH)D, with deficiencies in each associated with at least a twofold risk of all-cause mortality, new research shows.

Dr. Marian Dejaeger, of the Department of Public Health and Primary Care, KU Leuven (Belgium)
Dr. Marian Dejaeger

“In this prospective, population-based study of middle-aged and older European men, total 25(OH)D levels below 20 mcg/L were independently associated with a twofold increased all-cause mortality,” the researchers reported.

“Lower concentrations of free 25(OH)D were also predictive of mortality, but did not provide any additional information,” they noted. “The data do not support routine measurement of free 25(OH)D or 1,25(OH)2D [1,25-dihydroxyvitamin D] over total 25(OH)D levels.”

Despite vitamin D deficiency being well established as playing a role in a wide range of adverse health effects, including cardiovascular disease and mortality, there has been a lack of consensus on the optimal concentration of total 25(OH)D, with studies showing inconsistent levels to define insufficiency and deficiency.

One aspect of the debate has focused on precisely how to measure the concentrations, with some evidence supporting the “free hormone hypothesis,” which suggests that free 25(OH)D could represent a better indicator than the standard total 25(OH)D of functional availability of vitamin D, and have stronger clinical utility.

To investigate both issues, Marian Dejaeger, MD, PhD, and colleagues evaluated prospective data on 1,915 men recruited from eight centers around Europe in the European Male Aging Study in a report published in the Journal of Clinical Endocrinology & Metabolism

The men, who were aged between 40 and 79 years, had a mean follow-up of 12.3 years; during that time, about a quarter (23.5%) of them died.

In addition to other factors, including being older, having a higher body mass index, and having at least two comorbidities, men who died had significantly lower levels of total 25(OH)D, total 1,25(OH)2D, free 25(OH)D, and free 1,25(OH)2D, as well as higher parathyroid hormone and creatinine values.

After adjustment for key confounders, including body mass index, smoking, alcohol consumption, kidney function, number of comorbidities at baseline and other factors, men with a total 25(OH)D below 20 mcg/L had a significantly increased risk of mortality, compared with those who had normal levels of vitamin D, defined as above 30 mcg/L (hazard ratio, 2.03; P < .001).

In terms of free 25(OH)D, the lowest three free 25(OH)D quintiles (under 4.43 ng/L) similarly had a significantly higher mortality risk, compared with the highest quintile (HR, 2.09; P < .01) after adjustment for the confounders.

Further observations of all quintiles of other measures of 1,25(OH)2D and vitamin D binding protein (DBP) showed no associations with mortality after adjusting for confounders.
 

Methods of measurement

An important caveat of the study is the type of method used to measure free 25(OH)D. The authors calculated free 25(OH)D using a formula, as opposed to the alternative of direct measurement with an enzyme-linked immunosorbent assay kit, and there can be important differences between the two approaches, said Daniel Bikle, MD, PhD, a professor of medicine and dermatology at the San Francisco Veterans Affairs Medical Center and University of California, San Francisco, in a comment on the research.

“The biggest problem is that calculating free 25(OH)D does not give an accurate estimate of the real free level, so making conclusions regarding its role in clinical situations is subject to error,” said Dr. Bikle, who recently authored a review of the free hormone hypothesis.

A calculation approach “depends heavily on the total 25(OH)D level, so in a population with reasonably normal DBP and albumin levels, the correlation with total 25(OH)D is very high, so I am not surprised by the results showing no additional value,” he said in an interview.

The authors addressed their use of the calculation over the direct measurement in the study, noting that there is a “high correlation between both methods.”

But they added that, “as no equilibrium analysis method is available for free 25(OH)D, nor for free 1,25(OH)2D, no method can be considered superior.”

Dr. Dejaeger, of the department of public health and primary care, Katholieke Universiteit Leuven (Belgium), added that she agreed that high or low DBP could potentially shift some correlations, but noted that other research has shown calculated and direct measures to match relatively well.

“So we partly agree [with Dr. Bikle] not being surprised that we did not find an added value because we also found little variation in DBP, but we are not convinced that a different measurement method could make the difference here.”

Another caveat of the study is that, despite half of the measurements being taken in the summer, more than 90% of subjects in the study’s cohort had vitamin D insufficiency, defined in the study as total 25(OH)D levels below 30 mcg/L, and as many as 70% had deficiency, with levels below 20 mcg/L.

Therefore, “as the number of participants with high levels of total 25(OH)D in our study is small, a true threshold concentration for optimal vitamin D status cannot be defined on basis of our data,” the authors noted.

Under current recommendations, the Endocrine Society indicates that concentrations below 30 mcg/L are insufficient, while other groups, including the Institute of Medicine, suggest concentrations of 20 mcg/L or above are adequate.
 

Free hormone hypothesis

Under the free hormone hypothesis, which is observed with thyroid hormones and sex steroids, the very small fraction of free hormones that are not bound to protein carriers can enter cells and help facilitate biologic activity.

The hypothesis of a role of free 25(OH)D in mortality was supported by a recent study, in which free 25(OH)D levels, but not total 25(OH)D levels, were found to be independently associated with an increased risk of all-cause and cardiovascular mortality among patients with coronary artery disease.

However, two other studies are more consistent with the new findings, including one study showing no added value of free 25(OH)D as a marker for bone mineral density in older women, and another study showing no value as a marker of metabolic variables in healthy children.

“Currently, there are no hard data to support routine measurements of free 25(OH)D or 1,25(OH)2D over total 25(OH)D, the current standard of assessing vitamin D status, as stated in guidelines from different scientific bodies,” Dr. Dejaeger said in an interview.

The study received support from Versus Arthritis and the National Institute for Health Research Manchester Biomedical Research Centre. Dr. Dejaeger and Dr. Bikle had no disclosures to report.

FROM JOURNAL OF CLINICAL ENDOCRINOLOGY & METABOLISM

Vitamin D status may play a pivotal role in colon cancer prevention

Article Type
Changed
Thu, 12/15/2022 - 14:35

In ongoing efforts to investigate a link between vitamin D and colorectal cancer, new research shows that women who consume higher levels of vitamin D – particularly from dietary sources – have a reduced risk of developing early-onset colorectal cancer, compared with those who have lower levels.

This is according to an observational study published in the journal Gastroenterology. The study included 94,205 women (aged 25-42 years) who were followed from 1991 to 2015, during which time 111 incident cases of early-onset colorectal cancer were diagnosed. Among 29,186 women who had at least one lower endoscopy from 1991 to 2011, 1,439 newly diagnosed conventional adenomas and 1,878 serrated polyps were found.

Women who consumed the highest average levels of total vitamin D (450 IU per day), compared with those consuming less than 300 IU per day, showed a significantly reduced risk of early-onset colorectal cancer. Consuming 400 IU each day was associated with a 54% reduced risk of early-onset colorectal cancer.

“If confirmed, our findings could potentially lead to recommendations for higher vitamin D intake as an inexpensive low-risk complement to colorectal cancer screening as a prevention strategy for adults younger than age 50,” wrote the study authors, led by Edward L. Giovannucci, MD, ScD, of the Harvard School of Public Health, Boston.

Associations between vitamin D levels and colorectal cancer have been documented in review articles over the years, and the link is the subject of 10 recently completed or ongoing clinical trials. Few studies, however, have focused on early-onset colorectal cancer and vitamin D intake. Unlike advanced colorectal cancer, the early-onset form of the disease is not as strongly associated with traditional risk factors such as a family history of colorectal cancer, and it is therefore believed to be more strongly linked to other factors, such as lifestyle and diet – including vitamin D supplementation.
 

The evidence is in, but it’s incomplete

In addition to the new study in Gastroenterology, other observational studies, as well as laboratory and animal studies, suggest that vitamin D plays a role in inhibiting carcinogenesis. Researchers theorize that vitamin D has anti-inflammatory, immunomodulatory, and antiangiogenic properties that can slow the growth of tumors, but the evidence is mixed.

A meta-analysis of 137,567 patients published in 2013 in Preventive Medicine found an inverse association between 25-hydroxyvitamin D (25[OH]D) and total cancer mortality in women, but not among men. Three meta-analyses published in 2014 and 2019 found that vitamin D supplementation does not affect cancer incidence but does significantly reduce total cancer mortality rates by 12%-13%.

In 2019, researchers led by Marjorie McCullough, ScD, RD, senior scientific director of epidemiology research for the American Cancer Society, described an inverse relationship between circulating vitamin D and colorectal cancer risk in a pooled analysis of 17 cohorts. “Our study suggests that optimal circulating 25(OH)D concentrations for colorectal cancer risk reduction are 75-100 nmol/L, [which is] higher than current Institute of Medicine recommendations for bone health,” she and colleagues wrote. Their findings were published in the Journal of the National Cancer Institute.

The Vitamin D and Omega-3 Trial (VITAL), published in 2019 in the New England Journal of Medicine, showed no significant effect of vitamin D3 supplementation at 2,000 IU/day on the risk of invasive cancer or cardiovascular events.

Despite the mixed results, studies offer valuable insights into cancer risks, said Scott Kopetz, MD, PhD, codirector of the colorectal cancer moon shot research program at the University of Texas MD Anderson Cancer Center, Houston.

The Gastroenterology study is noteworthy because it focuses on early-onset colorectal cancer, he said.

“[The authors] demonstrate for the first time that there is an association of vitamin D intake with early-onset colorectal [cancer] incidence, especially in the left side of the colon and rectum where the increase in early onset colorectal cancer manifests,” Dr. Kopetz said. “The analysis suggests that it may require long-term vitamin D intake to derive the benefit, which may explain why some shorter-term randomized studies failed to demonstrate [a benefit].”

In animal models, vitamin D3 is “estimated to lower the incidence of colorectal cancer by 50%,” according to Lidija Klampfer, PhD, formerly a molecular biologist and senior research scientist with the Southern Research Institute, Birmingham, Ala.

Dr. Klampfer, a founding partner of ProteXase Therapeutics, is the author of an article on vitamin D and colon cancer published in 2014 in the World Journal of Gastrointestinal Oncology.

“The levels of vitamin D3 appear to be an essential determinant for the development and progression of colon cancer and supplementation with vitamin D3 is effective in suppressing intestinal tumorigenesis in animal models,” she wrote. “Studies have shown that 1,25 dihydroxyvitamin D3 can inhibit tumor-promoting inflammation leading to the development and progression of colon cancer.”
 

The hazards of a vitamin D deficiency

A severe vitamin D deficiency is associated with compromised bone and muscle health, calcium absorption, immunity, and heart function, and it can affect mood. Other studies have linked vitamin D deficiency to colorectal (bowel) cancer and blood cancers.

Serum 25(OH)D is the primary circulating form of vitamin D and is considered the best marker for assessing vitamin D status, says Karin Amrein, MD, MSc, an endocrinologist with the Medical University of Graz (Austria). She was the lead author of a review on vitamin D deficiency published in January 2020 in the European Journal of Clinical Nutrition.

The Global Consensus Recommendations define vitamin D insufficiency as a serum 25(OH)D concentration of 12-20 ng/mL (30-50 nmol/L) and deficiency as a concentration of less than 12 ng/mL (30 nmol/L). A deficiency in adults is usually treated with 50,000 IU of vitamin D2 or D3 once weekly for 8 weeks, followed by maintenance dosing of cholecalciferol (vitamin D3) at 800-1,000 IU daily from dietary and supplemental sources.
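
The paired ng/mL and nmol/L figures quoted above follow from the standard molar conversion for 25(OH)D, roughly 2.5 nmol/L per ng/mL; the short check below reproduces them.

```python
# Sketch: converting the Global Consensus cutoffs from ng/mL to nmol/L using
# the standard factor for 25(OH)D (about 2.496 nmol/L per ng/mL, usually
# rounded to 2.5).
NMOL_PER_L_PER_NG_PER_ML = 2.5

for label, cutoff_ng_ml in [("deficiency threshold", 12), ("insufficiency upper bound", 20)]:
    print(f"{label}: {cutoff_ng_ml} ng/mL ~ {cutoff_ng_ml * NMOL_PER_L_PER_NG_PER_ML:.0f} nmol/L")
# deficiency threshold: 12 ng/mL ~ 30 nmol/L
# insufficiency upper bound: 20 ng/mL ~ 50 nmol/L
```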

Screening is recommended for individuals who exhibit symptoms and conditions associated with a vitamin D deficiency, but there is little agreement on recommended serum levels because every individual is different, according to the U.S. Preventive Services Task Force, which updated its vitamin D recommendations in April for the first time in 7 years.


FROM GASTROENTEROLOGY

Bone risk: Is time since menopause a better predictor than age?

Article Type
Changed
Fri, 10/22/2021 - 13:03

 

Although early menopause is linked to increased risks of bone loss and fracture, new research indicates that, even among the majority of women who reach menopause after age 45, the time since the final menstrual period can be a stronger predictor than chronological age for key bone health and fracture risks.


In a large longitudinal cohort, the number of years since a woman’s final menstrual period showed a stronger association with femoral neck bone mineral density (BMD) than chronological age, while an earlier age at menopause – even among those over 45 years – was linked to an increased risk of fracture.

“Most of our clinical tools to predict osteoporosis-related outcomes use chronological age,” first author Albert Shieh, MD, told this news organization.

“Our findings suggest that more research should be done to examine whether ovarian age (time since final menstrual period) should be used in these tools as well.”

The significance of age at the final menstrual period, compared with chronological age, has gained interest in risk assessment because of the known acceleration in the decline of BMD that begins 1 year prior to the final menstrual period and continues at a rapid pace for 3 years afterward before slowing.

To further investigate the association with BMD, Dr. Shieh, an endocrinologist specializing in osteoporosis at the University of California, Los Angeles, and his colleagues turned to data from the Study of Women’s Health Across the Nation (SWAN), a longitudinal cohort study of ambulatory women with pre- or early perimenopausal baseline data and 15 annual follow-up assessments.

Outcomes regarding postmenopausal lumbar spine (LS) or femoral neck (FN) BMD were evaluated in 1,038 women, while the time to fracture in relation to the final menstrual period was separately evaluated in 1,554 women.

In both cohorts, the women had a known final menstrual period at age 45 or older, and on average, their final menstrual period occurred at age 52.

After a multivariate adjustment for age, body mass index, and various other factors, they found that each additional year after a woman’s final menstrual period was associated with a significant 0.006 g/cm2 reduction in postmenopausal lumbar spine BMD and a 0.004 g/cm2 reduction in femoral neck BMD (both P < .0001).

Conversely, chronological age was not associated with a change in femoral neck BMD when evaluated independently of years since the final menstrual period, the researchers reported in the Journal of Clinical Endocrinology and Metabolism.

Regarding lumbar spine BMD, chronological age was unexpectedly associated not just with change, but in fact with increases in lumbar spine BMD (P < .0001 per year). However, the authors speculate the change “is likely a reflection of age-associated degenerative changes causing false elevations in BMD measured by dual-energy x-ray absorptiometry.”

Fracture risk with earlier menopause

In terms of the fracture risk analysis, despite the women all being aged 45 or older, earlier age at menopause was still tied to an increased risk of incident fracture, with a 5% increase in risk for each earlier year in age at the time of the final menstrual period (P = .02).

 

 

Compared with women who had their final menstrual period at age 55, for instance, those who finished menstruating at age 47 had a 6.3% greater 20-year cumulative fracture risk, the authors note.
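
As a back-of-envelope illustration of the per-year estimates above, the sketch below applies them to the 8-year gap between a final menstrual period at age 47 versus 55. Treating the BMD coefficients as linear and compounding the 5%-per-year fracture estimate are simplifying assumptions made here for illustration, not the authors' modelling; the 6.3% figure quoted above is an absolute 20-year cumulative risk difference, a separate quantity.

```python
# Back-of-envelope sketch using the per-year estimates quoted above; the
# linear and multiplicative assumptions are illustrative simplifications.
years_earlier = 55 - 47                            # final menstrual period at 47 vs. 55

lumbar_bmd_diff = years_earlier * 0.006            # g/cm2, lumbar spine
femoral_bmd_diff = years_earlier * 0.004           # g/cm2, femoral neck
relative_fracture_hazard = 1.05 ** years_earlier   # compounding 5% per earlier year

print(f"Lumbar spine BMD difference: {lumbar_bmd_diff:.3f} g/cm2")
print(f"Femoral neck BMD difference: {femoral_bmd_diff:.3f} g/cm2")
print(f"Relative fracture hazard:    {relative_fracture_hazard:.2f}x")
# Roughly 0.048 g/cm2, 0.032 g/cm2, and 1.48x under these assumptions.
```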

While previous findings from the Malmo Perimenopausal Study showed menopause prior to the age of 47 to be associated with an 83% and 59% greater risk of densitometric osteoporosis and fracture, respectively, by age 77, the authors note that the new study is unique in including only women who had a final menstrual period over the age of 45, therefore reducing the potential confounding of data on women under 45.

The new results “add to a growing body of literature suggesting that the endocrine changes that occur during the menopause transition trigger a pathophysiologic cascade that leads to organ dysfunction,” the authors note.

In terms of implications in risk assessment, “future studies should examine whether years since the final menstrual period predicts major osteoporotic fractures and hip fractures, specifically, and, if so, whether replacing chronological age with years since the final menstrual period improves the performance of clinical prediction tools, such as FRAX [Fracture Risk Assessment Tool],” they add.

Addition to guidelines?

Commenting on the findings, Peter Ebeling, MD, the current president of the American Society of Bone and Mineral Research, noted that the study importantly “confirms what we had previously anticipated, that in women with menopause who are 45 years of age or older a lower age of final menstrual period is associated with lower spine and hip BMD and more fractures.”

“We had already known this for women with premature ovarian insufficiency or an early menopause, and this extends the observation to the vast majority of women – more than 90% – with a normal menopause age,” said Dr. Ebeling, professor of medicine at Monash Health, Monash University, in Melbourne.

Despite the known importance of the time since the final menstrual period, guidelines still focus on chronological rather than biological age, emphasizing risk among women over 50 in general rather than time since the last menstrual period, he noted.

“There is an important difference [between those two], as shown by this study,” he said. “Guidelines could be easily adapted to reflect this.”

Specifically, the association between lower age of final menstrual period and lower spine and hip BMD and more fractures requires “more formal assessment to determine whether adding age of final menstrual period to existing fracture risk calculator tools, like FRAX, can improve absolute fracture risk prediction,” Dr. Ebeling noted.

The authors and Dr. Ebeling had no disclosures to report.


FROM JOURNAL OF CLINICAL ENDOCRINOLOGY AND METABOLISM

Use live donors for liver transplants for HCC patients, say experts

Article Type
Changed
Mon, 10/18/2021 - 16:40

For some patients with hepatocellular cancer (HCC), a liver transplant is the best treatment. But there is a long waiting list for all organ transplants.

A new study shows that outcomes with liver transplants from live donors are better than outcomes with transplants from deceased donors, leading to calls for increasing the availability of live donation.

“Transplant programs worldwide should be encouraged to expand their live donor programs to manage patients with HCC,” suggest authors of the new study, published in September in JAMA Surgery.

The findings are important because, owing to long donor organ waiting lists, liver transplants among patients with HCC are restricted to those with the highest chances of survival, say the authors. Use of transplants from living donors could increase the availability of organs for patients on the deceased donor waiting list.

“One could even argue that a living donor gives two organs back to the organ pool,” the authors comment.

“Efforts to expand the donor pool through living donor liver transplant for patients with HCC will ultimately increase the number of available deceased donor liver transplants to help all patients in need of liver transplant,” David A. Gerber, MD, of the University of North Carolina at Chapel Hill, and colleagues write in an accompanying commentary.

“It is very important that donors aren’t recruited or solicited, but with the growth of transplant programs, more potential donors will become aware of this opportunity and will step forward seeking to help someone else,” Dr. Gerber commented.
 

Live liver donor = lower death risk

The new study was conducted by first author Quirino Lai, MD, PhD, of the Department of General Surgery and Organ Transplantation, Sapienza University, Rome, and colleagues. They explain that the need to better understand the potential benefits of living donor organs is pressing. Liver cancer rates continue to rise, and the demand for organs outpaces the supply. Although various smaller studies have shown survival benefits of live donor liver transplant for people with HCC, debate continues. Previous evidence has suggested higher cancer recurrence rates and unfavorable outcomes.

The multicenter study is thought to be the largest to date on this issue. The investigators evaluated data from patients who were on liver donation waiting lists for a first transplant between January 2000 and December 2017. The study included two cohorts of patients on waiting lists: an international cohort, consisting of 3,052 patients at 12 collaborative transplant centers in Europe, Asia, and the United States; and a Canadian cohort, consisting of 906 patients.

The majority of patients were men (80.2%). The median age at the time of first referral was 58 years.

About a third of patients (33.1%) in the international cohort and slightly fewer than a third (27%) in the Canadian cohort received live donor liver transplants; the remainder received liver transplants from deceased donors.

The median follow-up period was 3.3 years. Receiving a live donor liver transplant was independently associated with a 49% reduction in the overall risk for death (hazard ratio, 0.51) in the international cohort and a 43% reduction in the Canadian cohort (HR, 0.57; both P < .001).
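
The percentage reductions quoted here are simply 1 minus the hazard ratio, as the brief check below shows.

```python
# Sketch: the quoted reductions in risk of death are 1 minus the hazard ratio.
for cohort, hazard_ratio in [("international", 0.51), ("Canadian", 0.57)]:
    print(f"{cohort} cohort: HR {hazard_ratio} -> {1 - hazard_ratio:.0%} lower risk of death")
# international cohort: HR 0.51 -> 49% lower risk of death
# Canadian cohort: HR 0.57 -> 43% lower risk of death
```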

After adjustment for potential confounders, living donor liver transplantation remained independently associated with a reduced risk for overall death. There was a reduction of 33% in the international cohort (P = .001) and a reduction of 48% in the Canadian cohort (P < .001).

“Divergent experiences all converged to a similar 40% to 50% reduction in intention-to-treat death risk,” the authors write.

Importantly, there were no increases in post-transplant cancer recurrence rates in the live donor groups in either cohort. Rates ranged from 13% to 16% over 5 years and from 17% to 22% after 10 years in both groups.

The median amount of time on the waiting list was significantly shorter for patients in the live donor group than for those in the deceased donor group (1 month vs. 6 months in the international cohort [P < .001]; 5 months vs. 6 months in the Canadian cohort [P = .006]).

Notably, in the international cohort there were 295 dropouts in the deceased donor group, compared with no dropouts among the live donor patients (P < .001). In the Canadian cohort, the corresponding dropout rates were 32.2% and 13.9% (P < .001).
 

 

 

Diverse transplant centers, larger cohorts set study apart

Although these latest results are consistent with those of recent studies conducted in France, Hong Kong, and elsewhere, in the current study, the cohorts were larger, say the authors.

“Compared with previous studies, all of which were based on relatively small case series, the present study examined the data of almost 4,000 patients who were on a waiting list for a transplant; therefore, this study may be the largest cohort study on this topic,” they point out.

In addition to improved timing of a transplant, other factors, such as patient selection, help explain the better survival, editorialist Dr. Gerber commented.

“Survival improvement [with live donor liver transplants] is a combination of [surgeon] experience in this transplant procedure and an appropriate selection bias, meaning taking patients who aren’t too sick while waiting on the transplant but who would benefit from the operation,” he said.

Gaining that experience may be particularly challenging in the United States, owing to regulatory barriers to expanding the programs, but efforts to overcome that are moving ahead, Dr. Gerber added.

“This issue of where an individual gains the experience or expertise is being discussed as transplantation has grown worldwide,” he notes.

As programs expand, the availability of live liver donors should improve, he suggested.

In a related story, this news organization recently reported on the controversial issue of liver transplant as an option for the treatment of liver metastases resulting from colorectal cancer.

Study coauthor Gonzalo Sapisochin, MD, has received grants from Bayer and Roche outside the submitted work as well as personal fees from Integra, Novartis, and AstraZeneca. No other relevant financial relationships were reported.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

For some patients with hepatocellular cancer (HCC), a liver transplant is the best treatment. But there is a long waiting list for all organ transplants.

A new study shows that outcomes with liver transplants from live donors are better than outcomes with transplants from deceased donors, leading to calls for increasing the availability of live donation.

“Transplant programs worldwide should be encouraged to expand their live donor programs to manage patients with HCC,” suggest authors of the new study, published in September in JAMA Surgery.

The findings are important in light of the fact that among patients with HCC, liver transplants are restricted to those patients who have the highest chances of survival, owing to long donor organ waiting lists, say the authors. Use of transplants from living donors could increase the availability of organs for patients on the deceased donor waiting list.

“One could even argue that a living donor gives two organs back to the organ pool,” the authors comment.

“Efforts to expand the donor pool through living donor liver transplant for patients with HCC will ultimately increase the number of available deceased donor liver transplants to help all patients in need of liver transplant,” David A. Gerber, MD, of the University of North Carolina at Chapel Hill, and colleagues write in an accompanying commentary.

“It is very important that donors aren’t recruited or solicited, but with the growth of transplant programs, more potential donors will become aware of this opportunity and will step forward seeking to help someone else,” Dr. Gerber commented.
 

Live liver donor = lower death risk

The new study was conducted by first author Quirino Laid, MD, PhD, of the Department of General Surgery and Organ Transplantation, Sapienza University, Rome, and colleagues. They explain that the need to better understand the potential benefits of living donor organs is pressing. Liver cancer rates continue to rise, and the demand for organs outpaces the supply. Although various smaller studies have shown survival benefits of live donor liver transplant for people with HCC, debate continues. Previous evidence has suggested higher cancer recurrence rates and unfavorable outcomes.

The multicenter study is thought to be the largest to date on this issue. The investigators evaluated data from patients who were on liver donation waiting lists for a first transplant between January 2000 and December 2017. The study included two cohorts of patients on waiting lists: an international cohort, consisting of 3,052 patients at 12 collaborative transplant centers in Europe, Asia, and the United States; and a Canadian cohort, consisting of 906 patients.

The majority of patients were men (80.2%). The median age at the time of first referral was 58 years.

About a third of patients (33.1%) in the international cohort and slightly fewer than a third (27%) in the Canadian cohort received live donor liver transplants; the reminder received liver transplants from deceased donors.

The median follow-up period was 3.3 years. Receiving a live donor liver transplant was independently associated with a 49% reduction in the overall risk for death (hazard ratio, 0.51) in the international cohort and a 43% reduction in the Canadian cohort (HR, 0.57; both P < .001).

After adjustment for potential confounders, living donor liver transplantation remained independently associated with a reduced the risk for overall death. There was a reduction of 33% in the international cohort (P = .001) and a reduction of 48% in the Canadian cohort (P < .001).

“Divergent experiences all converged to a similar 40% to 50% reduction in intention-to-treat death risk,” the authors write.

Importantly, there were no increases in post-transplant cancer recurrence rates in the live donor groups in either cohort. Rates ranged from 13% to 16% over 5 years and from 17% to 22% after 10 years in both groups.

The median amount of time on the waiting list was significantly shorter for patients in the live donor group than for those in the deceased donor group (1 month vs. 6 months in the international cohort [P < .001]; 5 months vs. 6 months in the Canadian cohort [P = .006]).

Notably, in the deceased donor groups, there were 295 dropouts, compared with no dropouts among the live donor patients in the international cohort (P < .001). In the Canadian cohort, the corresponding rates were 32.2% and 13.9% (P < .001).
 

 

 

Diverse transplant centers, larger cohorts set study apart

Although these latest results are consistent with those of recent studies conducted in France, Hong Kong, and elsewhere, in the current study, the cohorts were larger, say the authors.

“Compared with previous studies, all of which were based on relatively small case series, the present study examined the data of almost 4,000 patients who were on a waiting list for a transplant; therefore, this study may be the largest cohort study on this topic,” they point out.

In addition to improved timing of a transplant, other factors, such as patient selection, help explain the better survival, editorialist Dr. Gerber commented.

“Survival improvement [with live donor liver transplants] is a combination of [surgeon] experience in this transplant procedure and an appropriate selection bias, meaning taking patients who aren’t too sick while waiting on the transplant but who would benefit from the operation,” he said.

Gaining that experience may be particularly challenging in the United States, owing to regulatory barriers to expanding the programs, but efforts to overcome that are moving ahead, Dr. Gerber added.

“This issue of where an individual gains the experience or expertise is being discussed as transplantation has grown worldwide,” he notes.

As programs expand, the availability of live liver donors should improve, he suggested.

In a related story, this news organization recently reported on the controversial issue of liver transplant as an option for the treatment of liver metastases resulting from colorectal cancer.

Study coauthor Gonzalo Sapisochin, MD, has received grants from Bayer and Roche outside the submitted work as well as personal fees from Integra, Novartis, and AstraZeneca. No other relevant financial relationships were reported.

A version of this article first appeared on Medscape.com.


Radiofrequency ablation gains favor for thyroid nodules in U.S.

Article Type
Changed
Thu, 10/14/2021 - 15:30

 

As radiofrequency ablation (RFA) gains favor in the United States as a noninvasive alternative to surgery for benign thyroid nodules, clinicians are increasingly reporting their experiences in both hospital and outpatient settings.

And in one case, a hospital has taken the unique step of forming a multidisciplinary thyroid nodule RFA tumor board, which helps in the often tricky decision-making process that is involved.

“Our multidisciplinary RFA tumor board has been invaluable in this process, and it is the only one of its kind in the nation that I’m aware of,” James Lim, MD, of the Division of Surgical Oncology, Thyroid, and Parathyroid Center at Oregon Health & Science University (OHSU), told this news organization.

Dr. Lim reports receiving referrals from “all avenues, some from thyroid specialists and others from nonthyroid specialists such as primary care practitioners or patient self-referrals.”

“Because of this, our centralized process of multidisciplinary review ensures that each patient is evaluated thoroughly through each thyroid specialists’ lens to optimize patient outcomes,” noted Dr. Lim, an assistant professor of endocrine surgery.

The RFA tumor board consists of experts in all specialties involved in thyroid nodule assessment and treatment, including surgeons, interventional radiologists, and endocrinologists.
 

Just because you can, doesn’t mean you should

However, some caution is warranted: despite the enthusiasm for this noninvasive alternative to surgery, observation alone remains appropriate for many thyroid nodules and should not be overlooked as an option.

“For a number of reasons, the key to keep in mind is that, just because we can do something doesn’t mean we should,” Michael Singer, MD, director of the Division of Thyroid & Parathyroid Surgery, Department of Otolaryngology – Head and Neck Surgery, at the Henry Ford Health System, Detroit, said in an interview.

While emphasizing that he believes RFA to be a promising technology that will likely benefit patients in the future, Dr. Singer voiced concern about the approach becoming an easy choice – particularly if profit is to be had – when observation is a clear alternative. “If RFA becomes seen as an opportunity to create revenue, potential conflicts of interest may arise,” he said.

“As it is not a major procedure with a dramatic risk profile, my concern is that some clinicians [could] adopt the attitude of ‘Why not do it?’ even when the indication is minimal or nonexistent,” he added.

Dr. Lim said he agrees that “any new medical technology requires thoughtful evaluation and appropriate patient selection in order to ensure optimal patient outcomes.”

That’s where the tumor board has been especially beneficial.

“We have found great benefit in reviewing potential RFA cases in a multidisciplinary fashion within our tumor board and would recommend other institutions to consider it,” he noted. In the absence of a tumor board, “at a minimum, a thyroid specialist should be involved in the evaluation of a potential thyroid RFA patient prior to ablation treatment,” he advised.
 

Tumor board was able to identify a small subset of patients for surgery

In research presented at the 90th Annual Meeting of the American Thyroid Association (ATA), Dr. Lim and colleagues evaluated the tumor board’s efficacy in altering diagnoses and treatment plans, retrospectively reviewing cases presented to the board for RFA consideration from its inception in July 2020 through June 2021.

Over the study period, 65 patients with biopsy-proven benign thyroid nodules were newly referred for RFA, with 58 referred for mass effect symptoms and seven for autonomous function.

After the multidisciplinary review, just over half of the cases (37 of 65; 56.9%) were approved for RFA.

Of the remainder, 22 (33.8%) were determined to need additional studies, just two (3.0%) were recommended for surgery, and four (6.2%) were recommended to not receive any intervention.

Of the 22 cases recommended for additional studies, 15 were subsequently recommended for RFA and four were recommended to receive surgery due to suspicious clinical findings.

Of those that underwent surgery, two showed thyroid cancer on final pathology.

Among the nodules recommended for RFA, the average nodule volume was 15.1 mL, whereas the average volume for those recommended for surgery was 40.9 mL (P = .08).

No significant complications occurred among patients who underwent RFA or those who had surgery.

“The tumor board’s multidisciplinary review was able to identify high-risk features in some patients with benign biopsies. This led to a change in recommendation from RFA to surgery for possible malignancy in a small subset of patients,” Dr. Lim noted.

In a separate analysis, Dr. Lim and colleagues reported that, among patients treated with RFA (with a mean baseline nodule volume of 11.9 mL), mean nodule volume was 6.4 mL after 1 month, 4.5 mL after 3 months, and 3.8 mL at 6 months, which were all significantly reduced versus baseline (P < .001). Similar improvements were also reported in symptom and cosmetic scores at each timepoint (all P < .001).
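
For readers who want to express those mean volumes as the percentage shrinkage typically quoted for RFA (the volume reduction ratio, or VRR), here is a minimal sketch in Python; the figures are simply the group means reported above, and a ratio derived from group means only approximates the average per-patient reduction.

    # Sketch: convert the reported mean nodule volumes into approximate
    # volume reduction ratios (VRR); a VRR derived from group means only
    # approximates the per-patient average.
    baseline_ml = 11.9
    followup_ml = {"1 month": 6.4, "3 months": 4.5, "6 months": 3.8}

    for timepoint, volume_ml in followup_ml.items():
        vrr_percent = (baseline_ml - volume_ml) / baseline_ml * 100
        print(f"{timepoint}: {volume_ml} mL ({vrr_percent:.0f}% reduction from baseline)")
    # Prints roughly 46%, 62%, and 68% reductions at 1, 3, and 6 months.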

There were no cases of postprocedural hypothyroidism or symptomatic thyrotoxicosis.

Underlining that patients can expect noticeable improvement in symptom scores by their 30-day visit, Dr. Lim noted that patients should be warned of some early swelling.

“It is important to inform patients that they may have swelling of their treated nodule immediately after the procedure, but this should subside within a few days,” he said.
 

Outpatient RFA safe and efficacious

In a separate study also presented at the meeting, three practitioners described their experiences with RFA in their outpatient thyroid practices in San Antonio; Santa Monica, California; and Gettysburg, Pennsylvania.

Overall, there were 68 cases involving benign, class II thyroid nodules, and the authors reported average procedure times of under an hour, with actual RFA time varying from 7 to 22 minutes.

Of note, for nodules larger than 4.5 cm, two procedures were necessary to achieve desired results.

Excluding the larger nodules that required more than one procedure, nodule size decreased by an average of 48% at 1 month and by 82% at 3 months in more than 80% of cases.

None of the cases required surgery. There were no major complications, and all patients had preserved baseline thyroid function.

“This preliminary study of 68 patients shows how thyroid RFA is safe and efficacious when performed in an endocrine outpatient office practice,” Kathleen Hands, MD, of the Thyroid Center of South Texas, and coauthors concluded.
 

Insurance coverage an issue in U.S.

Among much larger studies demonstrating the safety and efficacy of RFA for benign nodules, a study of 450 Chinese patients published in January showed RFA to be superior to conventional thyroidectomy in terms of patient satisfaction, postoperative quality of life, and length of hospital stay, with the caveat that nodule volume reduction took longer to achieve.

But if RFA use is to become more widespread in the United States, a key obstacle is that insurance companies generally do not cover the procedure. Although patients in Dr. Lim’s analyses did have coverage, it didn’t come easily, he said.

“Thankfully, all of our patients have been approved by insurance, and no one has had to pay by themselves, but this has sometimes required multiple appeals to the insurance company,” Dr. Lim said.

“The American Association of Endocrine Surgeons and Society of Interventional Radiology are both working towards getting this valuable treatment more readily accepted by more insurance companies,” he said.

Dr. Lim and Dr. Singer have reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Adding statins to steroids in thyroid eye disease improves outcomes

Article Type
Changed
Mon, 10/11/2021 - 14:37

Treatment of Graves’ orbitopathy with statins in combination with glucocorticoids improves outcomes in people with high cholesterol, and the benefit may extend to those without, results from a new randomized clinical trial suggest.

“Our results [indicate] that adding atorvastatin to intravenous glucocorticoids seems to potentiate the effects of glucocorticoids,” senior author Michelle Marino, MD, associate professor of endocrinology in the department of clinical and experimental medicine at the University of Pisa, Italy, told this news organization.

“At least in hypercholesterolemic patients with moderate to severe and active Graves’ orbitopathy, atorvastatin should be considered in addition to intravenous glucocorticoids,” Dr. Marino said.

The study was presented by first author Giulia Lanzolla, MD, also of the University of Pisa and University Hospital of Pisa, at the virtual annual meeting of the American Thyroid Association.

Hypercholesterolemia, well known to promote systemic inflammation, has previously been linked to Graves’ orbitopathy, and statin use has also been associated with a possible protective effect against developing the thyroid eye disease.

Furthermore, patients with Graves’ orbitopathy and high cholesterol levels, compared with those with normal cholesterol, have been shown to have poorer responses to treatment with glucocorticoids, which have long been the first line of treatment.

Asked for comment on the findings, Marius Stan, MD, a consultant in the division of endocrinology, diabetes, metabolism, and nutrition, Mayo Clinic, Rochester, Minn., said he didn’t think the outcome measure used – a composite of a variety of measures of thyroid eye disease – was the best way to truly understand the benefits.
 

Statins for Graves’ orbitopathy (STAGO) study details

For a better understanding of the effects with and without the addition of statins in a randomized trial, Dr. Lanzolla and colleagues enrolled 88 patients with high cholesterol and moderate to severe active Graves’ orbitopathy in the phase 2 STAGO trial.

Patients were randomized to two groups of 44 patients each: intravenous (IV) methylprednisolone, 500 mg per week for 6 weeks followed by 250 mg per week for another 6 weeks, plus atorvastatin 20 mg daily for 12 weeks; or the same methylprednisolone regimen alone for 12 weeks.

The primary endpoint was a composite of Graves’ orbitopathy outcomes that included measures of exophthalmos, clinical activity score, eyelid aperture, diplopia, and visual acuity, assessed in the modified intention-to-treat population.

The trial met the primary composite endpoint, with 51.2% of those treated with statins achieving the outcome (21 of 41) versus 28.2% (11 of 39) of those treated with glucocorticoids alone (odds ratio, 2.76; P = .03).
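
As a rough cross-check on that effect size, an unadjusted odds ratio can be computed directly from the response counts above; this is only a sketch, and because the published figure of 2.76 is likely adjusted for covariates, the crude value comes out slightly lower.

    # Sketch: crude (unadjusted) odds ratio from the response counts above.
    # The published OR of 2.76 is likely covariate-adjusted, so the crude
    # value is expected to differ slightly.
    statin_responders, statin_total = 21, 41
    control_responders, control_total = 11, 39

    statin_odds = statin_responders / (statin_total - statin_responders)      # 21 / 20
    control_odds = control_responders / (control_total - control_responders)  # 11 / 28

    crude_odds_ratio = statin_odds / control_odds
    print(f"Crude odds ratio: {crude_odds_ratio:.2f}")  # about 2.67 vs. the reported 2.76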

The study also achieved secondary outcomes, with 43.9% in the statin group having a response to treatment at 12 weeks versus 23% in the glucocorticoid group (OR 2.60; P = .05). The statin group also had a greater improvement in quality of life measures (P = .03).

The glucocorticoid-only group meanwhile had a significantly greater rate of Graves’ orbitopathy relapse at 24 weeks, with six relapses versus none in the statin group (15.3% vs. 0.0%; OR, 0.06; P = .01).

There were no significant differences in low-density lipoprotein (LDL) cholesterol between those who did and did not respond to treatment in the statin group.

The most likely explanation for those findings is that “atorvastatin acts through its pleiotropic action, resulting in an anti-inflammatory effect,” Dr. Marino said.

“In addition, the effect may be related to the capability of statins to inhibit fibroblast proliferation,” Dr. Marino added.

“Total cholesterol had a behavior similar to LDL cholesterol, [while] HDL cholesterol did not change across the study.”

There were no major adverse events related to atorvastatin, with one patient in each group requiring treatment discontinuation.

In the rapidly evolving landscape of treatments for Graves’ orbitopathy, including the recent Food and Drug Administration approval of teprotumumab for thyroid eye disease, the potential role of statins remains to be seen, Dr. Marino noted.

“Graves’ orbitopathy is a rather complex disease, and in its mild to moderate forms it is very rare for a patient to require only a single treatment,” Dr. Marino explained. “Rehabilitative surgery is needed quite often once the disease is inactive.”

The authors noted that a composite overall Graves’ orbitopathy outcome was used as the primary endpoint because the alternative of a change in single eye features may not reflect a true modification of Graves’ orbitopathy and could be affected by a number of unrelated factors.

“By contrast, the composite evaluation offers a more realistic picture,” the authors wrote in the article, which was published in The Lancet Diabetes & Endocrinology.
 

 

 

Composite outcome not best way of assessing effects of statins

Dr. Stan elaborated on his criticism of the trial.

“The study has interesting results but fails to show that any particular eye feature is benefited by the combination therapy, showing only the composite outcome to be improved,” he told this news organization.

“Unfortunately, that is hard to extrapolate to patient care, where one or another of Graves’ orbitopathy features are present and are the intended target of therapy,” he said.

Dr. Stan added that IV glucocorticoids are meanwhile also changing the landscape of treatment of thyroid eye disease.

“This ... current plan is to recommend a more individualized approach, depending on what is the main problem for that thyroid eye disease case,” he explained.

Dr. Marino noted that the authors are planning a double-blind, placebo-controlled phase 3 clinical trial of the statin/glucocorticoid combination to include patients regardless of their cholesterol levels.

The study received funding from Associazione Allievi Endocrinologia Pisana. The authors have reported no relevant financial relationships. Dr. Stan is on the advisory board for Horizon Pharma/Immunovant and provides general consulting for VasaraGen/Septerna and ValenzaBio/Medicxi.

A version of this article first appeared on Medscape.com.
