Prescription cascade more likely after CCBs than other hypertension meds

Elderly adults with hypertension who are newly prescribed a calcium-channel blocker (CCB), compared to other antihypertensive agents, are at least twice as likely to be given a loop diuretic over the following months, a large cohort study suggests.

The likelihood remained elevated for as long as a year after the start of a CCB and was more pronounced when comparing CCBs to any other kind of medication.

“Our findings suggest that many older adults who begin taking a CCB may subsequently experience a prescribing cascade” when loop diuretics are prescribed for peripheral edema, a known adverse effect of CCBs that is misinterpreted as a new medical condition, Rachel D. Savage, PhD, Women’s College Hospital, Toronto, Canada, told theheart.org/Medscape Cardiology.

Edema from CCBs results from fluid redistribution, not fluid overload, and “treating euvolemic individuals with a diuretic places them at increased risk of overdiuresis, leading to falls, urinary incontinence, acute kidney injury, electrolyte imbalances, and a cascade of other downstream consequences to which older adults are especially vulnerable,” explain Savage and coauthors of the analysis, published online February 24 in JAMA Internal Medicine.

Overall, 1.4% of the cohort had been prescribed a loop diuretic, and 4.5% had been given any diuretic, within 90 days after starting a CCB. The corresponding rates were 0.7% and 3.4%, respectively, for patients who had started on an ACE inhibitor or angiotensin receptor blocker (ARB) rather than a CCB.

Also, Savage observed, “the likelihood of being prescribed a loop diuretic following initiation of a CCB changed over time and was greatest 61 to 90 days postinitiation.” At that point, it was increased 2.4 times compared with initiation of an ACE inhibitor or an ARB in an adjusted analysis and increased almost 4 times compared with starting on any non-CCB agent.

Importantly, the actual prevalence of peripheral edema among those started on CCBs, ACE inhibitors, ARBs, or any non-CCB medication was not available in the data sets.

However, “the main message for clinicians is to consider medication side effects as a potential cause for new symptoms when patients present. We also encourage patients to ask prescribers about whether new symptoms could be caused by a medication,” senior author Lisa M. McCarthy, PharmD, told theheart.org/Medscape Cardiology.

“If a patient experiences peripheral edema while taking a CCB, we would encourage clinicians to consider whether the calcium-channel blocker is still necessary, whether it could be discontinued or the dose reduced, or whether the patient can be switched to another therapy,” she said.

Based on the current analysis, if the rate of CCB-induced peripheral edema is assumed to be 10%, which is consistent with the literature, then “potentially 7% to 14% of people who develop edema while taking a calcium channel blocker may then receive a loop diuretic,” an accompanying editorial notes.
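
The editorial does not spell out that arithmetic, but one plausible reconstruction (an assumption here, not the editorialists’ stated method) divides the cohort’s 90-day loop-diuretic rates by the assumed 10% edema incidence, with and without subtracting the 0.7% background rate seen with ACE inhibitors or ARBs:

$$\frac{1.4\%}{10\%} = 14\%, \qquad \frac{1.4\% - 0.7\%}{10\%} = 7\%$$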

“Patients with polypharmacy are at heightened risk of being exposed to [a] series of prescribing cascades if their current use of medications is not carefully discussed before the decision to add a new antihypertensive,” observe Timothy S. Anderson, MD, Beth Israel Deaconess Medical Center, Boston, Massachusetts, and Michael A. Steinman, MD, San Francisco Veterans Affairs Medical Center and University of California, San Francisco.

“The initial prescribing cascade can set off many other negative consequences, including adverse drug events, potentially avoidable diagnostic testing, and hospitalizations,” the editorialists caution.

“Identifying prescribing cascades and their consequences is an important step to stem the tide of polypharmacy and inform deprescribing efforts.”

The analysis was based on administrative data from almost 340,000 community-dwelling adults aged 66 years or older with hypertension who received new drug prescriptions during the 5 years ending in September 2016, the report notes. Their mean age was 74.5 years, and 56.5% were women.

The data set included 41,086 patients who were newly prescribed a CCB; 66,494 who were newly prescribed an ACE inhibitor or ARB; and 231,439 newly prescribed any medication other than a CCB. The prescribed CCB was amlodipine in 79.6% of patients.

Although loop diuretics could sometimes have been prescribed as a second-tier antihypertensive in the absence of peripheral edema, “we made efforts, through the design of our study, to limit this where possible,” Savage said in an interview.

For example, the focus was on loop diuretics, which aren’t generally recommended for blood-pressure lowering. Also, patients with heart failure and those with a recent history of diuretic or other antihypertensive medication use had been excluded, she said.

“As such, our cohort comprised individuals with new-onset or milder hypertension for whom diuretics would be unlikely to be prescribed as part of guideline-based hypertension management.”

Although amlodipine was the most commonly prescribed CCB, the potential for a prescribing cascade seemed to be a class effect and to apply at a range of dosages.

That was unexpected, McCarthy observed, because “peripheral edema occurs more commonly in people taking dihydropyridine CCBs, like amlodipine, compared to non–dihydropyridine CCBs, such as verapamil and diltiazem.”

Savage, McCarthy, their coauthors, and the editorialists have disclosed no relevant financial relationships.
 

This article first appeared on Medscape.com.

Varied nightly bedtime, sleep duration linked to CVD risk

People who frequently vary the amount they sleep and the time they go to bed each night are twofold more likely to develop cardiovascular disease (CVD), independent of traditional CVD risk factors, new research suggests.

Prior studies have focused on shift workers because night shift work disrupts circadian rhythm and increases CVD risk. But it is increasingly recognized that circadian disruption may occur outside of shift work and accumulate over time, particularly given modern lifestyle factors such as increased use of mobile devices and television at night, said study coauthor Tianyi Huang, ScD, MSc, of Brigham and Women’s Hospital and Harvard Medical School in Boston, Massachusetts.

“Even if they tend to go to sleep at certain times, by following that lifestyle or behavior, it can interfere with their planned sleep timing,” he said.

“One thing that surprised me in this sample is that about one third of participants have irregular sleep patterns that can put them at increased risk of cardiovascular disease. So I think the prevalence is higher than expected,” Huang added.

As reported today in the Journal of the American College of Cardiology, the investigators used data from 7-day wrist actigraphy, 1 night of at-home polysomnography, and sleep questionnaires to assess sleep duration and sleep-onset timing among 1,992 Multi-Ethnic Study of Atherosclerosis (MESA) participants, aged 45 to 84 years, who were free of CVD and prospectively followed for a median of 4.9 years.

A total of 786 patients (39.5%) had sleep duration standard deviation (SD) > 90 minutes and 510 (25.6%) had sleep-onset timing SD > 90 minutes.
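
To make the irregularity metric concrete, here is a minimal sketch, with illustrative values and hypothetical variable names rather than study data or study code, of computing a 7-day sleep-duration SD and applying the 90-minute threshold:

```python
from statistics import stdev

# Seven nights of actigraphy-derived sleep durations, in minutes
# (illustrative values only, not MESA data).
durations = [420, 360, 510, 300, 450, 390, 540]

# The study's irregularity metric is the 7-day SD of nightly sleep
# duration; whether MESA used the sample or population SD is not
# stated in this article, so the sample SD is assumed here.
duration_sd = stdev(durations)

print(f"sleep-duration SD: {duration_sd:.0f} min")        # ~84 min
print("exceeds 90-minute threshold:", duration_sd > 90)   # False
```

The analogous sleep-onset timing SD needs extra care in practice because clock times wrap around midnight; how the investigators handled that is not described in this article.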

During follow-up, there were 111 incident CVD events, including myocardial infarction, coronary heart disease death, stroke, and other coronary events.

Compared with people who had less than 1 hour of variation in sleep duration, the risk for incident CVD was 9% higher for people whose sleep duration varied by 61 to 90 minutes (hazard ratio [HR], 1.09; 95% confidence interval [CI], 0.62 - 1.92), even after controlling for a variety of cardiovascular and sleep-related risk factors such as body mass index, systolic blood pressure, smoking status, total cholesterol, average sleep duration, insomnia symptoms, and sleep apnea.

Moreover, the adjusted CVD risk was substantially increased with 91 to 120 minutes of variation (HR, 1.59; 95% CI, 0.91 - 2.76) and more than 120 minutes of variation in sleep duration (HR, 2.14; 95% CI, 1.24 - 3.68).

Every 1-hour increase in sleep duration SD was associated with a 36% higher CVD risk (HR, 1.36; 95% CI, 1.07 - 1.73).

Compared with people with no more than a half hour of variation in nightly bedtimes, the adjusted hazard ratios for CVD were 1.16 (95% CI, 0.64 - 2.13), 1.52 (95% CI, 0.81 - 2.88), and 2.11 (95% CI, 1.13 - 3.91) when bedtimes varied by 31 to 60 minutes, 61 to 90 minutes, and more than 90 minutes, respectively.

For every 1-hour increase in sleep-onset timing SD, the risk of CVD was 18% higher (HR, 1.18; 95% CI, 1.06 - 1.31).

“The results are similar for the regularity of sleep timing and the regularity of sleep duration, which means that both can contribute to circadian disruption and then lead to development of cardiovascular disease,” Huang said.

The article is important and signals that sleep is a marker and possibly a mediator of cardiovascular risk, said Harlan Krumholz, MD, of Yale School of Medicine in New Haven, Connecticut, who was not involved with the study.

“What I like about this is it’s a nice longitudinal, epidemiologic study with not just self-report, but sensor-detected sleep, that has been correlated with well-curated and adjudicated outcomes to give us a strong sense of this association,” he told theheart.org/Medscape Cardiology. “And also, that it goes beyond just the duration — they combine the duration and timing in order to give a fuller picture of sleep.”

Nevertheless, Krumholz said researchers are only beginning to be able to quantify the various dimensions of sleep and to determine whether sleep is a reflection of underlying physiologic issues or whether erratic sleep patterns themselves have a toxic effect on overall health.

Questions also remain about the mechanism behind the association, whether the increased risk is universal or more harmful for some people, and the best way to measure factors during sleep that can most comprehensively and precisely predict risk.

“As we get more information flowing in from sensors, I think we will begin to develop more sophisticated approaches toward understanding risk, and it will be accompanied by other studies that will help us understand whether, again, this is a reflection of other processes that we should be paying attention to or whether it is a cause of disease and risk,” Krumholz said.

Subgroup analyses suggested positive associations between irregular sleep and CVD in African Americans, Hispanics, and Chinese Americans but not in whites. This could be because sleep irregularity, in both timing and duration, was substantially higher in minorities, especially African Americans, but it may also be a result of chance, because the study sample is relatively small, Huang explained.

The authors note that the overall findings are biologically plausible because of their previous work linking sleep irregularity with metabolic risk factors that predispose to atherosclerosis, such as obesity, diabetes, and hypertension. Participants with irregular sleep tended to have worse baseline cardiometabolic profiles, but this only explained a small portion of the associations between sleep irregularity and CVD, they note.

Other possible explanations include circadian clock genes, such as CLOCK, PER2, and BMAL1, which have been shown experimentally to control a broad range of cardiovascular functions, from blood pressure and endothelial function to vascular thrombosis and cardiac remodeling.

Irregular sleep may also influence the rhythms of the autonomic nervous system, and behavioral rhythms with regard to timing and/or amount of eating or exercise.

Further research is needed to understand the mechanisms driving the associations, the impact of sleep irregularity on individual CVD outcomes, and to determine whether a 7-day SD of more than 90 minutes for either sleep duration or sleep-onset timing can be used clinically as a threshold target for promoting cardiometabolically healthy sleep, Huang said.

“When providers communicate with their patients regarding strategies for CVD prevention, usually they focus on healthy diet and physical activity; and even when they talk about sleep, they talk about whether they have good sleep quality or sufficient sleep,” he said. “But one thing they should provide is advice regarding sleep regularity and [they should] recommend their patients follow a regular sleep pattern for the purpose of cardiovascular prevention.”

In a related editorial, Olaf Oldenburg, MD, Ludgerus-Kliniken Münster, Clemenshospital, Münster, Germany, and Jens Spiesshoefer, MD, Institute of Life Sciences, Scuola Superiore Sant’Anna, Pisa, Italy, write that the observed independent association between sleep irregularity and CVD “is a particularly striking finding given that impaired circadian rhythm is likely to be much more prevalent than the extreme example of shift work.”

They call on researchers to utilize big data to facilitate understanding of the association and say it is essential to test whether experimental data support the hypothesis that altered circadian rhythms would translate into unfavorable changes in 24-hour sympathovagal and neurohormonal balance, and ultimately CVD.

The present study “will, and should, stimulate much needed additional research on the association between sleep and CVD that may offer novel approaches to help improve the prognosis and daily symptom burden of patients with CVD, and might make sleep itself a therapeutic target in CVD,” the editorialists conclude.

This research was supported by contracts from the National Heart, Lung, and Blood Institute (NHLBI), and by grants from the National Center for Advancing Translational Sciences. The MESA Sleep Study was supported by an NHLBI grant. Huang was supported by a career development grant from the National Institutes of Health.

Krumholz and Oldenburg have disclosed no relevant financial relationships. Spiesshoefer is supported by grants from the Else-Kröner-Fresenius Stiftung, the Innovative Medical Research program at the University of Münster, and Deutsche Herzstiftung; and by young investigator research support from Scuola Superiore Sant’Anna Pisa. He also has received travel grants and lecture honoraria from Boehringer Ingelheim and Chiesi.
 

Source: J Am Coll Cardiol. 2020 Mar 2. doi: 10.1016/j.jacc.2019.12.054.

This article first appeared on Medscape.com.

Transradial access gains converts among U.S. interventional neurologists

The safety advantage that has already coaxed U.S. interventional cardiologists to switch many of their routine catheterizations from femoral-artery entry in the groin to a radial-artery approach through a patient’s wrist is now prompting a similar shift among U.S. interventional neurologists, who are increasingly pivoting to transradial access when performing many neurovascular procedures.

“It’s growing dramatically in U.S. practice. It may be hype, but there is big excitement. We are still in an assessment mode, but the adoption rate has been high,” Raul G. Nogueira, MD, said in an interview during the International Stroke Conference sponsored by the American Heart Association. “The big advantage [of transradial catheterization entry] is elimination of groin complications, some of which can be pretty bad. Is it safe for the brain? It’s probably okay, but that needs more study,” said Dr. Nogueira, professor of neurology at Emory University and director of the Neurovascular Service at the Grady Marcus Stroke and Neuroscience Center in Atlanta.

His uncertainty stems from the more difficult route taken to advance a catheter from the wrist into brain vessels, a maneuver that requires significant manipulation of the catheter tip, unlike the path from the right radial artery into the heart’s arteries, a “straight shot,” he explained. To reach the brain’s vasculature, the tip must execute a spin “that may scrape small emboli from the arch or arteries, so we need to look at this a little more carefully,” ideally in a prospective, randomized study, he said. “We need to see whether the burden of [magnetic resonance] lesions is any higher when you go through the radial [artery].”

Some of the first-reported, large-scale U.S. experiences using a radial-artery approach for various neurovascular procedures, including a few thrombectomy cases, came in a series of 1,272 patients treated at any of four U.S. centers during July 2018 to June 2019, a period when the neurovascular staffs at all four centers transitioned from primarily using femoral-artery access to using radial access as their default mode. During the 12-month transition period, overall use of radial access at all four centers rose from roughly a quarter of all neurovascular interventions during July to September 2018 to closer to 80% by April to June 2019, Eyad Almallouhi, MD, reported at the conference.



During the entire 12 months, the operators achieved a 94% rate of successfully completed procedures using radial access, a rate that rose from about 88% during the first quarter to roughly 95% during the fourth quarter tracked, said Dr. Almallouhi, a neurologist at the Medical University of South Carolina in Charleston. The rate of crossover from transradial to transfemoral access was just under 6% overall, with a nearly 14% crossover rate during the first quarter that dropped to around 5% for the rest of the transition year. Throughout the study year, crossovers occurred at a 12% rate for interventional procedures and a 5% rate for diagnostic procedures.

None of the transradial patients had a major access-site complication, and minor complications occurred in less than 2% of the patients, including 11 with a forearm hematoma, 6 with forearm pain, and 5 with oozing at their access site. The absence of any major access-site complications among the transradial-access patients in this series contrasts with a recent report of a 1.7% rate of major complications secondary to femoral-artery access for mechanical thrombectomy in a combined analysis of data from seven published studies that included 660 thrombectomy procedures (Am J Neuroradiol. 2019 Feb. doi: 10.3174/ajnr.A6423). The other three centers that participated in the study Dr. Almallouhi presented were the University of Miami, Thomas Jefferson University in Philadelphia, and the University of Pittsburgh.

Of the 1,272 total procedures studied, 83% were diagnostic procedures, which had an overall 95% success rate, and 17% were interventional procedures, which had a success rate of 89%. The interventional transradial procedures included 62 primary coilings of aneurysms, 44 stent-assisted aneurysm coilings, 40 patients who underwent a flow diversion, 21 balloon-assisted aneurysm coilings, and 24 patients who underwent stroke thrombectomy.

The devices commonly used for thrombectomy are often too large to allow for radial-artery access, noted Dr. Nogueira. For urgent interventions like thrombectomy “we use balloon-guided catheters that are large-bore and don’t fit well in the radial,” he said, although thrombectomy via the radial artery without a balloon-guided catheter is possible for clots located in the basilar artery. Last year, researchers in Germany reported using a balloon-guided catheter to perform mechanical thrombectomy via the radial artery (Interv Neuroradiol. 2019 Oct 1;25[5]:508-10). But it’s a different story for elective, diagnostic procedures. “I have moved most of these to transradial,” Dr. Nogueira said. He and his coauthors summarized the case for transradial access for cerebral angiography in a recent review; in addition to enhanced safety, they cited other advantages, including improved patient satisfaction and reduced cost because of a shorter length of stay (Interv Cardiol Clin. 2020 Jan;9[1]:75-86).

Despite his enthusiasm and the enthusiasm of other neurointerventionalists for the transradial approach, other stroke neurologists have been more cautious and slower to shift away from the femoral approach. “Our experience has been that for most cases it’s a bit more challenging to access the cervical vessels from the radial artery than from the traditional femoral approach. For arches with complex anatomy, however, the transradial approach can be of benefit in some cases, depending on the angles that need to be traversed,” commented Jeremy Payne, MD, director of the Banner Center for Neurovascular Medicine and medical director of the Banner—University Medical Center Phoenix Comprehensive Stroke Program. Dr. Payne highlighted that, while he is not an interventionalist himself, he and his interventional staff have regularly discussed the transradial option.

“In the cardiology literature the radial approach has been very successful, with better overall safety than the traditional femoral approach. Largely this seems to have to do with the anatomy of the aortic arch. It’s simply a more direct approach to the coronaries via the right radial artery; getting the wire into the correct vessel is significantly more difficult the more acute the angle it has to traverse,” such as when the target is an intracerebral vessel, Dr. Payne said in an interview.

“Our experience in the past 6 months has been about 25% transradial for some of our procedures, mainly diagnostic angiograms. We don’t find any difference in safety, however, as our transfemoral procedures are already very safe. One of the benefits of a transradial approach has been that a closure device may not be needed, with fewer vascular complications at the access site, such as fistula formation. We use ultrasound for access, and have not seen a difference in those approaches at all so far. One might argue that using ultrasound to establish access would slow us down, but so far our fastest case start-to-recanalization time in an acute stroke this year was 6 minutes, so speed does not appear to be a limiting issue. Another concern overall for transradial access is the potential limitation in the tools we may be able to deploy, given the smaller size of the vessel. It is reassuring [in the report from Dr. Almallouhi] that a variety of cases were successfully completed via this approach. However, fewer than 2% of their cases [24 patients] were apparently emergent, acute strokes, lending no specific support to that context. I do not expect that to change based on this paper,” Dr. Payne concluded.

“It is not clear to me that transradial neurointervention will change much. We have excellent safety data for the femoral approach, a proven track record of efficacy, and for most patients it seems to afford a somewhat wider range of tools that can be deployed, with simpler anatomy for accessing the cervical vessels in most arches. It is reassuring that the results reported by Dr. Almallouhi did not suggest negative outcomes, and as such I suspect the transradial approach at least gives us an additional option in a minority of patients. We have seen in the past 5-10 years an explosion of tools for the endovascular treatment of stroke; transradial access represents another potential strategy that appears so far to be safe,” Dr. Payne said.

Drs. Nogueira, Almallouhi, and Payne had no relevant disclosures.

SOURCE: Almallouhi E et al. Stroke. 2020 Feb;51(suppl 1):A64.

“Our experience in the past 6 months has been about 25% transradial for some of our procedures, mainly diagnostic angiograms. We don’t find any difference in safety, however, as our transfemoral procedures are already very safe. One of the benefits of a transradial approach has been that a closure device may not be needed, with fewer vascular complications at the access site, such as fistula formation. We use ultrasound for access, and have not seen a difference in those approaches at all so far. One might argue that using ultrasound to establish access would slow us down, but so far our fastest case start-to-recanalization time in an acute stroke this year was 6 minutes, so speed does not appear to be a limiting issue. Another concern overall for transradial access is the potential limitation in the tools we may be able to deploy, given the smaller size of the vessel. It is reassuring [in the report from Dr. Almallouhi] that a variety of cases were successfully completed via this approach. However, fewer than 2% of their cases [24 patients] were apparently emergent, acute strokes, lending no specific support to that context. I do not expect that to change based on this paper,” Dr. Payne concluded.

“It is not clear to me that transradial neurointervention will change much. We have excellent safety data for the femoral approach, a proven track record of efficacy, and for most patients it seems to afford a somewhat wider range of tools that can be deployed, with simpler anatomy for accessing the cervical vessels in most arches. It is reassuring that the results reported by Dr. Almallouhi did not suggest negative outcomes, and as such I suspect the transradial approach at least gives us an additional option in a minority of patients. We have seen in the past 5-10 years an explosion of tools for the endovascular treatment of stroke; transradial access represents another potential strategy that appears so far to be safe,” Dr. Payne said.

Drs. Nogueira, Almallouhi, and Payne had no relevant disclosures.

SOURCE: Almallouhi E et al. Stroke. 2020 Feb;51(suppl 1):A64.

The safety advantage that has already coaxed U.S. interventional cardiologists to switch many of their routine catheterizations from femoral-artery entry in the groin to a radial-artery approach through a patient’s wrist is now prompting a similar shift among U.S. interventional neurologists, who are increasingly pivoting to transradial access when performing many neurovascular procedures.

“It’s growing dramatically in U.S. practice. It may be hype, but there is big excitement. We are still in an assessment mode, but the adoption rate has been high,” Raul G. Nogueira, MD, said in an interview during the International Stroke Conference sponsored by the American Heart Association. “The big advantage [of transradial catheterization entry] is elimination of groin complications, some of which can be pretty bad. Is it safe for the brain? It’s probably okay, but that needs more study,” said Dr. Nogueira, professor of neurology at Emory University and director of the Neurovascular Service at the Grady Marcus Stroke and Neuroscience Center in Atlanta.

His uncertainty stems from the more difficult route taken to advance a catheter from the wrist into brain vessels, a maneuver that requires significant manipulation of the catheter tip, unlike the path from the right radial artery into the heart’s arteries, a “straight shot,” he explained. To reach the brain’s vasculature, the tip must execute a spin “that may scrape small emboli from the arch or arteries, so we need to look at this a little more carefully.” Ideally, he said, that closer look would come from a prospective, randomized study. “We need to see whether the burden of [magnetic resonance] lesions is any higher when you go through the radial [artery].”

Some of the first-reported, large-scale U.S. experiences using a radial-artery approach for various neurovascular procedures, including a few thrombectomy cases, came in a series of 1,272 patients treated at any of four U.S. centers during July 2018 to June 2019, a period when the neurovascular staffs at all four centers transitioned from primarily using femoral-artery access to using radial access as their default mode. During the 12-month transition period, overall use of radial access at all four centers rose from roughly a quarter of all neurovascular interventions during July to September 2018 to closer to 80% by April to June 2019, Eyad Almallouhi, MD, reported at the conference.

During the entire 12 months, the operators ran up a 94% rate of successfully completed procedures using radial access, a rate that rose from about 88% during the first quarter to roughly 95% success during the fourth quarter tracked, said Dr. Almallouhi, a neurologist at the Medical University of South Carolina in Charleston. The rate of crossover from what began as a transradial procedure but switched to transfemoral was just under 6% overall, with a nearly 14% crossover rate during the first quarter that then dropped to around 5% for the rest of the transition year. Crossovers for interventional procedures throughout the study year occurred at a 12% rate, while crossovers for diagnostic procedures occurred at a 5% clip throughout the entire year.

None of the transradial patients had a major access-site complication, and minor complications occurred in less than 2% of the patients, including 11 with a forearm hematoma, 6 with forearm pain, and 5 with oozing at their access site. The absence of any major access-site complications among the transradial-access patients in this series contrasts with a recent report of a 1.7% rate of major complications secondary to femoral-artery access for mechanical thrombectomy in a combined analysis of data from seven published studies that included 660 thrombectomy procedures (Am J Neuroradiol. 2019 Feb. doi: 10.3174/ajnr.A6423). The other three centers that participated in the study Dr. Almallouhi presented were the University of Miami, Thomas Jefferson University in Philadelphia, and the University of Pittsburgh.

Of the 1,272 total procedures studied, 83% were diagnostic procedures, which had an overall 95% success rate, and 17% were interventional procedures, which had a success rate of 89%. The interventional transradial procedures included 62 primary coilings of aneurysms, 44 stent-assisted aneurysm coilings, 40 patients who underwent a flow diversion, 21 balloon-assisted aneurysm coilings, and 24 patients who underwent stroke thrombectomy.
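
As a rough consistency check, the overall crossover rate should be close to the case-mix-weighted average of the diagnostic and interventional crossover rates. The minimal sketch below uses only the percentages reported above; the variable names are ours:

```python
# Case mix and per-category crossover rates as reported in the series.
diagnostic_share, interventional_share = 0.83, 0.17
diagnostic_crossover, interventional_crossover = 0.05, 0.12

# Weighted average across the 1,272 procedures.
overall_crossover = (diagnostic_share * diagnostic_crossover
                     + interventional_share * interventional_crossover)

print(f"{overall_crossover:.1%}")  # ~6.2%, in line with the roughly 6% overall rate
```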

The devices commonly used for thrombectomy are often too large for radial-artery access, noted Dr. Nogueira. For urgent interventions like thrombectomy “we use balloon-guided catheters that are large-bore and don’t fit well in the radial,” he said, although thrombectomy via the radial artery without a balloon-guided catheter is possible for clots located in the basilar artery. Last year, researchers in Germany reported using a balloon-guided catheter to perform mechanical thrombectomy via the radial artery (Interv Neuroradiol. 2019 Oct 1;25[5]:508-10). But it’s a different story for elective, diagnostic procedures. “I have moved most of these to transradial,” Dr. Nogueira said. He and his coauthors summarized the case for transradial access for cerebral angiography in a recent review; in addition to enhanced safety, they cited other advantages, including improved patient satisfaction and reduced cost because of a shorter length of stay (Interv Cardiol Clin. 2020 Jan;9[1]:75-86).

Despite Dr. Nogueira’s enthusiasm and that of other neurointerventionalists for the transradial approach, other stroke neurologists have been more cautious and slower to shift away from the femoral approach. “Our experience has been that for most cases it’s a bit more challenging to access the cervical vessels from the radial artery than from the traditional femoral approach. For arches with complex anatomy, however, the transradial approach can be of benefit in some cases, depending on the angles that need to be traversed,” commented Jeremy Payne, MD, director of the Banner Center for Neurovascular Medicine and medical director of the Banner—University Medical Center Phoenix Comprehensive Stroke Program. Dr. Payne noted that, while he is not an interventionalist himself, he and his interventional staff have regularly discussed the transradial option.

“In the cardiology literature the radial approach has been very successful, with better overall safety than the traditional femoral approach. Largely this seems to do with the anatomy of the aortic arch. It’s simply a more direct approach to the coronaries via the right radial artery; getting the wire into the correct vessel is significantly more difficult the more acute the angle it has to traverse,” such as when the target is an intracerebral vessel, Dr. Payne said in an interview.

“Our experience in the past 6 months has been about 25% transradial for some of our procedures, mainly diagnostic angiograms. We don’t find any difference in safety, however, as our transfemoral procedures are already very safe. One of the benefits of a transradial approach has been that a closure device may not be needed, with fewer vascular complications at the access site, such as fistula formation. We use ultrasound for access, and have not seen a difference in those approaches at all so far. One might argue that using ultrasound to establish access would slow us down, but so far our fastest case start-to-recanalization time in an acute stroke this year was 6 minutes, so speed does not appear to be a limiting issue. Another concern overall for transradial access is the potential limitation in the tools we may be able to deploy, given the smaller size of the vessel. It is reassuring [in the report from Dr. Almallouhi] that a variety of cases were successfully completed via this approach. However, fewer than 2% of their cases [24 patients] were apparently emergent, acute strokes, lending no specific support to that context. I do not expect that to change based on this paper,” Dr. Payne concluded.

“It is not clear to me that transradial neurointervention will change much. We have excellent safety data for the femoral approach, a proven track record of efficacy, and for most patients it seems to afford a somewhat wider range of tools that can be deployed, with simpler anatomy for accessing the cervical vessels in most arches. It is reassuring that the results reported by Dr. Almallouhi did not suggest negative outcomes, and as such I suspect the transradial approach at least gives us an additional option in a minority of patients. We have seen in the past 5-10 years an explosion of tools for the endovascular treatment of stroke; transradial access represents another potential strategy that appears so far to be safe,” Dr. Payne said.

Drs. Nogueira, Almallouhi, and Payne had no relevant disclosures.

SOURCE: Almallouhi E et al. Stroke. 2020 Feb;51(suppl 1):A64.

5-year-old boy • behavioral issues • elevated ALT and AST levels • Dx?

THE CASE

A 5-year-old boy was brought to his primary care clinic by his mother, who expressed concern about her son’s increasing impulsiveness, aggression, and difficulty staying on task at preschool and at home. The child’s medical history was unremarkable, and he was taking no medications. The family history was negative for hepatic or metabolic disease and positive for attention-deficit/hyperactivity disorder (ADHD; in the father).

The child’s growth was normal. His physical exam was remarkable for a liver edge 1 cm below his costal margin. No Kayser-Fleischer rings were present. 

Screening included a complete metabolic panel. Notable results included an alanine aminotransferase (ALT) level of 208 U/L (normal range, < 30 U/L), an aspartate transaminase (AST) level of 125 U/L (normal range, 10-34 U/L), and an alkaline phosphatase (ALP) level of 470 U/L (normal range, 93-309 U/L). Subsequent repeat laboratory testing confirmed these elevations (ALT, 248 U/L; AST, 137 U/L; ALP, 462 U/L). Ceruloplasmin levels were low (11 mg/dL; normal range, 18-35 mg/dL), and 24-hour urinary copper was not obtainable. Prothrombin/partial thromboplastin time, ammonia, lactate, total and direct bilirubin, and gamma-glutamyltransferase levels were normal.

Further evaluation included abdominal ultrasound and brain magnetic resonance imaging, both of which yielded normal results. Testing for Epstein-Barr virus; ­cytomegalovirus; hepatitis A, B, and C titers; and antinuclear, anti-smooth muscle, and anti–liver-kidney microsomal antibodies was negative.

THE DIAGNOSIS

The patient’s low ceruloplasmin prompted referral to Pediatric Gastroenterology for consultation and liver biopsy due to concern for Wilson disease. Biopsy results were consistent with, and quantitative liver copper confirmatory for, this diagnosis (FIGURE).

Liver biopsy supports Wilson disease diagnosis

Genetic testing for mutations in the ATP7B gene was performed on the patient, his mother, and his siblings (his father was unavailable). The patient, his mother, and his sister were all positive for the His1069Gln mutation; only the patient was positive for a 3990_3993 del mutation (his half-brother was negative for both mutations). The presence of 2 different mutant alleles of the ATP7B gene, one on each chromosome—the common substitution mutation, His1069Gln, in exon 14 and a 3990_3993 del TTAT mutation in exon 19—qualified the patient as a compound heterozygote.

The 3990_3993 del TTAT mutation—which to our knowledge has not been previously reported—produced a translational frame shift and premature stop codon. As others have pointed out, frame shift and nonsense mutations produce a more severe phenotype.1

Further testing was prompted by a report suggesting that codon 129 mutations of the human prion gene (HPG) influence Wilson disease.2 Compared with patients who are heterozygous (M129V) or homozygous (V129V) for valine, those who are homozygous for methionine (M129M) have delayed symptom onset.2 Our patient was heterozygous (M129V). It is interesting to speculate that HPG heterozygosity, combined with a mutation causing a stop codon, predisposed our patient to more rapid accumulation of copper and earlier age of onset.  

DISCUSSION

Wilson disease is an inherited disorder of copper metabolism.3 An inherent difficulty in its recognition, diagnosis, and management is its rarity: global prevalence is estimated at 1 in 30,000, although this varies by geographic location.1 In contrast, ADHD has a prevalence of 7.2%,4 making it roughly 2,200 times more prevalent than Wilson disease. Furthermore, abnormal liver function tests are common in children; the differential diagnosis includes etiologies such as infection (both viral and nonviral), immune-mediated inflammatory disease, drug toxicity (iatrogenic or medication-induced), anatomic abnormalities, and nonalcoholic fatty liver disease.5
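
That multiplier is straightforward arithmetic on the two prevalence estimates just cited (our own back-of-the-envelope check, not a figure taken from the references):

\[
\frac{\text{ADHD prevalence}}{\text{Wilson disease prevalence}} = \frac{0.072}{1/30\,000} = 0.072 \times 30\,000 \approx 2160
\]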

Wilson disease is remarkable, however, for being easily treatable if detected and devastating if not. Although liver abnormalities often improve with treatment, delayed diagnosis and management significantly impact neurologic recovery: An 18-month delay results in 38% of patients continuing to have major neurologic disabilities.6 Untreated, Wilson disease may be fatal within 5 years of development of neurologic symptoms.7 Thus, it has been suggested that evaluation for Wilson disease be considered in any child older than 1 year who presents with unexplained liver disease, including asymptomatic elevations of serum transaminases.8

Mutations in ATP7B on chromosome 13 are responsible for the pathology of Wilson disease9; more than 250 mutations have been identified, including substitutions, deletions, and missense mutations.10 Affected patients may be compound heterozygotes11 and/or may possess new mutations, as seen in our patient.

Although copper absorption is normal, impaired excretion causes toxic accumulation in affected organs. ATP7B’s product, ATPase 2, regulates copper excretion, as well as copper binding to apoceruloplasmin to form the carrier protein ceruloplasmin. An ATP7B abnormality would prevent the latter, making low ceruloplasmin a useful screening biomarker for Wilson disease, one that is reliable by age 1 year.8

Hepatic and neurocognitive effects. Excess copper in hepatocytes causes oxidative damage and release of copper into the circulation, with accumulation in susceptible organs (eg, brain, kidneys). Hepatocyte apoptosis is accelerated by copper’s negative effect on inhibitor of apoptosis protein.12,13 Renal tubular damage leads to Fanconi syndrome,14 in which substances such as glucose, phosphates, and potassium are excreted in urine rather than reabsorbed into the bloodstream by the kidneys. Excess copper deposition in the Descemet membrane may lead to Kayser-Fleischer ring formation.15 In the brain, copper deposition may occur in the lenticular nuclei,3 as well as in the thalamus, subthalamus, brainstem, and frontal cortex—resulting in extrapyramidal, cerebral, and mild cerebellar symptoms.6

Cognitive impairment, which may be subtle, includes increased impulsivity, impaired judgment, apathy, poor decision making, decreased attention, increased lability, slowed thinking, and memory loss.6 Behavioral manifestations include changes in school or work performance and outbursts mimicking ADHD12,16,17 as well as paranoia, depression, and bizarre behaviors.16,18 Neuropsychiatric abnormalities include personality changes, pseudoparkinsonism, dyskinesia/dysarthria, and ataxia/tremor. Younger patients with psychiatric symptoms may be labeled with depression, anxiety, obsessive-compulsive disorder, bipolar disorder, or antisocial personality disorder.6,16,18

Hepatic disease manifestations range from asymptomatic elevations in AST/ALT to acute hepatitis, mimicking infectious processes. Cirrhosis is the end result of untreated Wilson disease, with liver transplantation required if end-stage liver disease results. Rarely, patients present in fulminant hepatic failure, with death occurring if emergent liver transplantation is not performed.6,8,10

Of note, before age 10, > 80% of patients with Wilson disease present with hepatic symptoms; those ages 10 to 18 often manifest psychiatric changes.17 Kayser-Fleischer rings are common in patients with neurologic manifestations but less so in those who have hepatic presentations or are presymptomatic.6,15

Effective disease-mitigating treatment is indicated and available for both symptomatic and asymptomatic individuals and includes the copper chelators D-penicillamine (starting dose, 150-300 mg/d with a gradual weekly increase to 20 mg/kg/d) and trientine hydrochloride (a heavy metal chelating compound; starting dose, 20 mg/kg/d to a maximum of 1000 mg/d in young adults). Adverse effects of D-penicillamine include cutaneous eruptions, neutropenia, thrombocytopenia, proteinuria, and a lupus-like syndrome; therefore, trientine is increasingly being used as first-line therapy.8
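
For illustration, the weight-based trientine rule above (20 mg/kg/d, to a maximum of 1000 mg/d) reduces to a one-line calculation. A minimal Python sketch follows; the function name and the example weight are our own inventions, no titration schedule is modeled, and real dosing decisions belong to the treating specialist and the product label:

```python
def trientine_daily_dose_mg(weight_kg: float,
                            mg_per_kg: float = 20.0,
                            max_mg: float = 1000.0) -> float:
    """Weight-based daily trientine dose with a ceiling.

    Illustrative sketch of the dosing rule described in the text;
    not a substitute for specialist guidance.
    """
    return min(weight_kg * mg_per_kg, max_mg)

# A hypothetical 18-kg 5-year-old: 18 * 20 = 360 mg/d, well under the 1000 mg/d cap.
print(trientine_daily_dose_mg(18.0))  # 360.0
```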

For asymptomatic patients who have had effective chelation therapy and proven de-coppering, zinc salts are a useful follow-on therapy. Zinc’s proposed mechanism of action is induction of metallothionein in enterocytes, which promotes copper trapping and eventual excretion into the lumen. Importantly, treatment for Wilson disease is lifelong and monitoring of compliance is essential.8

Our 5-year-old patient was started on oral trientine at 20 mg/kg/d and a low-copper diet. In response to this initial treatment, the patient’s liver function tests (LFTs) normalized, and he was switched to a zinc chelate, 25 mg tid, with continuation of the low-copper diet. His LFTs have remained normal, although his urine copper levels are still elevated. He continues to be monitored periodically with LFTs and measurement of urine copper levels. He is also being treated for ADHD, as his presenting behavioral abnormalities suggestive of ADHD have not resolved.

THE TAKEAWAY

Although children presenting with symptoms consistent with ADHD often have ADHD, as was true in this case, it is important to consider other diagnoses. Unexplained elevations of liver function test values in children older than 1 year should prompt screening for Wilson disease.5,8 Additionally, other family members should be evaluated; if they have the disease, treatment should be started by age 2 years, even in the absence of symptoms.

In our patient’s case, routine screening saved the day. The complete metabolic panel revealed elevated ALT and AST levels, prompting further evaluation. Without this testing, his diagnosis likely would have been delayed, leading to progressive liver and central nervous system disease. With early identification and treatment, it is possible to stop the progression of Wilson disease.

CORRESPONDENCE
Jeffrey Taylor, MD, MS, Evangelical Community Hospital, Department of Pediatrics, 1 Hospital Drive, Lewisburg, PA 17837; jstaylor1@geisinger.edu.

References

1. Wu F, Wang J, Pu C, et al. Wilson’s disease: a comprehensive review of the molecular mechanisms. Int J Mol Sci. 2015;16:6419-6431.

2. Merle U, Stremmel W, Gessner R. Influence of homozygosity for methionine at codon 129 of the human prion gene on the onset of neurological and hepatic symptoms in Wilson disease. Arch Neurol. 2006;63:982-985.

3. Compston A. Progressive lenticular degeneration: a familial nervous disease associated with cirrhosis of the liver, by S. A. Kinnier Wilson, (From the National Hospital, and the Laboratory of the National Hospital, Queen Square, London) Brain 1912: 34; 295-509. Brain. 2009;132(pt 8):1997-2001.

4. Thomas R, Sanders S, Doust J, et al. Prevalence of attention-­deficit/hyperactivity disorder: a systematic review and meta-analysis. Pediatrics. 2015;135:e994-e1001.

5. Kang K. Abnormality on liver function test. Pediatr Gastroenterol Hepatol Nutr. 2013;16:225-232.

6. Lorincz M. Neurologic Wilson’s disease. Ann NY Acad Sci. 2010;1184:173-187.

7. Dening TR, Berrios GE, Walshe JM. Wilson’s disease and epilepsy. Brain. 1988;111(pt 5):1139-1155.

8. Socha P, Janczyk W, Dhawan A, et al. Wilson’s disease in children: a position paper by the Hepatology Committee of the European Society for Paediatric Gastroenterology, Hepatology and Nutrition. J Pediatr Gastroenterol Nutr. 2018;66:334-344.

9. Bull PC, Thomas GR, Rommens JM, et al. The Wilson disease gene is a putative copper transporting P-type ATPase similar to the Menkes gene. Nat Genet. 1993;5:327-337.

10. Ala A, Schilsky ML. Wilson disease: pathophysiology, diagnosis, treatment and screening. Clin Liver Dis. 2004;8:787-805, viii.

11. Thomas GR, Forbes JR, Roberts EA, et al. The Wilson disease gene: spectrum of mutations and their consequences. Nat Genet. 1995;9:210-217.

12. Pfeiffer RF. Wilson’s disease. Semin Neurol. 2007;27:123-132.

13. Das SK, Ray K. Wilson’s disease: an update. Nat Clin Pract Neurol. 2006;2:482-493.

14. Morgan HG, Stewart WK, Lowe KG, et al. Wilson’s disease and the Fanconi syndrome. Q J Med. 1962;31:361-384.

15. Wiebers DO, Hollenhorst RW, Goldstein NP. The ophthalmologic manifestations of Wilson’s disease. Mayo Clin Proc. 1977;52:409-416.

16. Jackson GH, Meyer A, Lippmann S. Wilson’s disease: psychiatric manifestations may be the clinical presentation. Postgrad Med. 1994;95:135-138.

17. O’Conner JA, Sokol RJ. Copper metabolism and copper storage disorders. In: Suchy FJ, Sokol RJ, Balistreri WF, eds. Liver Disease in Children. 3rd ed. New York, NY: Cambridge University Press; 2007:626-660.

18. Dening TR, Berrios GE. Wilson’s disease: a longitudinal study of psychiatric symptoms. Biol Psychiatry. 1990;28:255-265.

Author and Disclosure Information

Department of Pediatrics, Geisinger Medical Center, Danville, PA (Dr. Taylor); Montana Children’s/Kalispell Regional Healthcare (Dr. Flass)

The authors reported no potential conflict of interest relevant to this article.

Mammography does not reduce breast cancer deaths in women 75 and older

While more than half of women aged 75 years and older receive annual mammograms, they do not see a reduced risk of death from breast cancer, compared with women who have stopped regular screening, according to a study published in Annals of Internal Medicine.

The lack of benefit is not because older women’s cancer risk is low; a third of breast cancer deaths occur in women diagnosed at or after age 70 years, according to study author Xabier García-Albéniz, MD, PhD, of Harvard University in Boston, and colleagues.

The lack of benefit is not because mammography is less effective in women older than 75 years; indeed, it becomes a better diagnostic tool as women age, said Otis Brawley, MD, of Johns Hopkins University, Baltimore, the author of an editorial related to the study. Rather, the lack of benefit is because breast cancer treatment in older women is less successful, he clarified.

Study details

Dr. García-Albéniz and colleagues looked at data from 1,058,013 women enrolled in Medicare across the United States during 2000-2008. All subjects were aged 70-84 years and had a life expectancy of at least 10 years, at least one recent mammogram, and no history of breast cancer.

Little randomized trial data are available on mammography and breast cancer deaths for women in their early 70s, and none for women older than 75 years. To compensate, the researchers emulated a prospective trial by comparing deaths over an 8-year period among women aged 70 years and older who either continued annual screening or stopped it. The investigators conducted separate analyses for women aged 70-74 years and those aged 75-84 years.

Breast cancer diagnoses were, not surprisingly, more frequent in the continued-screening group, but the additional diagnoses did not translate into a meaningful reduction in breast cancer deaths.

In the continued-screening group, the estimated 8-year risk for breast cancer was 5.5% in women aged 70-74 and 5.8% in women aged 75-84 years. Among women who stopped screening, the estimated 8-year risk for breast cancer was 3.9% in both age groups.

Among women aged 70-74 years, the estimated 8-year risk for breast cancer death was slightly reduced with continued screening: 2.7 deaths per 1,000 women, compared with 3.7 deaths per 1,000 women for those who stopped screening. The risk difference was –1.0 deaths per 1,000 women, and the hazard ratio was 0.78.

Among women aged 75-84 years, there was no difference in estimated 8-year risk for breast cancer death. Women treated under a continued screening protocol had 3.8 deaths per 1,000, while the stop-screening group had 3.7 deaths per 1,000. The risk difference was 0.07 deaths per 1,000 women, and the hazard ratio was 1.00.
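
To put these figures in absolute terms, the brief Python sketch below converts the per-1,000 risks into a risk difference and an implied number needed to screen. This is illustrative arithmetic on the rounded published rates, not an analysis from the paper.

# Illustrative arithmetic only (not from the paper): converts the rounded
# 8-year breast cancer death rates per 1,000 women into a risk difference
# and, when screening is favored, a number needed to screen (NNS).
def screening_summary(deaths_continue_per_1000, deaths_stop_per_1000):
    rd = round(deaths_continue_per_1000 - deaths_stop_per_1000, 2)  # per 1,000 women
    nns = round(1000 / abs(rd)) if rd < 0 else None  # women screened per death averted
    return rd, nns

print(screening_summary(2.7, 3.7))  # ages 70-74: (-1.0, 1000) -> roughly 1,000 women
                                    # screened for 8 years per death averted
print(screening_summary(3.8, 3.7))  # ages 75-84: (0.1, None) on rounded inputs
                                    # (0.07 with full precision) -> no benefit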

Interpreting the results

In the editorial accompanying this study, Dr. Brawley praised its design as “especially useful in breast cancer screening,” as “prospective randomized studies of mammography are not feasible and are perhaps no longer ethical in older women … because mammography is so widely accepted.”

In an interview, Dr. Brawley stressed that the findings do not argue for denying women aged 75 years and older mammography screening. Decisions about screening require a value judgment tailored to each individual patient’s perceived risks and benefits, he said.

In the absence of randomized trial evidence, “the jury will always be out” on the benefits of regular mammography for women 75 and older, Dr. Brawley said. “A clinical trial or a modeling study always tells you about an average person who doesn’t exist,” he added. “I predict that, in the future, we will have more parameters to tell us, ‘this is a person who’s 80 years old who is likely to benefit from screening; this is a person who is 75 years old who is unlikely to benefit.’ ”

And focusing too much on screening, he said, can divert attention from a key driver of breast cancer mortality in older women: inadequate treatment.

In the United States, Dr. Brawley said, “There’s a lot of emphasis on screening but fewer people writing about the fact that nearly 40% of American women get less than optimal treatment once they’re diagnosed.”

Dr. Brawley cited a 2013 modeling study showing that improvements in delivering current treatments would save more women even if screening rates remained unaltered (Cancer. 2013 Jul 15;119[14]:2541-8).

Among women in their 70s and 80s, Dr. Brawley said, some of the barriers to effective breast cancer care aren’t related to treatment efficacy but to travel and other logistical issues that can become more pronounced with age. “Unfortunately, there’s very little research on why, for women in their 70s and 80s, the treatments don’t work as well as they work in women 20 years younger,” he said.

Dr. García-Albéniz and colleagues’ study was funded by the National Institutes of Health. One coauthor reported financial ties to industry. Dr. Brawley disclosed no conflicts of interest related to his editorial.

SOURCE: García-Albéniz X et al. Ann Intern Med 2020. doi: 10.7326/M18-1199.


NPH insulin: It remains a good option

Article Type
Changed
Tue, 05/03/2022 - 15:11

ILLUSTRATIVE CASE

Blanche is a 54-year-old overweight woman who has had type 2 diabetes mellitus (T2DM) for 5 years. She has been optimized on both metformin (1000 mg bid) and exenatide (2 mg weekly). While taking these medications, her hemoglobin A1C (HbA1C) has dropped from 11.2% to 8.4%, and her body mass index (BMI) has declined from 35 to 31. However, she is still not at goal. You decide to start her on long-acting basal insulin. She has limited income, and she currently spends $75/month for her metformin, exenatide, atorvastatin, and lisinopril. What insulin do you prescribe?

The Centers for Disease Control and Prevention (CDC) reported that the prevalence of diabetes in the United States was 9.4% (30.3 million people) in 2015.2 Among those affected, approximately 95.8% had T2DM.2 The same report estimated that 1.5 million new cases of diabetes (6.7 per 1000 persons) were diagnosed annually among US adults ≥ 18 years of age, and that about $7900 of annual medical expenses for patients diagnosed with diabetes was directly attributable to diabetes.2

In the United States, neutral protamine Hagedorn (NPH) insulin was the most commonly used intermediate- to long-acting insulin until the introduction of the long-acting insulin analogs (insulin glargine in 2000 and insulin detemir in 2005).3 Despite being considerably more expensive than NPH insulin, long-acting insulin analogs had captured more than 80% of the total long-acting insulin market by 2010.4 The market share for NPH insulin dropped from 81.9% in 2001 to 16.2% in 2010.4

While the newer insulin analogs are significantly more expensive than NPH insulin, with higher corresponding out-of-pocket costs to patients, researchers have struggled to demonstrate greater effectiveness or definitive differences in long-term outcomes between NPH and the insulin analogs. A 2007 Cochrane review comparing NPH insulin to both glargine and detemir showed little difference in metabolic control (as measured by HbA1C) or in the rate of severe hypoglycemia. However, the rates of symptomatic, overall, and nocturnal hypoglycemia were significantly lower with the insulin analogs.5

A 2015 retrospective observational study from the Veterans Health Administration (N = 142,940) covering a 10-year period from 2000 to 2010 found no consistent differences in long-term health outcomes when comparing the use of long-acting insulin analogs to that of NPH insulin.3,6

STUDY SUMMARY

Study compares performance of basal insulin analogs to that of NPH

This retrospective, observational study included 25,489 adult patients with T2DM who were enrolled in Kaiser Permanente of Northern California, had full medical and prescription coverage, and initiated basal insulin therapy with either NPH or an insulin analog between 2006 and 2015.

The primary outcome was the time from basal insulin therapy initiation to a hypoglycemia-related emergency department (ED) visit or hospital admission. The secondary outcome was the change in HbA1C level within 1 year of initiation of basal insulin therapy.

Per 1000 person-years, there was no significant difference in hypoglycemia-related ED visits or hospital admissions between the analog and NPH groups (11.9 vs 8.8 events, respectively; between-group difference, 3.1 events; 95% confidence interval [CI], –1.5 to 7.7). HbA1C reduction was statistically greater with NPH than with the analogs (1.48 vs 1.26 percentage points; between-group difference, –0.22%; 95% CI, –0.37% to –0.09%), but this difference is most likely not clinically significant.
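
As a rough illustration of how such rates and the CI for their difference are computed, here is a minimal Python sketch. The event counts and person-time are hypothetical placeholders (the article reports only the rates), chosen so the crude rates match 11.9 and 8.8 events per 1,000 person-years; an unadjusted calculation like this will not reproduce the study’s adjusted CI.

import math

# Minimal sketch of an unadjusted rate difference per 1,000 person-years with a
# Wald-type 95% CI, treating event counts as Poisson. The counts and person-years
# below are hypothetical placeholders, not data from the study.
def rate_diff_per_1000(events_a, py_a, events_b, py_b, z=1.96):
    diff = events_a / py_a - events_b / py_b
    se = math.sqrt(events_a / py_a**2 + events_b / py_b**2)  # summed Poisson variances
    return tuple(round(1000 * v, 1) for v in (diff, diff - z * se, diff + z * se))

# Hypothetical: 238 vs 176 events, each over 20,000 person-years.
print(rate_diff_per_1000(238, 20_000, 176, 20_000))  # (3.1, 1.1, 5.1)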

WHAT’S NEW?

No clinically relevant differences between insulin analogs and NPH

This study revealed that there is no clinically relevant difference in HbA1C levels and no difference in patient-focused outcomes of hypoglycemia-related ED visits or hospital admissions between NPH insulin and the more expensive insulin analogs. This makes a strong case for a different approach to initial basal insulin therapy for patients with T2DM who need insulin for glucose control.

CAVEATS

Demographics and less severe hypoglycemia might be at issue

This retrospective, observational study had broad demographics (though with moderate underrepresentation of African American patients), minimal patient health care disparities, and good access to medications. But generalizability outside of an integrated health delivery system may be limited. The study design also is subject to confounding, as not all factors that could influence the results can be measured or controlled in an observational study. Also, less profound hypoglycemia that did not require an ED visit or hospital admission was not captured.

 

CHALLENGES TO IMPLEMENTATION

Convenience and marketing factors may hinder change

A number of convenience and marketing factors may make it hard for providers and systems to change practice and use more NPH. However, the easy-to-use insulin analog pens are matched in availability and convenience by the much less advertised NPH insulin pens produced by at least 3 major pharmaceutical companies. In addition, while the overall cost of the insulin analogs continues to be 2 to 3 times that of human NPH insulin, insurance often covers 80% or more of the cost of the analogs, narrowing the difference in patients’ copays. For example, patients may pay $30 to $40 per month for insulin analogs vs $10 to $25 per month for cheaper versions of NPH.7,8

ACKNOWLEDGMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center For Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center For Research Resources or the National Institutes of Health.

References

1. Lipska KJ, Parker MM, Moffet HH, et al. Association of initiation of basal insulin analogs vs neutral protamine Hagedorn insulin with hypoglycemia-related emergency department visits or hospital admissions and with glycemic control in patients with type 2 diabetes. JAMA. 2018;320:53-62.

2. Centers for Disease Control and Prevention. National Diabetes Statistics Report, 2017. Atlanta, GA: Centers for Disease Control and Prevention, U.S. Dept of Health and Human Services; 2017. www.cdc.gov/diabetes/pdfs/data/statistics/national-diabetes-statistics-report.pdf. Accessed January 15, 2020.

3. Prentice JC, Conlin PR, Gellad WF, et al. Long-term outcomes of analogue insulin compared with NPH for patients with type 2 diabetes mellitus. Am J Manag Care. 2015;21:e235-e243.

4. Turner LW, Nartey D, Stafford RS, et al. Ambulatory treatment of type 2 diabetes in the U.S., 1997-2012. Diabetes Care. 2014;37:985-992.

5. Horvath K, Jeitler K, Berghold A, et al. Long-acting insulin analogues versus NPH insulin (human isophane insulin) for type 2 diabetes mellitus. Cochrane Database Syst Rev. 2007;(2):CD005613.

6. Chamberlain JJ, Herman WH, Leal S, et al. Pharmacologic therapy for type 2 diabetes: synopsis of the 2017 American Diabetes Association standards of medical care in diabetes. Ann Intern Med. 2017;166:572-578.

7. GoodRx.com. Insulins. www.goodrx.com/insulins. Accessed January 20, 2020.

8. Cefalu WT, Dawes DE, Gavlak G, et al. Insulin access and affordability working group: conclusions and recommendations. Diabetes Care. 2018;41:1299-1311.

Author and Disclosure Information

Madigan Family Medicine Residency, JBLM/Tacoma, WA

DEPUTY EDITOR
Corey Lyon, DO

University of Colorado Family Medicine Residency, Denver


PRACTICE CHANGER

Consider NPH insulin for patients who require initiation of long-acting insulin therapy because it is as safe as, and more cost-effective than, basal insulin analogs.

STRENGTH OF RECOMMENDATION

B: Based on a single, large, retrospective, observational study.

Lipska KJ, Parker MM, Moffet HH, et al. Association of initiation of basal insulin analogs vs neutral protamine Hagedorn insulin with hypoglycemia-related emergency department visits or hospital admissions and with glycemic control in patients with type 2 diabetes. JAMA. 2018;320:53-62.1


No reduction in oral mucositis with folinic acid post transplant

Article Type
Changed
Tue, 03/03/2020 - 15:06

Folinic acid does not prevent oral mucositis in patients receiving a calcineurin inhibitor and methotrexate for graft-vs.-host disease prophylaxis, results of a multicenter study from Israel suggest.

A randomized clinical trial to determine whether folinic acid rescue 24 hours after a methotrexate dose protects patients against severe oral mucositis was halted for futility after an interim analysis showed no advantage to adding folinic acid, reported Moshe Yeshrun, MD, of Rabin Medical Center at Tel Aviv University, Israel.

“Regarding the primary and secondary endpoints, we observed identical rates and duration of severe oral mucositis, as well as identical rates of oral mucositis of any grade in the folinic acid and placebo groups,” he said at the annual Transplantation and Cellular Therapy Meetings.

There were also no significant differences between the folinic acid and placebo control groups in time to neutrophil and platelet engraftment, rates of febrile neutropenia and bloodstream infections, veno-occlusive disease, need for opiates or total parenteral nutrition (TPN), or time from transplant to discharge, Dr. Yeshrun added at the meeting, held by the American Society for Blood and Marrow Transplantation and the Center for International Blood and Marrow Transplant Research.

Folinic acid therapy did not, however, appear to abrogate or interfere with the effects of methotrexate on prevention of either acute or chronic graft-vs-host disease (GVHD).
 

Severe adverse event

Oral mucositis can be a serious complication of therapy, associated with increased morbidity and mortality; significant pain; difficulty with eating or speaking; difficulty swallowing water, food, and medications; prolonged hospitalizations; and increased costs of care, Dr. Yeshrun noted.

The presence of oral mucositis sometimes leads clinicians to reduce or even skip methotrexate doses, thereby increasing risk for GVHD.

There are limited data from nonrandomized studies indicating that folinic acid (also called leucovorin) may reduce methotrexate-associated toxicities, and both the European Society for Blood and Marrow Transplantation and European LeukemiaNet working group recommend the use of folinic-acid rescue 24 hours following each methotrexate dose, Dr. Yeshrun said.

To see whether folinic acid rescue actually reduces the rate of methotrexate-induced toxicity and affects outcomes for patients who receive methotrexate post transplant for GVHD prophylaxis, Dr. Yeshrun and colleagues in three Israeli medical centers conducted a randomized, placebo-controlled trial.

The eligible study population included patients 18 and older with hematological malignancies in complete remission or minimal residual disease who underwent myeloablative conditioning and allogeneic transplant from HLA-matched or 1-antigen mismatched siblings or unrelated donors.

The patients were stratified by treatment center and intensity of the conditioning regimen, and then randomized on a 1:1 basis to receive folinic acid or placebo beginning 24 hours after each methotrexate dose, with the assigned medication given at a dose of 15 mg three times daily on the first day, and once daily on days 3 and 6.

Patients who received a transplant from an unrelated donor were also given anti-thymocyte globulin at a total dose of 15 mg/kg.

Supportive care included filgrastim (Neupogen) 5 mcg/kg from day 7 until neutrophil engraftment, infection prophylaxis, and ursodeoxycholic acid for prevention of veno-occlusive disease.
 

 

 

Trial stopped

The study was designed to enroll 116 patients, which would provide 80% power to detect a 50% reduction in the rate of severe oral mucositis from an anticipated 50% in the placebo arm. However, a planned interim analysis, conducted after approximately half of the target events had occurred, showed no difference in the rates of severe oral mucositis, and the trial was halted.
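
The stated design target can be reproduced with a standard two-proportion power calculation, as in the Python sketch below. This is a reconstruction under assumed parameters (a two-sided alpha of 0.05, with 50% severe mucositis on placebo reduced to 25% with folinic acid), not the investigators’ own code.

import math
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Sample size to detect a drop from 50% to 25% in severe oral mucositis
# with 80% power at an assumed two-sided alpha of 0.05.
h = proportion_effectsize(0.50, 0.25)  # Cohen's h for the two proportions
n_per_arm = NormalIndPower().solve_power(effect_size=h, alpha=0.05, power=0.80,
                                         ratio=1.0, alternative='two-sided')
print(math.ceil(n_per_arm))  # 58 per arm -> 116 patients in total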

A total of 28 patients in the folinic acid group and 24 in the placebo group were available for the analysis.

The rate of grade 3 or 4 mucositis, the primary endpoint, was 46.6% in the folinic acid group, and 45.8% in the placebo group.

None of the secondary outcomes differed significantly between the folinic acid and placebo groups, respectively: the median duration of severe oral mucositis was 4 days in each group; the median time to engraftment was 12 days for neutrophils and 13 days for platelets in each group; rates of febrile neutropenia were 57.1% vs 58.3%; bloodstream infections, 10.7% vs 16.6%; veno-occlusive disease, 7% vs 12.5%; need for TPN, 14.2% vs 25%; need for opiates, 78.5% vs 66.6%; and median time to discharge was 18 vs 19 days.

“These unequivocal interim results led to our decision to discontinue the study,” Dr. Yeshrun said.

Arnon Nagler, MD, MSc, from Sheba Medical Center Tel HaShomer at Tel Aviv University in Israel, who was not involved in the study, said in an interview, “I think this is a very practical and important study, because we need evidence-based data such as this.”

Dr. Nagler, who comoderated the session where the data were presented, noted that the study was limited by the small number of patients and the lack of a subgroup analysis, but emphasized that the findings were important nonetheless.

His comoderator, Maria Gilleece, MD, director of the Yorkshire (England) Blood and Marrow Transplant Program, noted in an interview that folinic acid is frequently used in the United Kingdom for patients receiving high-dose methotrexate, “and certainly, this study suggests that may be inappropriate.”

Rabin Medical Center sponsored the trial. Dr. Yeshrun, Dr. Nagler, and Dr. Gilleece reported no relevant conflicts of interest.

SOURCE: Yeshrun M. et al. TCT 2020. Abstract 61.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

– Folinic acid does not prevent oral mucositis in patients who are receiving a calcineurin inhibitor and methotrexate for graft-vs.-host disease prophylaxis, results of a multicenter study from Israel suggest.

Dr. Moshe Yeshrun

A randomized clinical trial to determine whether folinic acid rescue 24 hours after a methotrexate dose protects patients against severe oral mucositis was halted for futility after an interim analysis showed no advantage to adding folinic acid, reported Moshe Yeshrun, MD, of Rabin Medical Center at Tel Aviv University, Israel.

“Regarding the primary and secondary endpoints, we observed identical rates and duration of severe oral mucositis, as well as identical rates of oral mucositis of any grade in the folinic acid and placebo groups,” he said at the annual Transplantation and Cellular Therapy Meetings.

There were also no significant differences between the folinic acid and placebo control groups in time to neutrophil and platelet engraftment, rates of febrile neutropenia and bloodstream infections, veno-occlusive disease, need for opiates or total parenteral nutrition (TPN), or time from transplant to discharge, Dr. Yeshrun added at the meeting, held by the American Society for Blood and Marrow Transplantation and the Center for International Blood and Marrow Transplant Research.

Folinic acid therapy did not, however, appear to abrogate or interfere with the effects of methotrexate on prevention of either acute or chronic graft-vs-host disease (GVHD).
 

Severe adverse event

Oral mucositis can be a serious complication of therapy, associated with increased morbidity and mortality; significant pain; difficulty with eating or speaking; difficulty swallowing water, food, and medications; prolonged hospitalizations; and increased costs of care, Dr. Yeshrun noted.

The presence of oral mucositis sometimes leads clinicians to reduce or even skip methotrexate doses, thereby increasing risk for GVHD.

There are limited data from nonrandomized studies indicating that folinic acid (also called leucovorin) may reduce methotrexate-associated toxicities, and both the European Society for Blood and Marrow Transplantation and European LeukemiaNet working group recommend the use of folinic-acid rescue 24 hours following each methotrexate dose, Dr. Yeshrun said.

To see whether folinic acid rescue actually reduces the rate of methotrexate-induced toxicity and affects outcomes for patients who receive methotrexate post transplant for GVHD prophylaxis, Dr. Yeshrun and colleagues in three Israeli medical centers conducted a randomized, placebo-controlled trial.

The eligible study population included patients 18 and older with hematological malignancies in complete remission or minimal residual disease who underwent myeloablative conditioning and allogeneic transplant from HLA-matched or 1-antigen mismatched siblings or unrelated donors.

The patients were stratified by treatment center and intensity of the conditioning regimen, and then randomized on a 1:1 basis to receive folinic acid or placebo beginning 24 hours after each methotrexate dose, with the assigned medication given at a dose of 15 mg three times daily on the first day, and once daily on days 3 and 6.

Patients who received a transplant from an unrelated donor were also given anti-thymocyte globulin at a total dose of 15 mg/kg.

Supportive care included filgrastim (Neupogen) 5 mcg/kg from day 7 until neutrophil engraftment, infection prophylaxis, and ursodeoxycholic acid for prevention of veno-occlusive disease.
 

 

 

Trial stopped

Although the study was designed to enroll 116 patients to have a power of 80% to detect a 50% reduction in the rate of severe oral mucositis from an anticipated 50% in the placebo arm, a planned interim analysis conducted after approximately half of the target events occurred showed no difference in the rates of severe oral mucositis, and the trial was halted.

A total of 28 patients in the folinic acid group and 24 in the placebo group were available for the analysis.

The rate of grade 3 or 4 mucositis, the primary endpoint, was 46.6% in the folinic acid group, and 45.8% in the placebo group.

Respectively, the median duration of severe oral mucositis was 4 days in each group, days to neutrophil engraftment were a median of 12 and to platelet engraftment were a median of 13 days in each group, rates of febrile neutropenia were 57.1% and 58.3%, rates of bloodstream infections were 10.7% and 16.6%, rates of veno-occlusive disease were 7% and 12.5%, need for TPN occurred in 14.2% vs. 25%, need for opiates occurred in 78.5% vs. 66.6%, and median time to discharge was 18 and 19 days. As noted, none of the differences were statistically significant.

“These unequivocal interim results led to our decision to discontinue the study,” Dr. Yeshrun said.

Arnon Nagler, MD, MSc, from Sheba Medical Center Tel HaShomer at Tel Aviv University in Israel, who was not involved in the study, said in an interview “I think this is a very practical and important study, because we need evidence-based data such as this.”

Arnon Nagler, MD, MSc, from Sheba Medical Center Tel HaShomer at Tel Aviv University in Israel
Ted Bosworth/MDedge News
Dr. Arnon Nagler


Dr. Nagler, who comoderated the session where the data were presented, noted that the study was limited by the small number of patients and the lack of a subgroup analysis, but emphasized that the findings were important nonetheless.

Maria Gilleece, MD, director of the Yorkshire (England) Blood and Marrow Transplant Program
Ted Bosworth/MDedge News
Dr. Maria Gilleece

His comoderator, Maria Gilleece, MD, director of the Yorkshire (England) Blood and Marrow Transplant Program, noted in an interview that folinic acid is frequently used in the United Kingdom for patients receiving high-dose methotrexate, ”and certainly, this study suggests that may be inappropriate.”

Rabin Medical Center sponsored the trial. Dr. Yeshrun, Dr. Nagler, and Dr. Gilleece reported no relevant conflicts of interest.

SOURCE: Yeshrun M. et al. TCT 2020. Abstract 61.

– Folinic acid does not prevent oral mucositis in patients who are receiving a calcineurin inhibitor and methotrexate for graft-vs.-host disease prophylaxis, results of a multicenter study from Israel suggest.

Dr. Moshe Yeshrun

A randomized clinical trial to determine whether folinic acid rescue 24 hours after a methotrexate dose protects patients against severe oral mucositis was halted for futility after an interim analysis showed no advantage to adding folinic acid, reported Moshe Yeshrun, MD, of Rabin Medical Center at Tel Aviv University, Israel.

“Regarding the primary and secondary endpoints, we observed identical rates and duration of severe oral mucositis, as well as identical rates of oral mucositis of any grade in the folinic acid and placebo groups,” he said at the annual Transplantation and Cellular Therapy Meetings.

There were also no significant differences between the folinic acid and placebo control groups in time to neutrophil and platelet engraftment, rates of febrile neutropenia and bloodstream infections, veno-occlusive disease, need for opiates or total parenteral nutrition (TPN), or time from transplant to discharge, Dr. Yeshrun added at the meeting, held by the American Society for Blood and Marrow Transplantation and the Center for International Blood and Marrow Transplant Research.

Folinic acid therapy did not, however, appear to abrogate or interfere with the effects of methotrexate on prevention of either acute or chronic graft-vs-host disease (GVHD).
 

Severe adverse event

Oral mucositis can be a serious complication of therapy, associated with increased morbidity and mortality; significant pain; difficulty with eating or speaking; difficulty swallowing water, food, and medications; prolonged hospitalizations; and increased costs of care, Dr. Yeshrun noted.

The presence of oral mucositis sometimes leads clinicians to reduce or even skip methotrexate doses, thereby increasing risk for GVHD.

There are limited data from nonrandomized studies indicating that folinic acid (also called leucovorin) may reduce methotrexate-associated toxicities, and both the European Society for Blood and Marrow Transplantation and European LeukemiaNet working group recommend the use of folinic-acid rescue 24 hours following each methotrexate dose, Dr. Yeshrun said.

To see whether folinic acid rescue actually reduces the rate of methotrexate-induced toxicity and affects outcomes for patients who receive methotrexate post transplant for GVHD prophylaxis, Dr. Yeshrun and colleagues in three Israeli medical centers conducted a randomized, placebo-controlled trial.

The eligible study population included patients 18 and older with hematological malignancies in complete remission or minimal residual disease who underwent myeloablative conditioning and allogeneic transplant from HLA-matched or 1-antigen mismatched siblings or unrelated donors.

The patients were stratified by treatment center and intensity of the conditioning regimen, and then randomized on a 1:1 basis to receive folinic acid or placebo beginning 24 hours after each methotrexate dose, with the assigned medication given at a dose of 15 mg three times daily on the first day, and once daily on days 3 and 6.

Patients who received a transplant from an unrelated donor were also given anti-thymocyte globulin at a total dose of 15 mg/kg.

Supportive care included filgrastim (Neupogen) 5 mcg/kg from day 7 until neutrophil engraftment, infection prophylaxis, and ursodeoxycholic acid for prevention of veno-occlusive disease.
 

 

 

Trial stopped

Although the study was designed to enroll 116 patients to have a power of 80% to detect a 50% reduction in the rate of severe oral mucositis from an anticipated 50% in the placebo arm, a planned interim analysis conducted after approximately half of the target events occurred showed no difference in the rates of severe oral mucositis, and the trial was halted.

A total of 28 patients in the folinic acid group and 24 in the placebo group were available for the analysis.

The rate of grade 3 or 4 mucositis, the primary endpoint, was 46.6% in the folinic acid group, and 45.8% in the placebo group.

Respectively, the median duration of severe oral mucositis was 4 days in each group, days to neutrophil engraftment were a median of 12 and to platelet engraftment were a median of 13 days in each group, rates of febrile neutropenia were 57.1% and 58.3%, rates of bloodstream infections were 10.7% and 16.6%, rates of veno-occlusive disease were 7% and 12.5%, need for TPN occurred in 14.2% vs. 25%, need for opiates occurred in 78.5% vs. 66.6%, and median time to discharge was 18 and 19 days. As noted, none of the differences were statistically significant.

“These unequivocal interim results led to our decision to discontinue the study,” Dr. Yeshrun said.

Arnon Nagler, MD, MSc, from Sheba Medical Center Tel HaShomer at Tel Aviv University in Israel, who was not involved in the study, said in an interview “I think this is a very practical and important study, because we need evidence-based data such as this.”

Arnon Nagler, MD, MSc, from Sheba Medical Center Tel HaShomer at Tel Aviv University in Israel
Ted Bosworth/MDedge News
Dr. Arnon Nagler


Dr. Nagler, who comoderated the session where the data were presented, noted that the study was limited by the small number of patients and the lack of a subgroup analysis, but emphasized that the findings were important nonetheless.

Maria Gilleece, MD, director of the Yorkshire (England) Blood and Marrow Transplant Program
Ted Bosworth/MDedge News
Dr. Maria Gilleece

His comoderator, Maria Gilleece, MD, director of the Yorkshire (England) Blood and Marrow Transplant Program, noted in an interview that folinic acid is frequently used in the United Kingdom for patients receiving high-dose methotrexate, ”and certainly, this study suggests that may be inappropriate.”

Rabin Medical Center sponsored the trial. Dr. Yeshrun, Dr. Nagler, and Dr. Gilleece reported no relevant conflicts of interest.

SOURCE: Yeshrun M. et al. TCT 2020. Abstract 61.

REPORTING FROM TCT 2020

High failure rate for magnetic rod system in scoliosis surgery

Article Type
Changed
Mon, 03/22/2021 - 14:08

February 14, 2020 – A surgical magnetic rod system used to treat scoliosis in children has a high failure rate, necessitating multiple surgeries and causing significant morbidity in these young patients, new research suggests.

The Magnetic Expansion Control (MAGEC) rod system (NuVasive), which was developed to replace traditional rods because of their high failure rate, has itself turned out to have a less than stellar success rate.

Researchers found that the most common complications associated with the MAGEC rod system are failure of the distraction mechanism, which stretches soft tissues to make space for bone growth, and rod fracture.

“This rod system fails very often compared to any standard spinal implant,” study investigator Aakash Agarwal, PhD, director of research at Spinal Balance Inc and adjunct professor of bioengineering at the University of Toledo, in Ohio, told Medscape Medical News.

The relatively high frequency of such adverse events is of “great concern,” said Agarwal, who recommended that neurosurgeons use more gradual distractions to minimize stress on spinal rods.

The study was published online October 19 in Spine Surgery and Related Research.

A Mainstay of Treatment

Scoliosis refers to the lateral curving of the spine, usually in the thoracic or thoracolumbar region. The degree of scoliosis is typically determined with x-rays.

Early-onset scoliosis (EOS) occurs in children younger than 5 years and more often affects boys than girls. Although only about 1 or 2 in 10,000 children develop the condition, it can be severe, sometimes interfering with normal organ development.

Surgical intervention is required when bracing and casting fail to stop progression of the scoliotic curve. The aim of the surgery is to allow growth of the spine and ribcage and to correct and limit the extent of spinal deformity, said Agarwal.

Only children with adequate potential for spine and ribcage growth are candidates for the surgery. The age of eligible patients can vary but is typically up to about age 10 for girls and age 12 or 13 for boys, said Agarwal.

Although other surgical techniques are sometimes used, the distraction-based mechanism has been the mainstay of surgical intervention in EOS for more than a decade, he noted.

“The concept uses distraction, or stretching of the spine, to create additional soft-tissue space between the vertebrae for the bone to gradually grow into,” he said.

This has traditionally been achieved by implanting two rods, one on each side of the spine, anchored proximally and distally. The rods’ attachment points must be repeatedly loosened and pushed apart to keep pace with growth.

Traditional growth rods are subject to risk of fracture and autofusion, which is stiffening of soft tissues in vertebral segments caused by trauma to the spine with excessive distraction.

Trauma Nightmare

The major problem with traditional growth rods is the need for invasive surgery every 6 to 12 months, said Agarwal.

“The trauma of repeated surgery is a nightmare for both the patients and the surgeons, from increased complications with each subsequent surgery to infections and unplanned surgeries,” he said.

This limitation led to the development of the titanium-based MAGEC rods. This noninvasive magnetic distraction system allows the rod to be expanded from outside the body using a remote control.

This approach results in a “drastic reduction” in the number of consecutive surgeries and has the potential to reduce growth rod fracture and autofusion, said Agarwal.

The system also allows for more gradual distractions of the growth rods.

“With the MAGEC system, you can stretch the spine a little bit every week without invasive surgery,” said Agarwal.

His own research showed that smaller and more frequent distractions result in much less stress in the rods. He suggests, for example, a distraction of 1.5-2.0 mm every month, rather than 4.5-6.0 mm every 3 months.
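
The arithmetic behind that recommendation is straightforward: both schedules deliver the same cumulative lengthening over a year, but the monthly schedule cuts the per-event increment – and hence the peak demand on the rod – to a third. A toy comparison using the midpoints of the quoted ranges:

    # Two distraction schedules over 12 months (midpoints of the quoted ranges).
    monthly = [1.75] * 12    # 1.5-2.0 mm every month
    quarterly = [5.25] * 4   # 4.5-6.0 mm every 3 months

    print(sum(monthly), sum(quarterly))  # 21.0 vs 21.0 mm: same yearly lengthening
    print(max(monthly), max(quarterly))  # 1.75 vs 5.25 mm: a third of the per-event stretch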

In the United States, the MAGEC system is used for all children undergoing distraction-based corrections. But in developing countries, at least 70% of patients still undergo surgery using traditional growth rods, owing to the very high initial cost associated with MAGEC rods, said Agarwal.

He believes traditional rods should not be used at all. In regions where MAGEC rods are unaffordable or inaccessible, surgeons should use alternative surgical techniques, he said.

“Given the variety of surgeon options, use of traditional growth rods isn’t justified,” he said.

For this new study, Agarwal and his colleagues searched the Manufacturer and User Facility Device Experience (MAUDE) database to identify relevant adverse events. Maintained by the U.S. Food and Drug Administration, MAUDE houses reports of adverse events involving medical devices, including mandatory reports from manufacturers and voluntary reports from clinicians and patients.

Of the 163 reports related to the MAGEC system through June of last year, 129 were for failures of the distraction mechanism, 24 were for rod fracture, and 10 were for other medical complications, such as infection and tissue necrosis.
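
Expressed as shares of the total, distraction-mechanism failure accounts for roughly four out of five reports; a quick tabulation of the figures above:

    # Breakdown of the 163 MAGEC-related MAUDE reports cited above.
    reports = {
        "distraction mechanism failure": 129,
        "rod fracture": 24,
        "other (infection, tissue necrosis, etc.)": 10,
    }
    total = sum(reports.values())  # 163
    for mode, n in reports.items():
        print(f"{mode}: {n}/{total} ({n / total:.0%})")  # 79%, 15%, 6%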

Bare Minimum

These reports are “just the bare minimum,” said Agarwal. “For example, tissue necrosis – or metallosis due to wear – is present in almost all cases with MAGEC,” but these cases aren’t reported because of “absence of clinical symptoms.”

Agarwal called these MAGEC-related complications “very worrisome.”

“Every single failure of rod fracture or noninvasive distraction mechanism failure in MAGEC leads to another open surgery. And with each surgery, the risk of other complications, such as infection, goes up very significantly,” he said.

He added that failure of growth rod distraction reduces the overall efficacy of the device.

“Newer studies even question if there is a real quality-of-life difference with use of MAGEC rods over the myriad of other options,” he said.

He stressed the need for better technical and clinical controls to avoid such adverse events – for example, wider use of smaller, more frequent distractions.

The researchers also retrieved MAUDE data on the top five failures associated with standard instrumentation used in spinal fusion. These included pedicle screw breakage post surgery (336 reports), set screw damage during surgery (257), rod breakage post surgery (175), interbody cage breakage during surgery (118), and pedicle screw breakage during surgery (75).

The rates of adverse events involving the MAGEC rods, which are used in relatively rare surgical procedures, “seem high” in comparison, said Agarwal.

Commenting for Medscape Medical News, Lee Tan, MD, assistant professor of neurologic surgery, University of California, San Francisco, praised the authors for conducting an “interesting” study on the complications and mode of failure related to MAGEC rods in scoliosis correction using a large database.

“They identified distraction mechanism failure and pedicle screw breakage as the most common device-related complication and standard instrumentation-related complication, respectively,” said Tan.

“This is very useful information during patient education and preoperative counseling. It also identifies the areas for improvement and innovation on this important topic. I commend the authors for their excellent work,” he said.

The study received no funding. Agarwal has received royalties from and consults for Spinal Balance and is an editorial board member for Clinical Spine Surgery and Spine.

This article first appeared on Medscape.com.

What medical conferences are being canceled by coronavirus?

Article Type
Changed
Tue, 03/17/2020 - 10:40

In a typical year, March marks the start of conference season, made all the more attractive by collegial gatherings and travel to warmer climes. But 2020 has already proven anything but typical as the number of novel coronavirus cases continues to increase around the globe. As a potential pandemic looms, these meetings – full of handshakes and crowded lecture halls – are also nirvana for opportunistic viruses. As are the airports, airplanes, and cabs required to get there.

So, as COVID-19 continues to spread, medical and scientific societies must make some difficult decisions. In Europe, at least a few societies have already suspended their upcoming meetings, while France has temporarily banned all gatherings of more than 5,000 people.

In the United States, however, most medical conferences are moving forward as planned – at least for now. But one conference of 10,000 attendees, the American Physical Society annual meeting, which was scheduled for March 2-6 in Denver, was canceled the day before the meeting started. Although it’s not a medical conference, it speaks to the “rapidly escalating health concerns” that all conference organizers must grapple with.

APS Physics Meetings (@APSMeetings), Feb 29, 2020: “Due to rapidly escalating health concerns relating to the spread of the coronavirus disease (COVID-19), the 2020 APS March Meeting in Denver, CO, has been canceled. Please do not travel to Denver to attend the March Meeting. More information will follow shortly. #apsmarch”

Just one smaller medical meeting, the Ataxia Conference, which was scheduled for March 6-7 in Denver, has been canceled.

Most societies hosting these meetings have put out statements to their attendees saying that they’re monitoring the situation and will adapt as necessary. The United States and Canadian Academy of Pathology, which is holding its annual meeting in Los Angeles this week, sent out an email beforehand asking international travelers to consider staying home. The Healthcare Information and Management Systems Society (HIMSS) Global Health Conference, which is slated to have about 50,000 attendees from around the world, has declared itself a “handshake-free” conference but otherwise intends to move ahead as planned.

All of these conferences will be pushing forward without at least one prominent group of attendees. New York University’s Langone Health has taken the decision out of its employees’ hands: The health system just declared a 60-day (minimum) ban preventing employees from attending any meetings or conferences and from all domestic and international work-related travel.

Here’s what some of the societies have said to attendees about their intent to proceed or modify their plans:

  • Conference on Retroviruses and Opportunistic Infections (CROI), Boston, 3/8/20 - 3/11/20: Monitoring the situation and seeking input from local, state, and federal infectious-disease and public-health experts. Final decision expected by the evening of March 3.
  • American Academy of Allergy, Asthma & Immunology (AAAAI), Philadelphia, 3/13/20 - 3/16/20: Monitoring developments but no plans to cancel or postpone at this time.
  • American Academy of Orthopedic Surgeons (AAOS), Orlando, 3/24/20 - 3/28/20: Proceeding as planned.
  • American Academy of Dermatology (AAD), Denver, 3/20/20 - 3/24/20: The AAD’s 2020 Annual Meeting is scheduled to take place as planned. The organization will increase the number of hand-sanitizing stations throughout the convention center, and it is adding a nursing station specifically designated for anyone with flu-like symptoms.
  • American College of Cardiology (ACC), Chicago, 3/28/20 - 3/30/20: The organization is working with attendees, faculty, exhibitors, and other stakeholders in affected countries to ensure access to research and education from the meeting, but is otherwise proceeding as planned.
  • Endocrine Society (ENDO), San Francisco, 3/28/20 - 3/31/20: ENDO 2020 will take place as scheduled, but this is an evolving situation worldwide. The society will continue to monitor and provide updates on its FAQ page.
  • American College of Physicians Internal Medicine (ACP IM), Los Angeles, 4/23/20 - 4/25/20: ACP leadership is closely monitoring the COVID-19 situation and is actively working with the Centers for Disease Control and Prevention (CDC) to ensure authoritative communication of safety updates and recommendations as the situation evolves.
  • American Association for Cancer Research (AACR), San Diego, 4/24/20 - 4/29/20: At this time, there is no plan to cancel or postpone any scheduled AACR meetings. The organization is tracking all travel restrictions as well as information and guidance from the CDC and World Health Organization.
  • American Academy of Neurology (AAN), Toronto, 4/25/20 - 5/1/20: The group is continuing to closely monitor the situation in Toronto and will provide updates as the situation warrants.

This article originally appeared on Medscape.com.

rTMS for depression continues to evolve

Article Type
Changed
Tue, 03/03/2020 - 13:59

Repetitive transcranial magnetic stimulation methods for treatment-resistant depression continue to be refined.

“Original studies have relatively low response rates, but we’re seeing better response rates as we figure out the localization, the parameters, the wave form, and how frequently you can give it,” Alan F. Schatzberg, MD, said at an annual psychopharmacology update held by the Nevada Psychiatric Association.

Repetitive transcranial magnetic stimulation (rTMS) involves the application of a magnetic field to a particular area of the brain, typically the dorsal lateral aspect of the prefrontal cortex. “It’s a weaker stimulant than electroconvulsive therapy, but it’s more focused and a lot safer,” said Dr. Schatzberg, professor of psychiatry and behavioral sciences at Stanford (Calif.) University. “It does not require anesthesia. In fact, it does seem to have some antidepressant effects.”

The original trial that applied this technology was conducted in 301 medication-free patients with major depression who had not benefited from prior treatment (Biol Psychiatry. 2007;62[11]:1208-16). Of the 301 patients, 155 received active rTMS, while 146 received sham rTMS. Treatment sessions were conducted five times per week for 4-6 weeks. The primary outcome was the symptom score change as assessed at week 4 with the Montgomery-Åsberg Depression Rating Scale (MADRS). Secondary outcomes included changes on the 17- and 24-item Hamilton Depression Rating Scale (HAMD), and response and remission rates with the MADRS and HAMD.

Response rates were significantly higher with active rTMS on all three scales at weeks 4 and 6. Remission rates were approximately twofold higher with active rTMS at week 6 and significant on the MADRS and HAMD24 scales (but not the HAMD17 scale). “The response rate for patients receiving active treatment was about 20%, and the remission at 6 weeks was about 18%,” said Dr. Schatzberg, who was an adviser to the study. “It was about twofold higher than in the sham group. It’s not dramatically effective, but it certainly is better than the sham control.” The MADRS score dropped about 6 points in the rTMS group, compared with about 2 points in the sham group, while the HAMD24 score dropped about 7 points in the rTMS group, compared with about 3.5 points in the sham group.

In a separate, multisite, sham-controlled trial supported by the National Institutes of Health, researchers enrolled 199 antidepressant drug-free patients to determine whether daily left prefrontal rTMS safely and effectively treats major depressive disorder (Arch Gen Psychiatry. 2010;67[5]:507-16). Over the course of 3 weeks, the researchers delivered rTMS to the left prefrontal cortex for 37.5 minutes (3,000 pulses per session) using a figure-eight solid-core coil. Sham rTMS used a similar coil with a metal insert blocking the magnetic field and scalp electrodes that delivered matched somatosensory sensations. The retention rate was 88%, and no device-related serious adverse events were reported. A significantly greater proportion of patients treated with rTMS achieved remission, compared with those in the sham group (15% vs. 5%, respectively; P = .02). The odds of attaining remission were 4.2 times greater with active rTMS than with the sham treatment.
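
As a sketch of how that figure relates to the raw rates: the unadjusted odds ratio implied by the rounded 15% and 5% remission rates is about 3.4, so the reported 4.2 presumably reflects the trial’s covariate-adjusted analysis.

    # Unadjusted odds ratio implied by the rounded remission rates above.
    p_active, p_sham = 0.15, 0.05
    odds_active = p_active / (1 - p_active)   # ~0.176
    odds_sham = p_sham / (1 - p_sham)         # ~0.053
    print(round(odds_active / odds_sham, 1))  # ~3.4 unadjusted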

“These are not huge remission and response rates,” Dr. Schatzberg said of the results from this and other studies. “What can we do to start increasing efficacy? One thing you can do is design a better coil. You can alter the site of application, and you can change the pulse frequency and the pulse number. You can also change the brain wave focus. Theta seems to be mostly associated with hippocampal function around memory. Because of that, a number of groups started giving theta waves.”

In one such study, researchers used accelerated, high-dose intermittent theta burst stimulation (iTBS) to treat patients with highly treatment-resistant depression (Brain. 2018;141[3]:e18). The treatment lasted 5 days and consisted of 10 sessions per day, with 50 minutes between each session. “It’s a much more intensive system that delivers about 90,000 pulses,” said Dr. Schatzberg, who directs the Stanford Mood Disorders Center. Most patients remitted, but the durability of therapeutic response was weak, and all patients relapsed within 2 weeks post treatment.
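
The quoted pulse total squares with the widely used 1,800-pulse iTBS session – an assumption here, since the report gives only the roughly 90,000-pulse figure:

    # Pulse arithmetic for the accelerated iTBS protocol: 10 sessions/day for 5 days.
    sessions_per_day, days = 10, 5
    pulses_per_session = 1800  # assumed standard iTBS session; not stated in the report
    print(sessions_per_day * days * pulses_per_session)  # 90,000 pulses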

“There’s more work to be done, but rTMS is really a good technology,” he concluded. “I think we will achieve much higher rates of success with this treatment once we push the envelope a little bit.”

Dr. Schatzberg disclosed that he has served as a consultant to Alkermes, Avanir, Bracket, Compass, Delpor, Epiodyne, Janssen, Jazz, Lundbeck, McKinsey, Merck, Myriad Genetics, Owl, Neuronetics, Pfizer, Sage, and Sunovion. He has received research funding from Janssen and also holds an ownership interest in Corcept, Dermira, Delpor, Epiodyne, Incyte Genetics, Madrigal, Merck, Owl Analytics, Seattle Genetics, Titan, and Xhale.

REPORTING FROM NPA 2020