Bariatric surgery has mostly positive impact on knee arthroplasty

Bariatric surgery prior to arthroplasty reduces the likelihood of multiple complications but some risks increase, a large study has found.


The study, led by Yicun Wang, PhD, of Nanjing (China) University, was published in the Journal of Arthroplasty. “Generally speaking, bariatric surgery decreases some postoperative complications, decreases length of stay, and lowers mortality,” the investigators wrote, “[but] anemia and blood transfusion seem to be more common in patients with prior bariatric surgery.”

They analyzed the effect of bariatric surgery on subsequent arthroplasty in morbidly obese patients in the United States using Nationwide Inpatient Sample 2006-2014 data on total hip arthroplasty (THA) and total knee arthroplasty (TKA). The researchers defined morbidly obese patients as those with a body mass index greater than 40 kg/m2.

Among patients who underwent TKA, the researchers compared a group of 9,803 morbidly obese patients with the same number of patients who had undergone bariatric surgery. The two groups were matched by age, sex, income, primary insurance payer, and race.

There were notable differences between the bariatric surgery and morbidly obese groups: Pulmonary embolism was more common in the morbidly obese group (odds ratio, 0.22; 95% confidence interval, 0.05-1.03; P = .0346), while blood transfusion was more common in the bariatric surgery group (OR, 1.76; 95% CI, 1.52-2.03; P less than .0001).

For THA, the researchers used the same approach to analyze 2,540 matched pairs of patients. In the bariatric surgery vs. morbidly obese comparison, pulmonary embolism was again more common in the morbidly obese group (OR, 0.34; 95% CI, 0.20-0.57; P less than .0001), as were respiratory complications (OR, 0.45; 95% CI, 0.26-0.78; P = .0032) and death (OR, 0.07; 95% CI, 0.01-0.50; P = .0005). But the bariatric surgery group had higher rates of blood transfusion (OR, 1.87; 95% CI, 1.71-2.04; P less than .0001) and anemia (OR, 1.16; 95% CI, 1.09-1.24; P less than .0001).

Going forward, the researchers wrote, “future studies on these patients should attempt to evaluate the impact of bariatric surgery on the long-term outcomes of arthroplasty.”

The study was supported by various funders, including the National Natural Science Foundation of China, the Natural Science Foundation of Guangdong Province, and the Project of Administration of Traditional Chinese Medicine of Guangdong Province, among others. The authors reported no disclosures.

SOURCE: Wang Y et al. J Arthroplasty. 2019;S0883-5403(19)30667-9.


Cancer patients increasingly being discharged to subacute rehabilitation facilities

As immunotherapy has become more widely available, cancer patients have been referred to subacute rehabilitation (SAR) facilities at an increasing rate, to become well enough to tolerate treatment, investigators report.

However, “many patients never receive additional therapy and are readmitted or deceased within a short time, marking this population as one with strong needs for concurrent palliative and usual oncology care,” wrote Jonathan C. Yeh, MD, of Johns Hopkins University, Baltimore, and coauthors. Their report is in the Journal of Oncology Practice.

To determine if the advent of immunotherapy contributed to an increase in referrals to SAR facilities – and which patients survived long enough to benefit – the researchers reviewed the electronic charts of 358 patients who were referred to such facilities. The number of referrals increased gradually over the 8-year period of the study.

Of these 358 patients, 174 (49%) were seen again in the oncology clinic before readmission or death, and only 117 (33%) received any additional cancer-directed treatment. The patients most likely to receive additional treatment were those with leukemia, lymphoma, or localized solid disease.

Of the 413 total discharges, 116 (28%) resulted in hospital readmission within 30 days of discharge. Seventy-four (21%) of the patients were deceased within 30 days; 212 (59%) were deceased within 180 days. Only 123 (30%) of the initial admissions involved a palliative care specialist; this involvement was associated with increases in documented goals of care, completion of advance directives, and election of do-not-resuscitate status.

The authors noted their study’s limitations, including all of the data coming from a single tertiary cancer center. In addition, the data are observational, which made the researchers “unable to control for key patient characteristics such as performance status, patient goals, insurance coverage of trials, and the like.”

Study coauthor Dr. Smith reported being employed by UpToDate and receiving royalties as coeditor of the Oxford Textbook of Cancer Communication. No other conflicts of interest were reported.

SOURCE: Yeh JC et al. J Oncol Pract. 2019 Aug 29. doi: 10.1200/JOP.19.00044.


High-dose teriparatide tops standard dose in boosting BMD in postmenopausal women

A high dose of teriparatide in combination with denosumab increases bone mineral density (BMD) in postmenopausal women with osteoporosis to a greater extent than a lower-dose regimen.

The latest findings suggest that high-dose teriparatide “stimulates even greater separation between bone resorption and formation than that of standard-dose teriparatide,” wrote Joy N. Tsai, MD, of Harvard Medical School, Boston, and her coauthors in Lancet Diabetes & Endocrinology.

Previously, findings from the Denosumab and Teriparatide Administration (DATA) study showed that a combination of teriparatide and denosumab increased both BMD and estimated bone strength more than either drug alone. To determine if a higher dose of teriparatide plus denosumab would result in larger increases in BMD, investigators in the DATA-HD study randomly assigned 76 postmenopausal women with osteoporosis to receive either 40 μg of teriparatide daily (higher dose; n = 37) or 20 μg of teriparatide daily (standard dose; n = 39). At 3 months, patients in both groups were also started on 60 mg of denosumab every 6 months. Of the initial participants, 69 completed at least one postbaseline visit and were included in the analysis.

At the 15-month follow-up, areal BMD (aBMD) had increased in both groups in all measured sites – lumbar spine, femoral neck, total hip, and distal radius. Patients in the 40-μg group had a significantly higher increase in mean lumbar spine aBMD (17.5%), compared with those in the 20-μg or standard-dose group (9.5%; 95% confidence interval, 5.5-10.6; P less than .0001).

There was also a greater increase in the 40-μg group in mean femoral neck aBMD (6.8% vs. 4.3%; 95% CI, 0.5-4.5; P = .04) and in mean total hip aBMD (6.1% vs. 3.9%; 95% CI, 0.6-3.8; P less than .0001).

In all, 29 participants in the 40-μg group (78%) and 30 participants in the 20-μg group (77%) had adverse events, but with the exceptions of headache and rash, all serious adverse events were considered unrelated to treatment.

The authors noted the limitations of their study, including that it was conducted at a single site with a predominantly white population. In addition, the authors acknowledged that the small sample size did not allow for “direct assessment of fracture benefit [or] for rigorous evaluation of the tolerability and safety of this treatment.”

In an accompanying editorial, Sundeep Khosla, MD, of the Mayo Clinic, Rochester, Minn., wrote that the benefits of personalizing treatment for osteoporosis patients at high risk of fracture seemed to be coming into focus (Lancet Diabetes Endocrinol. 2019 Aug 22. doi: 10.1016/S2213-8587(19)30266-9).

Although the DATA and DATA-HD studies of teriparatide and denosumab reported by Dr. Tsai and colleagues have their limitations – including a small sample size for DATA-HD – they indicate the “possibility of refining treatment for patients with osteoporosis at high risk of fracture and personalizing treatment for these patients beyond the one-size-fits-all approach currently used,” Dr. Khosla wrote. Rather than offer bisphosphonates at standardized doses, patients at high risk could now be considered for the newly recommended high-dose teriparatide and denosumab combination, he said.

Dr. Khosla also noted that price remains an issue, given the estimated cost of $76,000 for 15 months of the proposed combination. However, the benefits for bone mineral density, at least, are clear, he added, and that might prove sufficient for high-risk patients in need of an alternative therapy.

The study was supported by the Dart Family Foundation, the National Institutes of Health, and the National Institute of Arthritis and Musculoskeletal and Skin Diseases. Eli Lilly and Amgen supplied the drugs. The authors reported numerous conflicts of interest, including receiving grants, reimbursements, and personal fees from various pharmaceutical companies, committees, and research institutes. Dr. Khosla reported no conflicts of interest.

SOURCE: Tsai JN et al. Lancet Diabetes Endocrinol. 2019 Aug 22. doi: 10.1016/S2213-8587(19)30255-4.


Short Takes

Pharmacist-led intervention reduced inappropriate medication prescriptions

An outpatient pharmacist-led intervention, in which prescribing physicians were notified to discontinue inappropriate Beers Criteria medications, resulted in greater discontinuation of inappropriate medications in older adults at 6 months compared with the control group (43% vs. 12% discontinuation).

Citation: Martin P et al. Effect of a pharmacist-led educational intervention on inappropriate medication prescriptions in older adults: The D-PRESCRIBE randomized clinical trial. JAMA. 2018;320(18):1889-98.

 

Omadacycline noninferior for community-acquired pneumonia and acute bacterial soft-tissue skin infections

Randomized, double-blind, double-dummy trials showed omadacycline is a noninferior alternative to moxifloxacin for the treatment of community-acquired pneumonia and to linezolid for acute bacterial soft-tissue skin infections.

Citation: Nuzyra (omadacycline) [package insert]. Boston, MA: Paratek Pharmaceuticals. 2018.

 

Lack of evidence to support low-salt diet in adult heart failure patients

A systematic review of multiple databases found limited high-quality evidence to support current guidelines recommending a low-salt diet for heart failure patients.

Citation: Mahtani KR et al. Reduced salt intake for heart failure: A systematic review. JAMA Intern Med. 2018;178(12):1693-700.


Magnesium for rate control in rapid atrial fibrillation

A randomized controlled trial demonstrated that intravenous magnesium sulfate in combination with atrioventricular (AV) nodal blocking agents resulted in better rate control for atrial fibrillation with rapid ventricular response than did placebo given in combination with AV nodal blocking agents.

Citation: Bouida W et al. Low-dose magnesium sulfate versus high-dose in the early management of rapid atrial fibrillation: Randomized controlled double-blind study (LOMAGHI Study). Acad Emerg Med. 2019 Feb;26(2):183-91.


Low versus intermediate tidal volume strategy on ventilator-free days in ICU patients without ARDS

A randomized clinical trial of low tidal volume versus intermediate tidal volume strategies in invasively ventilated patients without acute respiratory distress syndrome (ARDS) demonstrated no difference in number of ventilator-free days, ICU length of stay, hospital length of stay, incidence of ventilator-associated adverse events (ARDS, pneumonia, severe atelectasis, pneumothorax), or 28-day mortality.

Citation: Writing Group for the PReVENT Investigators, Simonis FD, Serpa Neto A. Effect of a low vs intermediate tidal volume strategy on ventilator-free days in intensive care unit patients without ARDS: A randomized clinical trial. JAMA. 2018;320(18):1872-80.

Publications
Topics
Sections

Pharmacist-led intervention reduced inappropriate medication prescriptions

An outpatient pharmacy-led intervention of notifying prescribing physicians to discontinue inappropriate Beers Criteria medications resulted in a greater discontinuation of inappropriate medications for older adults at 6 months, compared with the control group (43% vs. 12% discontinuation).

Citation: Martin P et al. Effect of a pharmacist-led educational intervention on inappropriate medication prescriptions in older adults: The D-PRESCRIBE randomized clinical trial. JAMA. 2018;320(18):1889-98.

 

Omadacycline noninferior for community-acquired pneumonia and acute bacterial soft tissue skin infections

Randomized, double-blind, double-dummy trials showed omadacycline is a noninferior alternative to moxifloxacin for the treatment of community-acquired pneumonia and to linezolid for acute bacterial soft-tissue skin infections.

Citation: Nuzyra (omadacycline) [package insert]. Boston, MA: Paratek Pharmaceuticals. 2018.

Lack of evidence to support low-salt diet in adult heart failure patients

Systematic review of multiple databases demonstrated there is limited high-quality evidence to support current guidelines that recommend a low-salt diet to heart failure patients.

Citation: Mahtani KR et al. Reduced salt intake for heart failure: A systematic review. JAMA Intern Med. 2018;178(12):1693-700.



In MS, iron-ringed lesions may add to imaging toolkit

Article Type
Changed
Wed, 10/30/2019 - 14:45

The presence of an iron ring around a brain lesion suspicious for multiple sclerosis (MS) may provide a promising adjunct to evolving magnetic resonance imaging techniques to track disease activity and progression, according to research presented at the annual congress of the European Committee for Treatment and Research in Multiple Sclerosis.

Dr. Margareta Clarke, a research fellow at the Vall d'Hebron Research Institute in Barcelona (Kari Oakes/MDedge News)

Using a conventional 3 Tesla magnetic resonance imaging (MRI) scanner, Margareta Clarke, PhD, and colleagues were able to identify iron rings (also called iron rims) and the central vein sign, and saw that both lesion characteristics were more common in MS patients than in those without MS.

“Routine two-dimensional 3 Tesla MRI with susceptibility weighting can be used to successfully visualize central veins and iron rims,” said Dr. Clarke, speaking at an imaging-focused young investigators’ session at the meeting. “Also, the central vein sign findings from previous 3T studies are confirmed.”

Dr. Clarke, a research fellow at the Vall d’Hebron Research Institute in Barcelona, explained that iron is stored within oligodendrocytes and myelin in the brain. In up to 56% of MS lesions, a rim of iron is visible on susceptibility-weighted MRI, she said, adding that the iron rings around the lesions “are likely caused by iron-laden activated microglia and macrophages that accumulate on the edges of lesions.”

It had been known that when lesions are surrounded by iron rings, they are more likely to enlarge and become increasingly hypointense on T1 weighted MRI. In addition, patients with more disability are more likely to have iron-rimmed brain lesions, said Dr. Clarke, and iron rings are associated with chronic disease activity. “Iron rings are a proposed marker of continuing inflammation and tissue loss,” she added.


The cross-sectional, single-center study enrolled patients with clinically isolated syndrome (CIS), MS, and conditions which can mimic MS on MRI. Dr. Clarke and her coinvestigators looked at the frequency of lesions with the central vein sign, and with iron rings, in all patients.

An additional aim of the study was to compare how experienced and inexperienced raters fared in their identification of both central veins and iron rings in 25 scans randomly chosen from within the study population. Inter-rater reliability between experienced and inexperienced raters was assessed as good, with little difference between experience levels in detecting iron rings and central veins, said Dr. Clarke.

Criteria used for central vein determination were those established by the North American Imaging in MS initiative, said Dr. Clarke: The vein needs to be seen entering and/or exiting the lesion, and the vein must course through the lesion’s center. If lesions are confluent, each of the larger lesion’s “fingers” must be assessed individually.

Iron rings appear as a hypointense area rimming the lesion’s edge; for the study, an iron ring was considered present if it could be seen fully or partially encircling a lesion, and if the ring was visible on at least two slices.

The study enrolled 103 patients with relapsing-remitting MS, 49 with progressive MS, 112 with CIS, and 35 non-MS patients; about 60% of this latter group had either autoimmune or vascular disease.

The fewest white matter lesions – a median of 4 per patient – were seen in the CIS group, while the progressive MS and non-MS groups each had a median of 7 lesions, and the relapsing-remitting MS group had a median of 10 lesions.

In all, 2,617 lesions were analyzed, and 1,352 were assessed as having the central vein sign. Patients with MS or CIS had central vein sign in more than 50% of their lesions, while the non-MS patients had fewer than 20% central vein–positive lesions. In CIS and MS patients, central vein–positive lesions occurred more frequently in the periventricular and subcortical regions, compared with other brain regions (P less than .001).

Iron rings were detected in 392 lesions; none of the non-MS patients had iron ring–positive lesions. In terms of the brain regions where iron rings were most likely to be seen, said Dr. Clarke, “Over half of all iron ring-positive lesions were periventricular.” This finding was statistically significant as well (P less than .001). At least one lesion with an iron ring was seen in 59% of relapsing-remitting MS patients, 39% of progressive MS patients, and 48% of CIS patients.

In terms of patient characteristics, men were 40% more likely to have iron ring–positive lesions, and patients with relapsing-remitting MS were 50% more likely than were patients with CIS to have iron rings. Iron rings became 3% less likely for each additional year of age, as well (P less than .01 for all comparisons).

“Our results show that iron ring numbers peak in relapsing-remitting MS and decrease with longer disease duration,” Dr. Clarke and colleagues reported.

Dr. Clarke acknowledged several limitations of the study, including its single-center and retrospective nature, as well as the relatively low numbers of non-MS patients and patients with progressive MS. She and her colleagues are planning larger studies using 5-year follow-up data, she said.

Dr. Clarke is an ECTRIMS-MAGNIMS fellow and reported a speaker honorarium from Novartis.

SOURCE: Clarke M et al. ECTRIMS 2019. Abstract 108.

Issue: Neurology Reviews 27(11)


REPORTING FROM ECTRIMS 2019

Publish date: September 12, 2019

Supercooling extends donor liver viability by 27 hours

Article Type
Changed
Thu, 09/26/2019 - 10:10

A new supercooling process could extend ex vivo liver viability by more than a day, potentially expanding transplant availability, according to investigators.

A machine perfusion process helps supercool a human liver without freezing the tissue. (Massachusetts General Hospital)

Standard cooling to 4°C provides just 12 hours of organ preservation, but laboratory testing showed that supercooling to –4°C added 27 hours of viability, reported lead author Reinier J. de Vries, MD, of Harvard Medical School and Massachusetts General Hospital in Boston, and colleagues.

“The absence of technology to preserve organs for more than a few hours is one of the fundamental causes of the donor organ–shortage crisis,” the investigators wrote in Nature Biotechnology.

Supercooling organs to high-subzero temperatures has been shown to prolong organ life while avoiding ice-mediated injury, but techniques that are successful for rat livers have been difficult to translate to human livers because of their larger size, which increases the risk of ice formation, the investigators explained.

Three strategies were employed to overcome this problem: minimization of air-liquid interfaces, development of a new supercooling-preservation solution, and hypothermic machine perfusion to more evenly distribute preservation solution throughout the liver tissue. For recovery of organs after supercooling, the investigators used subnormothermic machine perfusion, which has been used effectively in rat transplants.

To measure the impact of this process on organ viability, the investigators first measured adenylate energy content both before supercooling and after recovery.

“Adenylate energy content, and, particularly, the organ’s ability to recover it during (re)perfusion, is considered the most representative metric for liver viability,” they wrote.

The difference between pre- and postsupercooling energy charge was less than 20%; in comparison, failed liver transplants in large animals and clinical trials have typically involved an energy-charge loss of 40% or more.

To further test organ viability, the investigators measured pre- and postsupercooling levels of bile production, oxygen uptake, and vascular resistance. All of these parameters have been shown to predict transplant success in rats, and bile production has additional precedent from human studies.

On average, bile production, portal resistance, and arterial resistance were not significantly affected by supercooling. Although portal vein resistance was 20% higher after supercooling, this compared favorably with increases of 100%-150% that have been measured in nonviable livers. Similarly, oxygen uptake increased by a mean of 17%, but this was three times lower than changes that have been observed in livers with impaired viability, at 51%.

Additional measures of hepatocellular injury, including AST and ALT, were also supportive of viability after supercooling. Histopathology confirmed these findings by showing preserved tissue architecture.

“In summary, we find that the human livers tested displayed no substantial difference in viability before and after extended subzero supercooling preservation,” the investigators wrote.

To simulate transplantation, the investigators reperfused the organs with blood at a normal temperature, including platelets, complement, and white blood cells, which are drivers of ischemia reperfusion injury. During this process, energy charge remained stable, which indicates preserved mitochondrial function. While energy charge held steady, lactate metabolism increased with bile and urea production, suggesting increased liver function. Bile pH and HCO3– levels fell within range for viability. Although bile glucose exceeded proposed criteria, the investigators pointed out that levels still fell within parameters for research-quality livers. Lactate levels also rose within the first hour of reperfusion, but the investigators suggested that this finding should be interpreted with appropriate context.

“It should be considered that the livers in this study were initially rejected for transplantation,” they wrote, “and the confidence intervals of the lactate concentration at the end of reperfusion largely overlap with time-matched values reported by others during [normothermic machine perfusion] of rejected human livers.”

Hepatocellular injury and histology were also evaluated during and after simulated transplantation, respectively, with favorable results. Although sites of preexisting hepatic injury were aggravated by the process, and rates of apoptosis increased, the investigators considered these changes clinically insignificant.

Looking to the future, the investigators suggested that further refinement of the process could facilitate even lower storage temperatures while better preserving liver viability.

“The use of human livers makes this study clinically relevant and promotes the translation of subzero organ preservation to the clinic,” the investigators concluded. “However, long-term survival experiments of transplanted supercooled livers in swine or an alternative large animal model will be needed before clinical translation.”

The study was funded by the National Institutes of Health and the Department of Defense. Dr. de Vries and four other coauthors have provisional patent applications related to the study, and one coauthor disclosed a financial relationship with Organ Solutions.

SOURCE: de Vries RJ et al. Nature Biotechnol. 2019 Sep 9. doi: 10.1038/s41587-019-0223-y.

Publications
Topics
Sections

 

A new supercooling process could extend ex vivo liver viability by more than a day, potentially expanding transplant availability, according to investigators.

A machine profusion process helps supercool human liver without freezing the tissue.
Massachusetts General Hospital
A machine profusion process helps supercool human liver without freezing the tissue.

Standard cooling to 4°C provides just 12 hours of organ preservation, but laboratory testing showed that supercooling to –4°C added 27 hours of viability, reported lead author Reinier J. de Vries, MD, of Harvard Medical School and Massachusetts General Hospital in Boston, and colleagues.

“The absence of technology to preserve organs for more than a few hours is one of the fundamental causes of the donor organ–shortage crisis,” the investigators wrote in Nature Biotechnology.

Supercooling organs to high-subzero temperatures has been shown to prolong organ life while avoiding ice-mediated injury, but techniques that are successful for rat livers have been difficult to translate to human livers because of their larger size, which increases the risk of ice formation, the investigators explained.

Three strategies were employed to overcome this problem: minimization of air-liquid interfaces, development of a new supercooling-preservation solution, and hypothermic machine perfusion to more evenly distribute preservation solution throughout the liver tissue. For recovery of organs after supercooling, the investigators used subnormothermic machine perfusion, which has been used effectively in rat transplants.

In order to measure the impact of this process on organ viability, the investigators first measured adenylate energy content, both before supercooling and after recovery.

“Adenylate energy content, and, particularly, the organ’s ability to recover it during (re)perfusion, is considered the most representative metric for liver viability,” they wrote.

The difference between pre- and postsupercooling energy charge was less than 20%; in comparison, failed liver transplants in large animals and clinical trials have typically involved an energy-charge loss of 40% or more.

A new supercooling process could extend ex vivo liver viability by more than a day, potentially expanding transplant availability, according to investigators.

A machine perfusion process helps supercool a human liver without freezing the tissue.
Massachusetts General Hospital

Standard cooling to 4°C provides just 12 hours of organ preservation, but laboratory testing showed that supercooling to –4°C added 27 hours of viability, reported lead author Reinier J. de Vries, MD, of Harvard Medical School and Massachusetts General Hospital in Boston, and colleagues.

“The absence of technology to preserve organs for more than a few hours is one of the fundamental causes of the donor organ–shortage crisis,” the investigators wrote in Nature Biotechnology.

Supercooling organs to high-subzero temperatures has been shown to prolong organ life while avoiding ice-mediated injury, but techniques that are successful for rat livers have been difficult to translate to human livers because of their larger size, which increases the risk of ice formation, the investigators explained.

Three strategies were employed to overcome this problem: minimization of air-liquid interfaces, development of a new supercooling-preservation solution, and hypothermic machine perfusion to more evenly distribute preservation solution throughout the liver tissue. For recovery of organs after supercooling, the investigators used subnormothermic machine perfusion, which has been used effectively in rat transplants.

To measure the impact of this process on organ viability, the investigators first measured adenylate energy content, both before supercooling and after recovery.

“Adenylate energy content, and, particularly, the organ’s ability to recover it during (re)perfusion, is considered the most representative metric for liver viability,” they wrote.

The difference between pre- and postsupercooling energy charge was less than 20%; in comparison, failed liver transplants in large animals and clinical trials have typically involved an energy-charge loss of 40% or more.

To further test organ viability, the investigators measured pre- and postsupercooling levels of bile production, oxygen uptake, and vascular resistance. All of these parameters have been shown to predict transplant success in rats, and bile production has additional precedent from human studies.

On average, bile production, portal resistance, and arterial resistance were not significantly affected by supercooling. Although portal vein resistance was 20% higher after supercooling, this compared favorably with the increases of 100%-150% that have been measured in nonviable livers. Similarly, oxygen uptake increased by a mean of 17%, roughly one-third of the 51% increase that has been observed in livers with impaired viability.

Additional measures of hepatocellular injury, including AST and ALT, were also supportive of viability after supercooling. Histopathology confirmed these findings by showing preserved tissue architecture.

“In summary, we find that the human livers tested displayed no substantial difference in viability before and after extended subzero supercooling preservation,” the investigators wrote.

To simulate transplantation, the investigators reperfused the organs at normal temperature with blood containing platelets, complement, and white blood cells, which are drivers of ischemia-reperfusion injury. During this process, energy charge remained stable, indicating preserved mitochondrial function. At the same time, lactate metabolism, bile production, and urea production increased, suggesting improved liver function. Bile pH and bicarbonate (HCO3–) levels fell within the range for viability. Although bile glucose exceeded proposed criteria, the investigators pointed out that levels still fell within parameters for research-quality livers. Lactate levels also rose within the first hour of reperfusion, but the investigators suggested that this finding should be interpreted in context.

“It should be considered that the livers in this study were initially rejected for transplantation,” they wrote, “and the confidence intervals of the lactate concentration at the end of reperfusion largely overlap with time-matched values reported by others during [normothermic machine perfusion] of rejected human livers.”

Hepatocellular injury and histology were also evaluated during and after simulated transplantation, respectively, with favorable results. Although sites of preexisting hepatic injury were aggravated by the process and rates of apoptosis increased, the investigators considered these changes clinically insignificant.

Looking to the future, the investigators suggested that further refinement of the process could enable even lower storage temperatures while better preserving liver viability.

“The use of human livers makes this study clinically relevant and promotes the translation of subzero organ preservation to the clinic,” the investigators concluded. “However, long-term survival experiments of transplanted supercooled livers in swine or an alternative large animal model will be needed before clinical translation.”

The study was funded by the National Institutes of Health and the Department of Defense. Dr. de Vries and four other coauthors have provisional patent applications related to the study, and one coauthor disclosed a financial relationship with Organ Solutions.

SOURCE: de Vries RJ et al. Nature Biotechnol. 2019 Sep 9. doi: 10.1038/s41587-019-0223-y.

FROM NATURE BIOTECHNOLOGY


Educating teens, young adults about dangers of vaping


 

Physicians have been alarmed about the vaping craze for quite some time. This alarm has grown louder in the wake of news that electronic cigarettes have been associated with a mysterious lung disease.

LiudmylaSupynska/Thinkstock

Public health officials have reported 530 cases of vaping-related respiratory disease,1 and as of press time at least seven deaths had been attributed to vaping.* On Sept. 6, 2019, the Food and Drug Administration, the Centers for Disease Control and Prevention, and other health officials issued an investigation notice on vaping and e-cigarettes,2 cautioning teenagers, young adults, and pregnant women to avoid e-cigarettes completely and warning all users never to buy e-cigarettes off the street or from social sources.

A few days later, on Sept. 9, the FDA’s Center for Tobacco Products issued a warning letter to JUUL Labs, maker of a popular e-cigarette, for illegal marketing of modified-risk tobacco products.3 Then on Sept. 10, health officials in Kansas reported that a sixth person had died of a lung illness related to vaping.4

Researchers have found that 80% of those diagnosed with the vaping illness had used products containing THC, the psychoactive ingredient in marijuana; 61% had used nicotine products; and 7% had used cannabidiol (CBD) products. Vitamin E acetate is another substance that press reports have tied to the severe lung disease.

Most of the patients affected are adolescents and young adults, with an average age of 19 years.5 This comes as vaping among high school students rose 78% between 2017 and 2018.6 According to the U.S. surgeon general, one in five teens vapes. Other data show that most teens who use e-cigarettes have never smoked a traditional cigarette.7 Teens and young adults frequently do not buy* e-cigarette “pods” from gas stations but instead borrow or purchase them from friends or peers. In addition, young people are known to alter the pods to insert other liquids, such as CBD and other marijuana products.

Teens and young adults are at higher risk for vaping complications because their respiratory and immune systems are still developing. Beyond the concerns raised by the recent surge of respiratory illnesses, nicotine is known to suppress the immune system, which makes people who use it more susceptible to viral and bacterial infections and makes it harder for them to recover.

In addition, nicotine hyperactivates the reward centers of the brain, which can trigger addictive behaviors. Because the brain is not fully developed until at or after age 26, nicotine use before then can “prime the pump” of a still-developing brain, increasing the likelihood of addiction to harder drugs. Nicotine also has been shown to disrupt sleep patterns, which are critical for mental and physical health. Lastly, research shows that smoking increases the risk of various psychiatric disorders, such as depression and anxiety. My teen and young adult patients have endlessly debated with me the idea that smoking – either nicotine or marijuana – eases their anxiety or helps them get to sleep. I tell them that, in the long run, the data show that smoking makes those problems worse.8-11

Dr. Lantie Elisabeth Jorandby

Nationally, we are seeing an explosion of multistate legislation pushing marijuana as a health food. E-cigarettes have followed as the “healthy” alternative to traditional tobacco. Unfortunately for our patients, the market has found a new way to promote e-cigarettes as the “cleaner, harmless” substitute to smoking. As clinicians, we must counter those messages.

Finally, our world is now filled with smartphones, sexting, and social media overuse. An entire peer group exists that knows life only with constant electronic stimulation. It is not without irony that our national nicotine obsession has morphed from paper cigarettes to electronic versions. This raises questions: Are teens and young adults using e-cigarettes out of boredom? Are we witnessing a generational ADHD born of restlessness that stems from lives with fewer meaningful face-to-face human interactions?

In addition to educating our teens and young adults about the physical risks tied to vaping, we need to teach them to build meaning into their lives that exists outside of this digital age.

 

 

Dr. Jorandby is chief medical officer of Lakeview Health in Jacksonville, Fla. She trained in addiction psychiatry at Yale University, New Haven, Conn.
 

References

1. CDC. Outbreak of lung injury associated with e-cigarette use, or vaping. 2019 Sep 19. 

2. CDC. Outbreak of lung illness associated with using e-cigarette products. Investigation notice. 2019 Sep 6.

3. FDA. Warning letter, JUUL Labs. 2019 Sep 9.

4. Sixth person dies of vaping-related illness. The Hill. 2019 Sep 10.

5. Layden JE et al. Pulmonary illness related to e-cigarette use in Illinois and Wisconsin – preliminary report. N Engl J Med. 2019 Sep 6. doi: 10.1056/NEJMoa1911614.

6. Cullen KA et al. CDC. MMWR. 2018 Nov 16;67(45):1276-7.

7. National Academies of Sciences, Engineering, and Medicine. Public health consequences of e-cigarettes. 2018.

8. Patton GC et al. Am J Public Health. 1996 Feb;86(2):225-30.

9. Leventhal AM et al. J Psychiatr Res. 2016 Feb;73:71-8.

10. Levine A et al. J Am Acad Child Adolesc Psychiatry. 2017 Mar;56(3):214-25.

11. Leadbeater BJ et al. Addiction. 2019 Feb;114(2):278-93.

* This column was updated 9/24/2019.



Prior antibiotic use lowers checkpoint inhibitor response and survival


 

Prior antibiotic use may be associated with a reduced treatment response to checkpoint inhibitors, and worse outcomes, in patients with cancer, according to investigators.

In a prospective cohort study, researchers followed 196 patients with cancer who were treated with immune checkpoint inhibitors in routine clinical practice.

A total of 22 patients had been treated with a course of broad-spectrum beta-lactam–based antibiotics lasting 7 days or less in the 30 days before starting immune checkpoint inhibitor therapy, and 68 patients were taking broad-spectrum beta-lactam–based antibiotics concurrently with their checkpoint inhibitor therapy.

The analysis revealed that prior antibiotic therapy was associated with almost double the likelihood of a poor response to checkpoint inhibitor therapy (P less than .001) and significantly worse overall survival (2 vs. 26 months). Patients who had received prior antibiotic therapy were also more likely to stop checkpoint inhibitor therapy because of disease progression, and more likely to die of progressive disease while on checkpoint inhibitors.

However, concurrent antibiotic use did not appear to affect either treatment response to checkpoint inhibitors or overall survival.

The most common indication for both prior and concurrent antibiotic use was respiratory tract infection. The researchers examined whether cancer type might contribute to the association; for example, chronic airway disease in lung cancer might mean a higher likelihood of antibiotic use but also a lower treatment response and worse survival.

They found that the association between prior antibiotic therapy and overall survival was consistent across the 119 patients with non–small cell lung cancer, the 38 patients with melanoma, and the 39 patients with other tumor types.

The association was also independent of the class of antibiotic used, the patient’s performance status, and their corticosteroid use.

“Broad-spectrum ATB [antibiotic] use can cause prolonged disruption of the gut ecosystem and impair the effectiveness of the cytotoxic T-cell response against cancer, strengthening the biologic plausibility underlying the adverse effect of ATB therapy on immunotherapy outcomes,” wrote Dr. David J. Pinato, from Imperial College London, and coauthors in JAMA Oncology.

Addressing the question of whether comorbidities might be the mediating factor, the authors pointed out that the use of antibiotics during checkpoint inhibitor therapy – which was a potential indicator of patients’ status worsening during treatment – was not associated with reduced response to treatment or lower overall survival.

“Although provision of cATB [concurrent antibiotic] therapy appears to be safe in the context of immunotherapy, clinicians should carefully weigh the pros and cons of prescribing broad-spectrum ATBs prior to ICI [immune checkpoint inhibitor] treatment,” they wrote.

The study was supported by the Imperial College National Institute for Health Research Biomedical Research Centre, the Imperial College Tissue Bank, the Imperial Cancer Research U.K. Centre, the National Institute for Health Research, and the Wellcome Trust Strategic Fund. Two authors reported receiving grant funding and personal fees from the pharmaceutical sector unrelated to the study.

SOURCE: Pinato D et al. JAMA Oncol. 2019 Sep 12. doi: 10.1001/jamaoncol.2019.2785.


SOURCE: Pinato D et al. JAMA Oncol. 2019 Sep 12. doi: 10.1001/jamaoncol.2019.2785.


FROM JAMA ONCOLOGY

Vitals

 

Key clinical point: People who take antibiotics prior to checkpoint inhibitor therapy have lower treatment response and overall survival.

Major finding: Prior antibiotic use is associated with a nearly 100% greater likelihood of poor response to checkpoint inhibitor therapy.

Study details: A prospective cohort study involving 196 patients receiving checkpoint inhibitor therapy for cancer.

Disclosures: The study was supported by the Imperial College National Institute for Health Research Biomedical Research Centre, the Imperial College Tissue Bank, the Imperial Cancer Research U.K. Centre, the National Institute for Health Research, and the Wellcome Trust Strategic Fund. Two authors reported receiving grant funding and personal fees from the pharmaceutical sector unrelated to the study.

Source: Pinato D et al. JAMA Oncol. 2019 Sep 12. doi: 10.1001/jamaoncol.2019.2785.


Ponesimod reduces annualized relapse rate, compared with teriflunomide


 

Ponesimod reduces annualized relapse rate, compared with teriflunomide, in adults with relapsing multiple sclerosis (MS), according to research presented at ECTRIMS 2019. Ponesimod also reduces fatigue and the number of active lesions, compared with teriflunomide.

Dr. Ludwig Kappos, head of the department of neurology at University Hospital Basel

Ponesimod selectively modulates the sphingosine-1-phosphate receptor 1 (S1P1). The drug is administered orally and reduces circulating lymphocyte counts by inducing a rapid, dose-dependent, and reversible sequestration of lymphocytes in lymphoid organs. This effect decreases the number of immune cells available for inflammatory attacks in the CNS, said Ludwig Kappos, MD, head of the department of neurology at University Hospital Basel (Switzerland). The drug has no active metabolites, and its effects on the immune system are reversible.

Dr. Kappos and colleagues conducted the OPTIMUM phase 3 study to assess the efficacy and safety of oral ponesimod, compared with those of teriflunomide. They enrolled patients between ages 18 and 55 years with an established diagnosis of MS according to the 2010 McDonald criteria with a relapsing course from onset into the multicenter, randomized, double-blind, superiority study. Eligible patients had an Expanded Disability Status Scale (EDSS) score of 0 to 5.5 inclusive and recent clinical or MRI disease activity. Dr. Kappos and colleagues randomized participants in equal groups to receive ponesimod (20 mg/day) or teriflunomide (14 mg/day) and the respective placebo for 108 weeks. To mitigate the potential effects on heart rate that are associated with S1P1 modulators, patients were titrated gradually from 2 mg/day to the target dose over 14 days.

The trial’s primary endpoint was the annualized relapse rate over 108 weeks. Secondary endpoints were the effect on fatigue-related symptoms, as assessed with Fatigue Symptom and Impact Questionnaire-Relapsing MS (FSIQ-RMS); active lesions on MRI to week 108; and time to 12- and 24-week confirmed disability accumulation to end of study. The investigators also assessed the drugs’ safety and tolerability.

Dr. Kappos and colleagues randomized 1,133 patients at 162 sites in 28 countries. They stratified randomization according to whether participants had received prior disease-modifying treatment in the previous 2 years (39.4% had, and 60.6% had not) and EDSS score at baseline (83.4% had a score of 3.5 or lower, and 16.6% had a score above 3.5). The population’s mean age was 36.7 years, and 65% of participants were female. Most patients were recruited in Europe, and 51% came from E.U. countries. Patients’ mean baseline EDSS score was 2.6, and mean disease duration was 7.6 years. The mean prestudy 12-month relapse rate was 1.3, and 483 (42.7%) patients had one or more gadolinium-enhancing T1 lesions on baseline MRI. The two treatment groups were well balanced. The rate of treatment discontinuation was 16.6% for ponesimod and 16.4% for teriflunomide.

At the end of the study, the annualized relapse rate was 0.202 in the ponesimod group and 0.290 in the teriflunomide group. Compared with teriflunomide, ponesimod significantly reduced the annualized relapse rate by 30.5%. Fatigue remained stable in the ponesimod group but worsened in the teriflunomide group: The mean difference in FSIQ-RMS score between the arms at week 108 was 3.57, and this result was statistically significant. In addition, ponesimod significantly reduced the number of active lesions by 56%, compared with teriflunomide. The risks of 12- and 24-week confirmed disability accumulation were lower with ponesimod than with teriflunomide, but the differences were not statistically significant.
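The 30.5% figure is the model-adjusted rate-ratio estimate; the raw annualized relapse rates reported above give nearly the same answer, as a minimal arithmetic check shows:

```python
arr_ponesimod = 0.202     # annualized relapse rate, ponesimod arm
arr_teriflunomide = 0.290  # annualized relapse rate, teriflunomide arm

# Unadjusted relative reduction; the published 30.5% comes from the
# adjusted rate-ratio model, so the naive calculation lands slightly lower.
reduction_pct = (1 - arr_ponesimod / arr_teriflunomide) * 100
print(round(reduction_pct, 1))  # 30.3
```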

The rates of treatment-emergent adverse events were approximately 89% for the ponesimod arm and 88% for teriflunomide. The rates of serious adverse events were about 9% for ponesimod and about 8% for teriflunomide. Respiratory events and laboratory values prompted slightly more study discontinuations in the ponesimod group than in the teriflunomide group.

This research represents the first controlled study to show superior efficacy of oral ponesimod, compared with an approved oral compound, said Dr. Kappos. “The overall profile suggests that [ponesimod] may be a valuable addition to our armamentarium in treating patients with relapsing forms of MS,” he concluded.

The study was supported by Actelion Pharmaceuticals. University Hospital Basel, where Dr. Kappos works, received steering committee, advisory board, and consultancy fees from Actelion and other companies.

SOURCE: Kappos L et al. ECTRIMS 2019, Abstract 93.

Issue
Neurology Reviews 27(10)

 



REPORTING FROM ECTRIMS 2019

Publish date: September 12, 2019
Vitals

 

Key clinical point: Ponesimod reduces the number of confirmed MS relapses, compared with teriflunomide.

Major finding: Annualized relapse rate was 30.5% lower with ponesimod, compared with teriflunomide.

Study details: A randomized, double-blind, superiority study of 1,133 patients with relapsing-remitting MS.

Disclosures: Actelion Pharmaceuticals sponsored the study.

Source: Kappos L et al. ECTRIMS 2019, Abstract 93.


Continuous treatment reduces risk of confirmed disability progression in MS


Pooled data from several national multiple sclerosis (MS) registries indicate that continuous exposure to disease-modifying therapy (DMT) for more than 10 years reduces the risk of confirmed disability progression (CDP), according to an investigation presented at the annual congress of the European Committee for Treatment and Research in Multiple Sclerosis.

Using several confirmation points for Expanded Disability Status Scale (EDSS) progression (e.g., 12 months and 24 months), researchers detected a clear gradient of treatment effect. Identification of the most reliable outcome definitions will require further investigations, they said.
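The gradient described above can be made concrete: lengthening the confirmation window makes an EDSS worsening harder to count as a confirmed event. The sketch below flags progression only if it is sustained at a later visit at least `confirm_days` afterward; the 1.0-point threshold, single fixed baseline, and visit structure are illustrative simplifications, not the registries' exact definitions.

```python
def confirmed_progression(visits, baseline_edss, threshold=1.0,
                          confirm_days=365):
    """Return True if an EDSS increase of at least `threshold` over
    baseline is sustained at a later visit occurring at least
    `confirm_days` after the initial worsening.

    `visits` is a list of (day, edss) pairs sorted by day. This is an
    illustrative simplification of registry CDP definitions.
    """
    for i, (day_i, edss_i) in enumerate(visits):
        if edss_i - baseline_edss >= threshold:
            # Look for a confirming visit far enough in the future.
            for day_j, edss_j in visits[i + 1:]:
                if (day_j - day_i >= confirm_days
                        and edss_j - baseline_edss >= threshold):
                    return True
    return False

visits = [(0, 2.0), (180, 3.5), (400, 3.5), (800, 4.0)]
print(confirmed_progression(visits, 2.0, confirm_days=365))  # True
print(confirmed_progression(visits, 2.0, confirm_days=730))  # False
```

With a 12-month window the day-180 worsening is confirmed by the day-800 visit, but a 24-month window finds no confirming visit, so the same trajectory yields a CDP12 event and no CDP24 event.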

“The ultimate goal of MS treatment is the prevention of long-term disability accumulation,” said Giuseppe Lucisano, a biostatistician at the Center for Outcomes Research and Clinical Epidemiology in Pescara, Italy. “Continuous DMT exposure can impact long-term disability accumulation in MS, but it has not been definitively demonstrated yet.”

Registries and clinical databases provide the opportunity to collect longitudinal data for treated and untreated patients as a means of investigating questions such as this one, the researchers said. The Danish, Italian, and Swedish national MS registries, MSBase, and the Observatoire Français de la Sclérose en Plaques (OFSEP) merged their data in the Big Multiple Sclerosis Data (BMSD) Network, which includes approximately 150,000 patients and more than 470,000 EDSS evaluations. The result is a large dataset suitable for long-term longitudinal studies.

 

 


Mr. Lucisano and colleagues sought to examine the long-term effect of DMTs on CDP and irreversible disability milestones (i.e., EDSS scores of 4 and 6) in relapsing-remitting MS. The researchers used marginal structural proportional models, a technique that allows modeling to be corrected for confounders that vary with time in longitudinal observational studies. Such confounders include treatment switches, on-treatment relapses, and treatment gaps.

The investigators selected patients with 10 or more years’ follow-up and one or more EDSS score evaluations per year from the BMSD pooled cohort. Using marginal structural proportional models, the investigators evaluated cumulative hazards of 3-, 12- and 24-month CDP (i.e., CDP3, CDP12, CDP24) events in 6-month periods. They created stabilized inverse probability of treatment weights (IPTWs) at each 6-month period using survival models according to treatment status (i.e., treated versus untreated). Treatment status was assigned for each patient according to the percentage of time that he or she spent receiving DMT in each 6-month period. A patient who received treatment for 70% or more of the period studied was considered treated; patients who did not meet this threshold were considered untreated. The weights were calculated on the basis of sex, age, occurrence of relapse, EDSS score, and registry source. Finally, the researchers used Cox regression models estimating the effect of DMTs on the risk of reaching CDP3, CDP12, and CDP24, adjusted by the IPTWs, to compare cohorts that remained treated or untreated throughout follow-up.
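Two mechanical pieces of that pipeline can be sketched directly: the 70% coverage rule that labels each 6-month period, and the stabilized inverse probability of treatment weight, which divides the marginal probability of the observed treatment status by its probability given covariates. The probabilities below are placeholders supplied by hand; in the study they came from survival models fit on sex, age, relapses, EDSS score, and registry source.

```python
def period_status(days_on_dmt: int, period_days: int = 182,
                  threshold: float = 0.70) -> str:
    """Label a 6-month period 'treated' if the patient was on a DMT
    for at least 70% of it, else 'untreated' (the study's coverage rule)."""
    return "treated" if days_on_dmt / period_days >= threshold else "untreated"

def stabilized_weight(p_marginal: float, p_conditional: float) -> float:
    """Stabilized IPTW: P(observed status) / P(observed status | covariates).
    Both probabilities are placeholders here; the study estimated them
    from survival models at each 6-month period."""
    return p_marginal / p_conditional

print(period_status(150))  # treated   (150/182, about 82% coverage)
print(period_status(100))  # untreated (about 55% coverage)
print(stabilized_weight(0.6, 0.8))  # 0.75
```

Stabilizing by the marginal probability keeps the weights near 1 and reduces the variance inflation that plain inverse-propensity weights can produce.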

The investigators identified a cohort of 15,602 patients with relapsing-remitting MS, and this group had 312,040 EDSS score evaluations. Approximately 28% of patients were male. Median age at disease onset was 28.3 years, and median disease duration was 18.7 years. Median follow-up duration was 13.8 years.

During follow-up, 43.3% of patients had CDP3, 27.7% had CDP12, and 14.4% had CDP24 events. In addition, 23.6% of patients reached an EDSS score of 4, and 11.2% reached an EDSS score of 6.

Cox models adjusted by IPTW showed increasingly strong evidence of an effect of cumulative treatment exposure, compared with cumulative untreated epochs, as the confirmation time used to define CDP lengthened. The investigators did not observe an effect of treatment on the probability of reaching CDP3 (hazard ratio [HR], 1.02), but treatment had a protective effect on the CDP12 (HR, 0.90) and CDP24 (HR, 0.65) endpoints. During treated epochs, the HR of reaching an EDSS score of 4 was 0.89, and the HR of reaching an EDSS score of 6 was 0.86. Sensitivity analyses largely confirmed the results of the main analysis.

Two of the researchers are employees of Biogen International, which supported the research. Several investigators received compensation or funding from various pharmaceutical companies.

SOURCE: Iaffaldano P et al. ECTRIMS 2019, Abstract 94.




REPORTING FROM ECTRIMS 2019
