AML variants before transplant signal need for aggressive therapy

– Patients with acute myeloid leukemia who were in morphological complete remission prior to allogeneic hematopoietic cell transplant but had genomic evidence of a lingering AML variant had worse posttransplant outcomes when they underwent reduced-intensity conditioning, rather than myeloablative conditioning, investigators reported.


Among adults with AML in remission after induction therapy who were randomized in a clinical trial to either reduced-intensity conditioning (RIC) or myeloablative conditioning (MAC) before transplant, those with AML variants still detectable on ultra-deep genomic sequencing had a significantly greater risk for relapse, shorter disease-free survival (DFS), and worse overall survival (OS) if they underwent RIC rather than MAC, reported Christopher S. Hourigan, DM, DPhil, of the Laboratory of Myeloid Malignancies at the National Heart, Lung, and Blood Institute in Bethesda, Md.

The findings suggest that those patients with pretransplant AML variants who can tolerate MAC should get it, and that investigators need to find new options for patients who can’t, he said in an interview at the annual congress of the European Hematology Association.

“If I wasn’t a lab investigator and was a clinical trialist, I would be very excited about doing some randomized trials now to try to see about novel targeted agents. For example, we have FLT3 inhibitors, we have IDH1 and IDH2 inhibitors, and I would be looking to try to combine reduced-intensity conditioning with additional therapy to try to lower the relapse rate for that group at the highest risk,” he said.

Previous studies have shown that, regardless of the method used – flow cytometry, quantitative polymerase chain reaction, or next-generation sequencing – minimal residual disease (MRD) detected in patients with AML in complete remission prior to transplant is associated with both cumulative incidence of relapse and worse overall survival.
 

Measurable, not minimal

Dr. Hourigan contends that the word “minimal” – the “M” in “MRD” – is a misnomer and should be replaced by the word “measurable,” because MRD really reflects the limitations of disease-detection technology.

“If you tell patients ‘you have minimal residual disease, and you have a huge chance of dying over the next few years,’ there’s nothing minimal about that,” he said.

The fundamental question that Dr. Hourigan and colleagues asked is, “Is MRD just useful for predicting prognosis? Is this fate, or can we as doctors do something about it?”

To get answers, they examined whole-blood samples from patients enrolled in the BMT CTN 0901 trial, which compared survival and other outcomes following allogeneic hematopoietic stem cell transplantation (allo-HSCT) with either RIC or MAC as pretransplant conditioning in patients with AML or myelodysplastic syndromes.

The trial was halted early after just 272 of a planned 356 patients were enrolled, following evidence of a significantly higher relapse rate among patients who had undergone RIC.

“Strikingly, over half the AML patients receiving RIC relapsed within 18 months after getting transplants,” Dr. Hourigan said.
 

Relapse, survival differences

For this substudy, the National Institutes of Health investigators developed a custom 13-gene panel that would detect at least one AML variant in approximately 80% of patients who were included in a previous study of genomic classification and prognosis in AML.

They used ultra-deep genomic sequencing to look for variants in blood samples from 188 patients in BMT CTN 0901. There were no variants detected in the blood of 31% of patients who had undergone MAC or in 33% of those who had undergone RIC.

Among patients who did have detectable variants, the average number of variants per patient was 2.5.

In this cohort, 3-year transplant-related mortality (TRM) was higher with MAC than with RIC (27% vs. 20%), but within each conditioning arm TRM did not differ between patients with and without AML variants.

Relapse rates in the cohort studied by Dr. Hourigan and his colleagues were virtually identical to those seen in the full study set, with an 18-month relapse rate of 16% for patients treated with MAC vs. 51% for those treated with RIC.

Among patients randomized to RIC, 3-year relapse rates were 57% for patients with detectable pretransplant AML variants, compared with 32% for those without variants (P less than .001).

Although there were no significant differences in 3-year OS by variant status among patients assigned to MAC, variant-positive patients assigned to RIC had significantly worse 3-year OS than those without variants (P = .04).

Among patients with no detectable variants, there were no significant differences in OS between the MAC and RIC arms. Among patients with variants, however, survival was significantly worse with RIC (P = .02).

In multivariate analysis controlling for disease risk and donor group among patients who tested positive for an AML variant pretransplant, RIC was significantly associated with an increased risk for relapse (hazard ratio, 5.98; P less than .001), decreased DFS (HR, 2.80; P less than .001), and worse OS (HR, 2.16; P = .003).
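
For readers who want to see the shape of such an analysis, below is a minimal sketch of a multivariable Cox proportional hazards model with the covariates the article names (conditioning intensity, disease risk, donor group). It is not the investigators’ code or data: the lifelines library, the variable names, and all of the synthetic numbers are illustrative assumptions.

```python
# Minimal, self-contained sketch of a multivariable Cox model of relapse risk
# (illustrative only; synthetic data, not the BMT CTN 0901 dataset).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 188  # same order of magnitude as the sequenced cohort

df = pd.DataFrame({
    "ric": rng.integers(0, 2, n),              # 1 = reduced-intensity conditioning
    "high_disease_risk": rng.integers(0, 2, n),
    "unrelated_donor": rng.integers(0, 2, n),
})

# Simulate months to relapse so that RIC carries a higher hazard,
# then administratively censor at 36 months (a 3-year window).
hazard = 0.02 * np.exp(1.7 * df["ric"] + 0.4 * df["high_disease_risk"])
df["months_to_relapse"] = rng.exponential(1 / hazard)
df["relapsed"] = (df["months_to_relapse"] < 36).astype(int)
df["months_to_relapse"] = df["months_to_relapse"].clip(upper=36)

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_relapse", event_col="relapsed")
cph.print_summary()  # exp(coef) for "ric" is the adjusted hazard ratio for relapse
```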

“This study provides evidence that intervention for AML patients with evidence of MRD can result in improved survival,” Dr. Hourigan said.

Questions that still need to be addressed include whether variants in different genes confer different degrees of relapse risk, whether next-generation sequencing positivity is equivalent to MRD positivity, and whether the 13-gene panel could be improved upon to lower the chance for false negatives, he said.

The study was supported by the NIH. Dr. Hourigan reported research funding from Merck and Sellas Life Sciences AG, research collaboration with Qiagen and Archer, advisory board participation as an NIH official duty for Janssen and Novartis, and part-time employment with the Johns Hopkins School of Medicine.

SOURCE: Hourigan CS et al. EHA Congress, Abstract LB2600.

For tough AML, half respond to selinexor plus chemotherapy

– Patients with relapsed or refractory acute myeloid leukemia (AML) may be more likely to respond when selinexor is added to standard chemotherapy, according to investigators.


In a recent phase 2 trial, selinexor given with cytarabine and idarubicin led to a 50% overall response rate, reported lead author Walter Fiedler, MD, of University Medical Center Hamburg-Eppendorf (Germany). This response rate is at the upper end of what has been seen in published studies, Dr. Fiedler said at the annual congress of the European Hematology Association.

He also noted that giving a flat dose of selinexor improved tolerability in the trial, a significant finding in light of common adverse events and recent concerns from the Food and Drug Administration about the safety of selinexor for patients with multiple myeloma.

“The rationale to employ selinexor in this study is that there is a synergy between anthracyclines and selinexor,” Dr. Fiedler said, which may restore anthracycline sensitivity in relapsed or refractory patients. “Secondly, there is a c-myc reduction pathway that leads to a reduction of DNA damage repair genes such as Rad51 and Chk1, and this might result in inhibition of homologous recombination.”

The study involved 44 patients with relapsed or refractory AML, of whom 17 (39%) had previously received stem cell transplantation and 11 (25%) exhibited therapy-induced or secondary disease. The median patient age was 59.5 years.

Patients were given idarubicin 10 mg/m² on days 1, 3, and 5, and cytarabine 100 mg/m² on days 1-7. Initially, selinexor was given at a dose of 40 mg/m² twice per week for 4 weeks, but this led to high rates of febrile neutropenia and grade 3 or higher diarrhea, along with prolonged aplasia. In response, after the first 27 patients, the dose was reduced to a flat 60 mg given twice weekly for 3 weeks.

For patients not undergoing transplantation after the first or second induction cycle, selinexor maintenance monotherapy was offered for up to 1 year.

The primary endpoint was overall remission rate, reported as complete remission, complete remission with incomplete blood count recovery, and morphological leukemia-free status. Secondary endpoints included the rate of partial remissions, percentage of patients being transplanted after induction, early death rate, overall survival, event-free survival, and relapse-free survival.



The efficacy analysis revealed an overall response rate of 50%. A total of 9 patients had complete remission (21.4%), 11 achieved complete remission with incomplete blood count recovery (26.2%), and 1 exhibited morphological leukemia-free status (2.4%). Of note, almost half (47%) of the patients who had relapsed after previous stem cell transplantation responded, as did three-quarters of those with an NPM1 mutation. After a median follow-up of 8.2 months, median overall survival was 8.2 months, relapse-free survival was 17.7 months, and event-free survival was 4.9 months.

Adverse events occurred frequently, with a majority of patients experiencing nausea (86%), diarrhea (83%), vomiting (74%), decreased appetite (71%), febrile neutropenia (67%), fatigue (64%), leukopenia (62%), thrombocytopenia (62%), or anemia (60%).

Grade 3 or higher adverse events were almost as common, and included febrile neutropenia (67%), leukopenia (62%), thrombocytopenia (62%), anemia (57%), and diarrhea (50%). Reducing the dose did improve tolerability, with notable drops in the rate of severe diarrhea (56% vs. 40%) and febrile neutropenia (85% vs. 33%). In total, 19% of patients discontinued treatment because of adverse events.

A total of 25 patients (60%) died during the study, with about half dying from disease progression (n = 12), and fewer succumbing to infectious complications, graft-versus-host disease, multiorgan failure, multiple brain infarct, or asystole. Two deaths, one from suspected hemophagocytosis and another from systemic inflammatory response syndrome, were considered possibly related to selinexor.

“The results should be further evaluated in a phase 3 study,” Dr. Fiedler said. However, plans for this are not yet underway, he said, adding that Karyopharm Therapeutics will be focusing its efforts on selinexor for myeloma first.

The study was funded by Karyopharm. Dr. Fiedler reported financial relationships with Amgen, Pfizer, Jazz Pharmaceuticals, and other companies.

SOURCE: Fiedler W et al. EHA Congress, Abstract S880.

Day 75 is key threshold in FVIII inhibitor development

For previously untreated patients with severe hemophilia A, the risk of developing factor VIII (FVIII) inhibitors becomes marginal after 75 exposure days, according to an observational study of more than 1,000 infants on prophylaxis.

“Most inhibitors develop during the first 50 exposure days to FVIII, with 50% of inhibitors already present after 14-15 exposure days,” wrote H. Marijke van den Berg, MD, PhD, of the PedNet Haemophilia Research Foundation, Baarn, the Netherlands. The findings were published in Blood.

Dr. van den Berg and her colleagues aimed to characterize the risk of inhibitor development beyond 50 exposure days and to calculate the age when patients reach near-zero risk. The researchers followed 1,038 previously untreated patients with severe hemophilia A from first exposure to FVIII until inhibitor development, up to a maximum of 1,000 exposure days. Data were obtained from the PedNet Haemophilia Registry.

From the initial cohort, 943 patients (91%) were followed until 50 exposure days, and 899 (87%) were followed until 75 exposure days. Inhibitor development was defined by a minimum of two positive inhibitor titers in conjunction with reduced in-vivo FVIII recovery. The team also conducted a survival analysis to measure inhibitor incidence and reported median ages at initial exposure and at exposure day 75.



The researchers found that 298 of 300 (99.3%) occurrences of inhibitor development took place within 75 exposure days. No inhibitors developed between exposure days 75 and 150; the final two occurrences arose at exposure days 249 and 262, each with a low titer. The median age was 1.1 years at first exposure and 2.3 years at exposure day 75.

“Our study shows that children on prophylaxis reach a near-zero risk plateau of inhibitor development at 75 [exposure days] only 1.2 years after the first [exposure day],” they wrote.

The researchers explained that these findings could impact the design of future clinical studies for previously untreated patients with severe hemophilia A. And they noted that these data are applicable to patients administered early prophylaxis, since the majority of study participants began prophylaxis early in life. “Frequent testing for inhibitors until 75 instead of 50 exposure days, therefore, is feasible and should be recommended for all [previously untreated patients],” they concluded.

No funding sources were reported. The authors reported having no conflicts of interest.

SOURCE: van den Berg HM et al. Blood. 2019 Jun 11. doi: 10.1182/blood.2019000658.

Postholiday colonoscopies have lower rates of good bowel prep

Inadequate bowel preparation was significantly more likely for colonoscopies performed after holidays, according to a single-center retrospective study presented at the annual Digestive Disease Week.

Of patients whose colonoscopies were performed the day after a holiday, 55.4% had inadequate bowel preparation, compared with 45.7% of those receiving colonoscopies on other days, for an odds ratio of 1.5 for inadequate preparation on the day after a holiday (95% confidence interval, 1.1-1.9; P = .006).
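
As a quick arithmetic aside (not part of the presentation), the reported odds ratio can be reproduced from those two proportions, assuming the 1.5 figure is the unadjusted ratio; the confidence interval would require the underlying counts, which are not given here.

```python
# Back-of-the-envelope check of the reported unadjusted odds ratio
# using only the two proportions given above.
def odds(p: float) -> float:
    """Convert a proportion to odds."""
    return p / (1 - p)

post_holiday = 0.554  # inadequate prep, colonoscopy the day after a holiday
other_days = 0.457    # inadequate prep, all other days

odds_ratio = odds(post_holiday) / odds(other_days)
print(f"Unadjusted OR: {odds_ratio:.2f}")  # ~1.48, consistent with the reported 1.5
```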

In addition to the lead finding, inadequate bowel prep was also more likely in the afternoon and earlier in the week (OR, 1.6 and 1.3, respectively), said Ammar Nassri, MD, a gastroenterology fellow at the University of Florida, Jacksonville.

Male patients and white patients were more likely to have inadequate bowel preparation (OR, 1.3 and 2.7, respectively). Having Medicaid, as opposed to other forms of insurance, also raised the likelihood of inadequate bowel preparation (OR, 1.9).

It’s important to identify modifiable factors associated with inadequate bowel preparation for a number of reasons. Among them, said Dr. Nassri, is cost-effectiveness: Screening colonoscopy has been found to be cost effective, compared with fecal immunochemical testing, only when the inadequate bowel prep rate is 13% or less.

Adenomas are more likely to be missed with inadequate bowel preparation as well, he noted, with one study finding missed adenomas on 33% of subsequent colonoscopies performed after an initial colonoscopy with inadequate preparation.

Also, inadequate preparation can mean longer procedures and increased likelihood of failed procedures – and higher costs, he said.

“Several studies have created prediction models to predict the likelihood of having an inadequate bowel preparation, but these models have not gained widespread acceptance,” said Dr. Nassri.

He and his collaborators aimed to identify the rate of inadequate bowel preparation in their patient population, and to examine the association of modifiable variables with adequacy of preparation. These included the day of the week, the time of day, and whether a colonoscopy followed a holiday.

Additionally, the investigators looked at various patient demographic variables to see whether they were associated with adequacy of bowel preparation. Adult patients who received outpatient colonoscopy over a 3-year period were included. Preparation was considered adequate if it was assigned a score of at least 6 on the Boston Bowel Preparation Scale, or at least “fair” on the Aronchik scale.

A total of 6,510 patients were included. The mean age was 56.3 years, and about 60% were female. Just over half (51.3%) were African American; 46.6% were white. Over half of patients (56.4%) had health insurance provided by city contract or Florida Medicaid; the remainder had either Medicare or commercial insurance.

Overall, nearly half of patients (46%) had inadequate bowel preparation. Half of males overall had adequate bowel preparation, compared with 57% of females (P less than .001). As the hour of the colonoscopy grew later, the likelihood of adequacy of bowel preparation dropped. The inverse relationship was statistically significant (P less than .001), with over 60% of 7 a.m. colonoscopies having adequate preparation. By 3 p.m., over 60% of bowel preparations were inadequate in the University of Florida cohort.

Colonoscopies performed later in the week were most likely to have adequate bowel preparation, with rates nearing 60% by Friday, compared with rates just over or at 50% for the first 3 days of the week (P less than .001).

“This study showed that a colonoscopy on the day after a holiday has a higher rate of inadequate bowel preparation,” said Dr. Nassri. Conversely, he said, “Colonoscopy toward the end of the week has a higher likelihood of adequate bowel preparation.”

The present work, he said, “re-demonstrated that procedures done later in the day have a poorer bowel preparation.”

Dr. Nassri reported no conflicts of interest.

Medicare may best Medicare Advantage at reducing readmissions

Although earlier research may suggest otherwise, traditional Medicare may actually do a better job of lowering the risk of hospital readmissions than Medicare Advantage, new research suggests.


Researchers used what they described as “a novel data linkage” comparing 30-day readmission rates after hospitalization for three major conditions in the Hospital Readmissions Reduction Program for patients using traditional Medicare versus Medicare Advantage. Those conditions included acute MI, heart failure, and pneumonia.

“Our results contrast with those of previous studies that have reported lower or statistically similar readmission rates for Medicare Advantage beneficiaries,” Orestis A. Panagiotou, MD, of Brown University, Providence, R.I., and colleagues wrote in a research report published in Annals of Internal Medicine.

In this retrospective cohort study, the researchers linked data from 2011 to 2014 from the Medicare Provider Analysis and Review (MedPAR) file to the Healthcare Effectiveness Data and Information Set (HEDIS).

The novel linkage found that HEDIS data underreported hospital admissions for acute MI, heart failure, and pneumonia, the researchers stated. “Plans incorrectly excluded hospitalizations that should have qualified for the readmission measure, and readmission rates were substantially higher among incorrectly excluded hospitalizations.”

Despite this, in analyses using the linkage of HEDIS and MedPAR, “Medicare Advantage beneficiaries had higher 30-day risk-adjusted readmission rates after [acute MI, heart failure, and pneumonia] than did traditional Medicare beneficiaries,” the investigators noted.

Patients in Medicare Advantage had lower unadjusted readmission rates compared with those in traditional Medicare (16.6% vs. 17.1% for acute MI; 21.4% vs. 21.7% for heart failure; and 16.3% vs. 16.4% for pneumonia). After standardization, Medicare Advantage patients had higher readmission rates, compared with those in traditional Medicare (17.2% vs. 16.9% for acute MI; 21.7% vs. 21.4% for heart failure; and 16.5% vs. 16.0% for pneumonia).

The study authors added that, while unadjusted readmission rates were higher for traditional Medicare beneficiaries, “the direction of the difference reversed after standardization. This occurred because Medicare Advantage beneficiaries have, on average, a lower expected readmission risk [that is, they are ‘healthier’].” Prior studies have documented that Medicare Advantage plans enroll beneficiaries with fewer comorbid conditions and that high-cost beneficiaries switch out of Medicare Advantage and into traditional Medicare.
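
A toy calculation makes that reversal concrete. The sketch below uses a simple indirect-standardization form (observed rate divided by expected risk, scaled by an overall reference rate); the observed rates echo the acute MI figures above, while the expected risks and the reference rate are invented for illustration and are not the study’s actual risk model.

```python
# Illustrative indirect standardization; the expected risks and reference
# rate below are hypothetical, not values from the Panagiotou et al. study.
reference_rate = 0.169  # assumed overall readmission rate

# (observed readmission rate, average expected risk from case mix)
groups = {
    "Medicare Advantage": (0.166, 0.160),    # healthier case mix -> lower expected risk
    "Traditional Medicare": (0.171, 0.173),
}

for name, (observed, expected) in groups.items():
    standardized = (observed / expected) * reference_rate
    print(f"{name}: observed {observed:.1%}, standardized {standardized:.1%}")

# The Medicare Advantage group's observed rate is lower, but dividing by its
# smaller expected risk lifts its standardized rate above traditional
# Medicare's, which is the direction reversal described in the article.
```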

The researchers suggested four reasons why their results differ from those of earlier comparisons of traditional Medicare and Medicare Advantage patients: the new study used a more comprehensive data set; it adjusted for comorbid conditions “from a well-validated model applied by CMS [Centers for Medicare & Medicaid Services]”; it used national data focused on three conditions in the Hospital Readmissions Reduction Program; and it included patients discharged to settings other than skilled nursing facilities and inpatient rehabilitation facilities.

Authors of an accompanying editorial urged caution in interpreting Medicare Advantage enrollment as a cause of increased readmission risk.

“[The] results are sensitive to adjustment for case mix,” wrote Peter Huckfeldt, PhD, of the University of Minnesota, Minneapolis, and Neeraj Sood, PhD, of the University of Southern California, Los Angeles, in the editorial published in Annals of Internal Medicine (2019 June 25. doi: 10.7326/M19-1599). “Using diagnosis codes on hospital claims for case-mix adjustments may be increasingly perilous. ... To our knowledge, there is no recent evidence comparing the intensity of diagnostic coding between clinically similar [traditional Medicare] and [Medicare Advantage] hospital admissions, but if [traditional Medicare] enrollees were coded more intensively than [Medicare Advantage] enrollees, this could lead to [traditional Medicare] enrollees having lower risk-adjusted readmission rates due to coding practices.”

The editorialists added that using a cross-sectional comparison of Medicare Advantage and traditional Medicare patients is concerning because a “key challenge in estimating the effect of [Medicare Advantage] is that enrollment is voluntary,” which can lead to a number of analytical concerns.

The researchers concluded that their findings “are concerning because CMS uses HEDIS performance to construct composite quality ratings and assign payment bonuses to Medicare Advantage plans.

“Our study suggests a need for improved monitoring of the accuracy of HEDIS data,” they noted.

The National Institute on Aging provided the primary funding for this study. A number of the authors received grants from the National Institutes of Health during the conduct of the study. No other relevant disclosures were reported.

SOURCE: Panagiotou OA et al. Ann Intern Med. 2019 Jun 25. doi: 10.7326/M18-1795.

About one in four youths prescribed stimulants also use the drugs nonmedically

– Of 196 U.S. youth who reported use of at least one prescribed stimulant in their lifetimes, 25% also said they used the drugs nonmedically, based on a survey of children and adolescents aged 10-17 years.

Another 5% of the youth surveyed reported exclusively nonmedical use of stimulants. The survey participants lived in six U.S. cities and their outlying areas.

“Parents of both users and nonusers should warn their children of the dangers of using others’ stimulants and giving their own stimulants to others,” concluded Linda B. Cottler, PhD, MPH, of the University of Florida, and colleagues.

“Physicians and pharmacists should make users and their families aware of the need to take medications as prescribed and not to share medications with others,” they wrote in their research poster at the annual meeting of the College on Problems of Drug Dependence. “Continuous monitoring of these medications in the community should be a priority.”

Though prevalence research has shown increasing stimulant misuse among youth, little data exist for younger children, the researchers noted. They therefore conducted a survey of 1,777 youth aged 10-17 years from September to October 2018 in six cities in California, Texas, and Florida, the most populous U.S. states.

The participants included youth from urban, rural, and suburban areas of Los Angeles, Dallas, Houston, Tampa, Orlando, and Miami. Trained graduate students and professional raters approached the respondents in entertainment venues and obtained assent but did not require parental consent. The respondents received $30 for completing the survey.

A total of 11.1% of respondents reported having used prescription stimulants in their lifetime, and 7.6% had done so in the past 30 days. Just under a third of those who used stimulants (30.1%) did so for nonmedical purposes, defined as taking the stimulant nonorally (except for the patch Daytrana), getting the stimulant from someone else, or taking more of the drug than prescribed.
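
That definition is essentially a three-part rule; a minimal sketch of how a survey response could be classified under it is shown below (the function and argument names are ours for illustration, not the survey instrument’s):

```python
# Hypothetical classifier for the article's definition of nonmedical use;
# argument names are invented, not taken from the actual survey instrument.
def is_nonmedical_use(nonoral_route: bool, daytrana_patch: bool,
                      from_someone_else: bool, more_than_prescribed: bool) -> bool:
    """Nonmedical use: any nonoral route (other than the Daytrana patch),
    getting the stimulant from someone else, or exceeding the prescribed dose."""
    nonoral_counts = nonoral_route and not daytrana_patch
    return nonoral_counts or from_someone_else or more_than_prescribed

# Example: a respondent who snorted a stimulant obtained from a friend.
print(is_nonmedical_use(nonoral_route=True, daytrana_patch=False,
                        from_someone_else=True, more_than_prescribed=False))  # True
```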

A quarter of the respondents who used stimulants reported both medical use and nonmedical use. And 5.1% of these youths reported only using stimulants nonmedically.

Among those with any lifetime stimulant use, 13.8% reported nonoral administration, including 9.7% who snorted or sniffed the drugs, 4.1% who smoked them, and 1.0% who injected them. Just over half (51.8%) of those reporting nonoral use had also used prescription stimulants orally.

The likelihood of using stimulants nonmedically increased with age (P less than .0001). The researchers found no significant associations between nonmedical use and geography or race/ethnicity. Among 10- to 12-year-olds, 3.1% reported only medical use of stimulants, and 0.7% (2 of 286 respondents in this age group) reported any nonmedical use of stimulants.

Of those aged 13-15 years, 2.1% reported any nonmedical stimulant use.

Nonmedical stimulant use was reported by twice as many boys (67.8%) as girls (32.2%), a finding that may not be surprising given that most nonmedical users were also medical users and that stimulants are prescribed more frequently to boys than to girls (P less than .0006).

The research was funded by Arbor Pharmaceuticals. The authors noted no conflicts of interest.

Imaging predicts early postural instability in Parkinson’s disease

– Diffusion-weighted MRI and the presence of at least five of seven clinical features may prove useful for determining which newly diagnosed patients with Parkinson’s disease are likely to have rapidly progressive disease, Frank M. Skidmore, MD, reported at the annual meeting of the American Academy of Neurology.

Patients with gray matter and axonal disease on initial imaging were found to have more aggressive disease associated with early gait dysfunction than were patients with primarily white matter and axonal disease, said Dr. Skidmore, associate professor of neurology at the University of Alabama, Birmingham.

Diffusion-weighted imaging provides a way to assess cellular fluid partitioning and directional information in gray and white matter. Thus, it has the potential to identify brainstem pathology that is associated with disease progression, he said. “Our approach provides a pathway towards using MR to detect early, prognostic, neurodegenerative changes in diseases of the brain.”

Dr. Skidmore and colleagues performed diffusion-weighted imaging on 101 patients with newly diagnosed Parkinson’s disease and 56 healthy controls. They found that Parkinson’s disease was associated with altered radial diffusion in white matter. Changes were observed mainly in the striatonigral tract and the substantia nigra. The investigators also noted atrophy in the cerebellar peduncle among patients with Parkinson’s disease.

At baseline, the patients who went on to have subsequent development of early postural instability and gait dysfunction had decreased intracellular fluid partitioning in the substantia nigra and the mesencephalic locomotor region, which are predominantly gray matter regions. These participants had a lower orientation diffusion index (ODI) and a lower estimate of cellularity, Dr. Skidmore said.

The researchers defined early gait dysfunction as the achievement of a Hoehn and Yahr score of 3 at least once while on medication during the first 5 years after Parkinson’s disease diagnosis. Follow-up was at least 5 years in 79 of the patients.

To identify clinical features associated with early postural instability and gait difficulty, the investigators examined data for 301 patients. In this population, Dr. Skidmore and colleagues identified 218 patients whose Hoehn and Yahr scores never exceeded 2 and 83 patients with at least one Hoehn and Yahr score of 3 or more. Using Bonferroni correction, they examined Unified Parkinson’s Disease Rating Scale (UPDRS) data for all patients to identify significant differences between these two groups. Seven items distinguished patients who developed early postural instability and gait difficulty, including lightheadedness, fatigue, difficulty walking, ability to rise from a chair, and postural problems. The seven-item scale was superior to the full UPDRS at predicting which newly diagnosed patients would develop early postural instability and gait difficulty.
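
The Bonferroni step described here amounts to dividing the usual significance threshold by the number of items compared. The sketch below illustrates the mechanics with invented item names and P values; it does not reproduce the study’s UPDRS analysis:

```python
# Illustrative Bonferroni screen over baseline items; item names and
# P values are invented and do not reproduce the study's UPDRS results.
alpha = 0.05
item_pvalues = {
    "lightheadedness":     0.0004,
    "fatigue":             0.0009,
    "difficulty walking":  0.00002,
    "rising from a chair": 0.0011,
    "postural problems":   0.00007,
    "tremor at rest":      0.12,
    "speech":              0.31,
}

# Bonferroni: divide the familywise alpha by the number of comparisons.
threshold = alpha / len(item_pvalues)
retained = [item for item, p in item_pvalues.items() if p < threshold]
print(f"per-item threshold = {threshold:.4f}; items retained: {retained}")
```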

SOURCE: Skidmore F et al. AAN 2019, Abstract S41.004.

Two trials support shorter DAPT without aspirin after stent

An entire year of dual-antiplatelet therapy may be no better at limiting ischemic events or death than a shorter course for patients who have undergone percutaneous coronary intervention with a drug-eluting stent.

The two trials, which tested dual-antiplatelet therapy (DAPT) regimens of 3 months and 1 month, are also noteworthy for giving a P2Y12 inhibitor after DAPT instead of aspirin monotherapy, which is a more common approach. Each randomized about 3,000 patients.

According to lead author Joo-Yong Hahn, MD, of Sungkyunkwan University in Seoul, South Korea, and colleagues, who conducted the first trial (SMART-CHOICE), both shorter and longer DAPT regimens with aspirin have been associated with shortcomings.

Specifically, shorter duration DAPT with subsequent aspirin monotherapy carries increased risks of MI and stent thrombosis, the investigators wrote. “Conversely, prolonged DAPT increases the risk of bleeding, which offsets the benefit from reducing recurrent ischemic events. Therefore, neither prolonged DAPT nor short-duration DAPT followed by aspirin monotherapy is fully satisfactory.” Because of these shortcomings, the investigators suggested that developing novel strategies “is of paramount importance.”

SMART-CHOICE

The multicenter trial by Dr. Hahn and colleagues, conducted in South Korea, involved 2,993 patients undergoing percutaneous coronary intervention with drug-eluting stents. Patients were randomized to receive either standard DAPT with aspirin and a P2Y12 inhibitor for 12 months, or aspirin plus a P2Y12 inhibitor for 3 months followed by 9 months of P2Y12 monotherapy. Patients were stratified by enrolling center, clinical presentation, type of stent, and type of P2Y12 therapy. Stents were limited to those eluting cobalt-chromium everolimus (Xience Prime, Xience Expedition, or Xience Alpine; Abbott Vascular), platinum-chromium everolimus (Promus Element, Promus Premier, or SYNERGY; Boston Scientific), or sirolimus (Orsiro; Biotronik). Acceptable P2Y12 therapies were clopidogrel, ticagrelor, and prasugrel.

The primary endpoint was a composite of major adverse cerebrovascular and cardiac events, including stroke, MI, or all-cause death, at 12 months after percutaneous coronary intervention. A number of secondary endpoints were also evaluated, such as bleeding rate, stent thrombosis, and the individual components of the primary endpoint.

Almost all patients (95%) in the DAPT group adhered to the study protocol, while a smaller proportion (79%) followed P2Y12 monotherapy as described. Still, for both groups, more than 97% of patients completed 1-year follow-up. Primary endpoint analysis showed that the cumulative rate of major adverse cerebrovascular and cardiac events was similar between both groups, at 2.9% in the P2Y12 group versus 2.5% in the DAPT group, which was statistically significant for noninferiority (P = .007). Per-protocol analysis supported this finding.
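
For readers unfamiliar with how such a noninferiority P value is derived, the sketch below runs a one-sided test of the risk difference against a prespecified margin. The margin (1.8 percentage points), per-arm sample sizes, and event counts are approximations assumed for illustration; they are not the trial’s reported analysis:

```python
# Illustrative one-sided noninferiority test on a risk difference.
# The margin, per-arm sizes, and event counts are assumptions for
# illustration; they are not the trial's reported analysis.
import math

def noninferiority_p(events_new, n_new, events_std, n_std, margin):
    """One-sided P for H0: p_new - p_std >= margin (new regimen inferior)."""
    p_new, p_std = events_new / n_new, events_std / n_std
    se = math.sqrt(p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std)
    z = (p_new - p_std - margin) / se
    return 0.5 * math.erfc(-z / math.sqrt(2))   # standard normal CDF at z

# ~1,495 patients per arm, event rates near 2.9% vs 2.5%, assumed 1.8-point margin.
p = noninferiority_p(events_new=43, n_new=1495, events_std=37, n_std=1495,
                     margin=0.018)
print(f"one-sided noninferiority P = {p:.3f}")   # well below .05 in this sketch
```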

Similarly, the components of the primary endpoint – stroke, MI, or all-cause death – did not vary significantly between groups. No significant difference was detected for the risk of stent thrombosis. Although the major bleeding rate was comparable between groups, the overall bleeding rate was significantly lower in the P2Y12 inhibitor group than the DAPT group (2.0% vs. 3.4%; P = .02); this finding also was supported by per-protocol analysis (1.8% vs. 3.1%; P = .04).

The investigators proposed several explanations for the results. “First, aspirin might provide little additional inhibition of platelet aggregation in the presence of a P2Y12 inhibitor. … Second, the risk of bleeding was significantly lower with P2Y12 inhibitor monotherapy than with DAPT in the present study.”

They noted that second-generation drug-eluting stents were used, which have been shown to significantly reduce MI and stent thrombosis, compared with first-generation products.

 

 

STOPDAPT-2

This study, led by Hirotoshi Watanabe, MD, of Kyoto University, and colleagues, followed a similar design, but with an even shorter duration of DAPT in the treatment arm, at 1 month, and stricter criteria for the stent, which was limited to one cobalt-chromium everolimus-eluting model (Xience Series; Abbott Vascular). During the first month of the trial, all patients received aspirin plus either clopidogrel or prasugrel; thereafter, patients in the 12-month group received aspirin and clopidogrel while the 1-month group was given clopidogrel alone.

The primary endpoint was a composite of cardiovascular and bleeding events, including MI, stent thrombosis, cardiovascular death, stroke, and major or minor bleeding. Secondary endpoints included these components individually, as well as a list of other cardiovascular and bleeding measures.

As in the first trial, Dr. Watanabe and colleagues found that the shorter DAPT protocol was noninferior to standard DAPT and was associated with a lower rate of bleeding events. The primary endpoint occurred in 2.4% of the 1-month DAPT group, compared with 3.7% of the 12-month DAPT group, thereby meeting noninferiority criteria (P less than .001). This finding was confirmed in the per-protocol population. The 1-month DAPT regimen was significantly associated with fewer major bleeding events than the 12-month protocol (0.41% vs. 1.54%), demonstrating superiority (P = .004). In addition, seven other measures of bleeding frequency were lower in the 1-month DAPT group than the standard DAPT group, including Bleeding Academic Research Consortium type 3 or 5 criteria, and Global Use of Strategies to Open Occluded Arteries moderate or severe criteria.

Dr. Watanabe and colleagues provided some insight into these findings and described clinical implications. “The benefit [of the 1-month DAPT regimen] was driven by a significant reduction of bleeding events without an increase in cardiovascular events,” they wrote. “Therefore, the very short DAPT duration of 1 month would be a potential option even in patients without high bleeding risk. Given the very low rates of stent thrombosis in studies using contemporary drug-eluting stents, avoiding bleeding with de-escalation of antiplatelet therapy may be more important than attempting further reduction of stent thrombosis with intensive antiplatelet therapy.”

SMART-CHOICE was funded by the Korean Society of Interventional Cardiology, Biotronik, Abbott Vascular, and Boston Scientific. Dr. Hahn and colleagues reported additional financial relationships with AstraZeneca, Daiichi Sankyo, Sanofi-Aventis, and others. STOPDAPT-2 was funded by Abbott Vascular. Dr. Watanabe and colleagues reported receiving additional funding from Daiichi Sankyo, Otsuka Pharmaceutical, Kowa Pharmaceuticals, and others.

SOURCES: Watanabe H et al. JAMA. 2019 Jun 25. doi: 10.1001/jama.2019.8145; Hahn J-Y et al. JAMA. 2019 Jun 25. doi: 10.1001/jama.2019.8146.

Too soon to abandon aspirin

These two studies evaluated shorter-duration dual-antiplatelet therapy (DAPT) with a novel variation: Instead of discontinuing the P2Y12 inhibitor, which is a more common approach, the regimens discontinued aspirin. Although the studies had slightly different 1-year endpoints, both found that shorter DAPT with continued P2Y12 monotherapy reduced bleeding complications without increasing risk of ischemic events or death.

Based on these findings, and those from other trials, shorter DAPT will likely gain support, particularly when used with atherosclerosis risk factor reduction, newer implantation techniques, and contemporary stents. However, clinicians considering shorter DAPT for their patients should do so in light of baseline ischemic complication risk and clinical presentation. Furthermore, it remains unclear whether P2Y12 or aspirin monotherapy should be given after shorter DAPT. Until more evidence is available, it is too soon to abandon aspirin monotherapy or traditional DAPT.

Khaled M. Ziada, MD, and David J. Moliterno, MD, are with the department of cardiovascular medicine at the University of Kentucky, Lexington. Dr. Moliterno has received grants from AstraZeneca. No other disclosures were reported. Their remarks are adapted from an accompanying editorial (JAMA. 2019 June 25. doi: 10.1001/jama.2019.7025).

Publications
Topics
Sections
Body

 

These two studies evaluated shorter-duration dual-antiplatelet therapy (DAPT) with a novel variation: Instead of discontinuing the P2Y12 inhibitor, which is a more common approach, the regimens discontinued aspirin. Although the studies had slightly different 1-year endpoints, both found that shorter DAPT with continued P2Y12 monotherapy reduced bleeding complications without increasing risk of ischemic events or death.

Based on these findings, and those from other trials, shorter DAPT will likely gain support, particularly when used with atherosclerosis risk factor reduction, newer implantation techniques, and contemporary stents. However, clinicians considering shorter DAPT for their patients should do so in light of baseline ischemic complication risk and clinical presentation. Furthermore, it remains unclear whether P2Y12 or aspirin monotherapy should be given after shorter DAPT. Until more evidence is available, it is too soon to abandon aspirin monotherapy or traditional DAPT.

Khaled M. Ziada, MD, and David J. Moliterno, MD, are with the department of cardiovascular medicine at the University of Kentucky, Lexington. Dr. Moliterno has received grants from AstraZeneca. No other disclosures were reported. Their remarks are adapted from an accompanying editorial (JAMA. 2019 June 25. doi: 10.1001/jama.2019.7025).

Body

 

These two studies evaluated shorter-duration dual-antiplatelet therapy (DAPT) with a novel variation: Instead of discontinuing the P2Y12 inhibitor, which is a more common approach, the regimens discontinued aspirin. Although the studies had slightly different 1-year endpoints, both found that shorter DAPT with continued P2Y12 monotherapy reduced bleeding complications without increasing risk of ischemic events or death.

Based on these findings, and those from other trials, shorter DAPT will likely gain support, particularly when used with atherosclerosis risk factor reduction, newer implantation techniques, and contemporary stents. However, clinicians considering shorter DAPT for their patients should do so in light of baseline ischemic complication risk and clinical presentation. Furthermore, it remains unclear whether P2Y12 or aspirin monotherapy should be given after shorter DAPT. Until more evidence is available, it is too soon to abandon aspirin monotherapy or traditional DAPT.

Khaled M. Ziada, MD, and David J. Moliterno, MD, are with the department of cardiovascular medicine at the University of Kentucky, Lexington. Dr. Moliterno has received grants from AstraZeneca. No other disclosures were reported. Their remarks are adapted from an accompanying editorial (JAMA. 2019 June 25. doi: 10.1001/jama.2019.7025).

Title
Too soon to abandon aspirin
Too soon to abandon aspirin

 

An entire year of dual-antiplatelet therapy may be no better at limiting ischemic events or death than a shorter course for patients who have undergone percutaneous coronary intervention with a drug-eluting stent.

The two trials, which tested dual-antiplatelet therapy (DAPT) regimens of 3 months and 1 month, are also noteworthy for giving a P2Y12 inhibitor after DAPT instead of aspirin monotherapy, which is a more common approach. Each randomized about 3,000 patients.

According to lead author Joo-Yong Hahn, MD, of Sungkyunkwan University in Seoul, South Korea, and colleagues, who conducted the first trial (SMART-CHOICE), both shorter and longer DAPT regimens with aspirin have been associated with shortcomings.

Specifically, shorter duration DAPT with subsequent aspirin monotherapy carries increased risks of MI and stent thrombosis, the investigators wrote. “Conversely, prolonged DAPT increases the risk of bleeding, which offsets the benefit from reducing recurrent ischemic events. Therefore, neither prolonged DAPT nor short-duration DAPT followed by aspirin monotherapy is fully satisfactory.” Because of these shortcomings, the investigators suggested that developing novel strategies “is of paramount importance.”

SMART-CHOICE

The multicenter trial by Dr. Hahn and colleagues, conducted in South Korea, involved 2,993 patients undergoing percutaneous coronary intervention with drug-eluting stents. Patients were randomized to receive either standard DAPT with aspirin and a P2Y12 inhibitor for 12 months, or aspirin plus a P2Y12 inhibitor for 3 months followed by 9 months of P2Y12 monotherapy. Patients were stratified by enrolling center, clinical presentation, type of stent, and type of P2Y12 therapy. Stents were limited to those eluting cobalt-chromium everolimus (Xience Prime, Xience Expedition, or Xience Alpine; Abbott Vascular), platinum-chromium everolimus (Promus Element, Promus Premier, or SYNERGY; Boston Scientific), or sirolimus (Orsiro; Biotronik). Acceptable P2Y12 therapies were clopidogrel, ticagrelor, and prasugrel. The primary endpoint was a composite of major adverse cerebrovascular and cardiac events, including stroke, MI, or all-cause death, at 12 months after percutaneous coronary intervention. A number of secondary endpoints were also evaluated, such as bleeding rate, stent thrombosis, and the individual components of the primary endpoint.

Almost all patients (95%) in the DAPT group adhered to the study protocol, while a smaller proportion (79%) followed P2Y12 monotherapy as described. Still, for both groups, more than 97% of patients completed 1-year follow-up. Primary endpoint analysis showed that the cumulative rate of major adverse cerebrovascular and cardiac events was similar between both groups, at 2.9% in the P2Y12 group versus 2.5% in the DAPT group, which was statistically significant for noninferiority (P = .007). Per-protocol analysis supported this finding.

Similarly, the components of the primary endpoint – stroke, MI, or all-cause death – did not vary significantly between groups. No significant difference was detected for the risk of stent thrombosis. Although the major bleeding rate was comparable between groups, the overall bleeding rate was significantly lower in the P2Y12 inhibitor group than the DAPT group (2.0% vs. 3.4%; P = .02); this finding also was supported by per-protocol analysis (1.8% vs. 3.1%; P = .04).

The investigators proposed several explanations for the results. “First, aspirin might provide little additional inhibition of platelet aggregation in the presence of a P2Y12 inhibitor. … Second, the risk of bleeding was significantly lower with P2Y12 inhibitor monotherapy than with DAPT in the present study.”

They noted that second-generation drug-eluting stents were used, which have been shown to significantly reduce MI and stent thrombosis, compared with first-generation products.

 

 

STOPDAPT-2

This study, led by Hirotoshi Watanabe, MD, of Kyoto University, and colleagues, followed a similar design, but with an even shorter duration of DAPT in the treatment arm, at 1 month, and stricter criteria for the stent, which was limited to one cobalt-chromium everolimus-eluting model (Xience Series; Abbott Vascular). During the first month of the trial, all patients received aspirin plus either clopidogrel or prasugrel; thereafter, patients in the 12-month group received aspirin and clopidogrel while the 1-month group was given clopidogrel alone.

The primary endpoint was a composite of cardiovascular and bleeding events, including MI, stent thrombosis, cardiovascular death, stroke, and major or minor bleeding. Secondary endpoints included these components individually, as well as a list of other cardiovascular and bleeding measures.

Similarly to the first trial, Dr. Watanabe and colleagues found that the shorter DAPT protocol was noninferior to standard DAPT and associated with a lower rate of bleeding events. The primary endpoint occurred in 2.4% of the 1-month DAPT group, compared with 3.7% of the 12-month DAPT group, thereby meeting noninferiority criteria (P less than .001). This finding was confirmed in the per-protocol population. The 1-month DAPT regimen was significantly associated with fewer major bleeding events than the 12-month protocol (0.41% vs. 1.54%), demonstrating superiority (P = .004). In addition, seven other measures of bleeding frequency were lower in the 1-month DAPT group than the standard DAPT group, including Bleeding Academic Research Consortium type 3 or 5 criteria, and Global Use of Strategies to Open Occluded Arteries moderate or severe criteria.

Dr. Watanabe and colleagues provided some insight into these findings and described clinical implications. “The benefit [of the 1-month DAPT regimen] was driven by a significant reduction of bleeding events without an increase in cardiovascular events,” they wrote. “Therefore, the very short DAPT duration of 1 month would be a potential option even in patients without high bleeding risk. Given the very low rates of stent thrombosis in studies using contemporary drug-eluting stents, avoiding bleeding with de-escalation of antiplatelet therapy may be more important than attempting further reduction of stent thrombosis with intensive antiplatelet therapy.”

SMART-CHOICE was funded by the Korean Society of Interventional Cardiology, Biotronik, Abbott Vascular, and Boston Scientific. Dr. Hahn and colleagues reported receiving additional financial relationships with AstraZeneca, Daiichi Sankyo, Sanofi-Aventis, and others. STOPDAPT-2 was funded by Abbott Vascular. Dr. Watanabe and colleagues reported receiving additional funding from Daiichi Sankyo, Otsuka Pharmaceutical, Kowa Pharmaceuticals, and others.

SOURCES: Watanabe H et al. JAMA. 2019 Jun 25. doi: 10.1001/jama.2019.8145; Hahn J-Y et al. JAMA. 2019 Jun 25. doi: 10.1001/jama.2019.8146.

 

An entire year of dual-antiplatelet therapy may be no better at limiting ischemic events or death than a shorter course for patients who have undergone percutaneous coronary intervention with a drug-eluting stent.

The two trials, which tested dual-antiplatelet therapy (DAPT) regimens of 3 months and 1 month, are also noteworthy for giving a P2Y12 inhibitor after DAPT instead of aspirin monotherapy, which is a more common approach. Each randomized about 3,000 patients.

According to lead author Joo-Yong Hahn, MD, of Sungkyunkwan University in Seoul, South Korea, and colleagues, who conducted the first trial (SMART-CHOICE), both shorter and longer DAPT regimens with aspirin have been associated with shortcomings.

Specifically, shorter duration DAPT with subsequent aspirin monotherapy carries increased risks of MI and stent thrombosis, the investigators wrote. “Conversely, prolonged DAPT increases the risk of bleeding, which offsets the benefit from reducing recurrent ischemic events. Therefore, neither prolonged DAPT nor short-duration DAPT followed by aspirin monotherapy is fully satisfactory.” Because of these shortcomings, the investigators suggested that developing novel strategies “is of paramount importance.”

SMART-CHOICE

The multicenter trial by Dr. Hahn and colleagues, conducted in South Korea, involved 2,993 patients undergoing percutaneous coronary intervention with drug-eluting stents. Patients were randomized to receive either standard DAPT with aspirin and a P2Y12 inhibitor for 12 months, or aspirin plus a P2Y12 inhibitor for 3 months followed by 9 months of P2Y12 monotherapy. Patients were stratified by enrolling center, clinical presentation, type of stent, and type of P2Y12 therapy. Stents were limited to those eluting cobalt-chromium everolimus (Xience Prime, Xience Expedition, or Xience Alpine; Abbott Vascular), platinum-chromium everolimus (Promus Element, Promus Premier, or SYNERGY; Boston Scientific), or sirolimus (Orsiro; Biotronik). Acceptable P2Y12 therapies were clopidogrel, ticagrelor, and prasugrel. The primary endpoint was a composite of major adverse cerebrovascular and cardiac events, including stroke, MI, or all-cause death, at 12 months after percutaneous coronary intervention. A number of secondary endpoints were also evaluated, such as bleeding rate, stent thrombosis, and the individual components of the primary endpoint.

Almost all patients (95%) in the DAPT group adhered to the study protocol, while a smaller proportion (79%) followed P2Y12 monotherapy as described. Still, for both groups, more than 97% of patients completed 1-year follow-up. Primary endpoint analysis showed that the cumulative rate of major adverse cerebrovascular and cardiac events was similar between both groups, at 2.9% in the P2Y12 group versus 2.5% in the DAPT group, which was statistically significant for noninferiority (P = .007). Per-protocol analysis supported this finding.

Similarly, the components of the primary endpoint – stroke, MI, or all-cause death – did not vary significantly between groups. No significant difference was detected for the risk of stent thrombosis. Although the major bleeding rate was comparable between groups, the overall bleeding rate was significantly lower in the P2Y12 inhibitor group than the DAPT group (2.0% vs. 3.4%; P = .02); this finding also was supported by per-protocol analysis (1.8% vs. 3.1%; P = .04).

The investigators proposed several explanations for the results. “First, aspirin might provide little additional inhibition of platelet aggregation in the presence of a P2Y12 inhibitor. … Second, the risk of bleeding was significantly lower with P2Y12 inhibitor monotherapy than with DAPT in the present study.”

They noted that second-generation drug-eluting stents were used, which have been shown to significantly reduce MI and stent thrombosis, compared with first-generation products.

 

 

STOPDAPT-2

This study, led by Hirotoshi Watanabe, MD, of Kyoto University, and colleagues, followed a similar design, but with an even shorter duration of DAPT in the treatment arm, at 1 month, and stricter criteria for the stent, which was limited to one cobalt-chromium everolimus-eluting model (Xience Series; Abbott Vascular). During the first month of the trial, all patients received aspirin plus either clopidogrel or prasugrel; thereafter, patients in the 12-month group received aspirin and clopidogrel while the 1-month group was given clopidogrel alone.

The primary endpoint was a composite of cardiovascular and bleeding events, including MI, stent thrombosis, cardiovascular death, stroke, and major or minor bleeding. Secondary endpoints included these components individually, as well as a list of other cardiovascular and bleeding measures.

Similarly to the first trial, Dr. Watanabe and colleagues found that the shorter DAPT protocol was noninferior to standard DAPT and associated with a lower rate of bleeding events. The primary endpoint occurred in 2.4% of the 1-month DAPT group, compared with 3.7% of the 12-month DAPT group, thereby meeting noninferiority criteria (P less than .001). This finding was confirmed in the per-protocol population. The 1-month DAPT regimen was significantly associated with fewer major bleeding events than the 12-month protocol (0.41% vs. 1.54%), demonstrating superiority (P = .004). In addition, seven other measures of bleeding frequency were lower in the 1-month DAPT group than the standard DAPT group, including Bleeding Academic Research Consortium type 3 or 5 criteria, and Global Use of Strategies to Open Occluded Arteries moderate or severe criteria.

Dr. Watanabe and colleagues provided some insight into these findings and described clinical implications. “The benefit [of the 1-month DAPT regimen] was driven by a significant reduction of bleeding events without an increase in cardiovascular events,” they wrote. “Therefore, the very short DAPT duration of 1 month would be a potential option even in patients without high bleeding risk. Given the very low rates of stent thrombosis in studies using contemporary drug-eluting stents, avoiding bleeding with de-escalation of antiplatelet therapy may be more important than attempting further reduction of stent thrombosis with intensive antiplatelet therapy.”

SMART-CHOICE was funded by the Korean Society of Interventional Cardiology, Biotronik, Abbott Vascular, and Boston Scientific. Dr. Hahn and colleagues reported receiving additional financial relationships with AstraZeneca, Daiichi Sankyo, Sanofi-Aventis, and others. STOPDAPT-2 was funded by Abbott Vascular. Dr. Watanabe and colleagues reported receiving additional funding from Daiichi Sankyo, Otsuka Pharmaceutical, Kowa Pharmaceuticals, and others.

SOURCES: Watanabe H et al. JAMA. 2019 Jun 25. doi: 10.1001/jama.2019.8145; Hahn J-Y et al. JAMA. 2019 Jun 25. doi: 10.1001/jama.2019.8146.

Article Source

FROM JAMA


What drives intensification of antihypertensive therapy at discharge?


Background: Transient elevations in blood pressure are common among adult patients, yet there are no data or guidelines that support long-term medication changes based on these readings. Tight control of blood pressure is likely to improve outcomes among patients with heart failure, myocardial infarction, and stroke. Patients with reduced life expectancy, dementia, or metastatic cancer are less likely to benefit from tight control.

Dr. Horatio (Teddy) Holzer, division of hospital medicine, Mount Sinai Hospital, New York


Study design: Retrospective cohort study.

Setting: U.S. Veterans Administration (VA) Health System.

Synopsis: The investigators reviewed data from 14,915 adults over 65 (median age, 76 years) admitted to the VA with a diagnosis of pneumonia, urinary tract infection, or venous thromboembolism. Most patients (65%) had well-controlled blood pressure prior to admission.

A total of 2,074 (14%) patients were discharged with an intensified hypertension regimen (additional medication or higher dose). While both elevated inpatient and outpatient blood pressures were predictive of intensification, the association with elevated inpatient blood pressure was much stronger (odds ratio, 3.66; 95% confidence interval, 3.29-4.08) than it was with elevated outpatient blood pressure (OR, 1.75; 95% CI, 1.58-1.93).
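
To make these effect sizes concrete: an odds ratio and its 95% confidence interval from a logistic regression are simple transformations of the model coefficient and its standard error. The short sketch below back-calculates those quantities from the published OR of 3.66 (95% CI, 3.29-4.08); the coefficient and standard error are inferred purely for illustration and are not taken from the study's actual model output.

```python
# Illustrative arithmetic only: how a reported odds ratio and 95% CI relate
# to the underlying logistic regression coefficient. The coefficient and
# standard error are back-calculated from the published OR of 3.66
# (95% CI, 3.29-4.08), not drawn from the study's model.
from math import exp, log

beta = log(3.66)                            # log-odds coefficient implied by OR = 3.66
se = (log(4.08) - log(3.29)) / (2 * 1.96)   # SE implied by the reported 95% CI

odds_ratio = exp(beta)
ci_low, ci_high = exp(beta - 1.96 * se), exp(beta + 1.96 * se)
print(f"OR {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```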

In a multivariate regression analysis, the investigators found no significant differences in intensification by life expectancy (P = .07), dementia diagnosis (P = .95), or metastatic malignancy (P = .13). There was a small increase in the probability of intensification among patients with heart failure, but no such difference for patients with a history of MI (P = .53), stroke (P = .37), or renal disease (P = .73).

The generalizability of this study may be limited given that the cohort was predominantly male (97%) and white (77%), and that 53% of patients had at least four major comorbidities.

Bottom line: Intensification of antihypertensive therapy at discharge is often driven by inpatient blood pressure readings rather than by the broader context of the patient's disease, such as prior long-term outpatient blood pressure control or major comorbidities.

Citation: Anderson TS et al. Intensification of older adults’ outpatient blood pressure treatment at hospital discharge: A national retrospective cohort study. BMJ. 2018;362:k3503.

Dr. Holzer is an assistant professor of medicine in the division of hospital medicine at Mount Sinai Hospital, New York.
