Breastfeeding protects against intussusception


LJUBLJANA, SLOVENIA – The first dose of rotavirus vaccine emerged as the strongest independent risk factor for intussusception in infancy, but breastfeeding had a protective effect in a German case-control study.


Two other potent risk factors for intussusception in children less than 1 year old were identified: a family history of intussusception and an episode of gastroenteritis, Doris F. Oberle, MD, PhD, reported at the annual meeting of the European Society for Paediatric Infectious Diseases.

Dr. Oberle, of the Paul Ehrlich Institute in Langen, Germany, presented a retrospective study of 116 meticulously validated cases of intussusception in infancy treated at 19 German pediatric centers during 2010-2014 and 272 controls matched by birth month, sex, and location. A standardized interview was conducted with the parents of all study participants.

Rotavirus vaccine was added to the German national vaccination schedule in 2013. In a multivariate logistic regression analysis, the risk of intussusception was increased 5.4-fold following the first dose of the vaccine, compared with nonrecipients. However, subsequent doses of rotavirus vaccine were not associated with any excess risk.

In addition, a family history of intussusception was linked to a 4.2-fold increased risk, while an episode of gastroenteritis during the first year of life was associated with a 4.7-fold elevated risk.

In a novel finding, breastfeeding was independently associated with a 44% reduction in the risk of intussusception compared with bottle-feeding.

The most common presenting signs and symptoms of intussusception were vomiting, abdominal pain, hematochezia, pallor, and reduced appetite, each present in at least half of affected infants.

Dr. Oberle reported having no financial conflicts regarding her study, supported by the Paul Ehrlich Institute.

REPORTING FROM ESPID 2019


Chronic Myeloid Leukemia: Selecting First-line TKI Therapy


From the Moffitt Cancer Center, Tampa, FL.

Abstract

  • Objective: To outline the approach to selecting a tyrosine kinase inhibitor (TKI) for initial treatment of chronic myeloid leukemia (CML) and monitoring patients following initiation of therapy.
  • Methods: Review of the literature and evidence-based guidelines.
  • Results: The development and availability of TKIs has improved survival for patients diagnosed with CML. The life expectancy of patients diagnosed with chronic-phase CML (CP-CML) is similar to that of the general population, provided they receive appropriate TKI therapy and adhere to treatment. Selection of the most appropriate first-line TKI for newly diagnosed CP-CML requires incorporation of the patient’s baseline karyotype and Sokal or EURO risk score, and a clear understanding of the patient’s comorbidities. The adverse effect profile of all TKIs must be considered in conjunction with the patient’s ongoing medical issues to decrease the likelihood of worsening their current symptoms or causing a severe complication from TKI therapy. After confirming a diagnosis of CML and selecting the most appropriate TKI for first-line therapy, close monitoring and follow-up are necessary to ensure patients are meeting the desired treatment milestones. Responses in CML can be assessed based on hematologic parameters, cytogenetic results, and molecular responses.
  • Conclusion: Given the successful treatments available for patients with CML, it is crucial to identify patients with this diagnosis; ensure they receive a complete, appropriate diagnostic workup including a bone marrow biopsy and aspiration with cytogenetic testing; and select the best therapy for each individual patient.

Keywords: chronic myeloid leukemia; CML; tyrosine kinase inhibitor; TKI; cancer; BCR-ABL protein.

Chronic myeloid leukemia (CML) is a rare myeloproliferative neoplasm that is characterized by the presence of the Philadelphia (Ph) chromosome and uninhibited expansion of bone marrow stem cells. The Ph chromosome arises from a reciprocal translocation between the Abelson (ABL) region on chromosome 9 and the breakpoint cluster region (BCR) of chromosome 22 (t(9;22)(q34;q11.2)), resulting in the BCR-ABL1 fusion gene and its protein product, BCR-ABL tyrosine kinase.1 BCR-ABL has constitutive tyrosine kinase activity that promotes growth, replication, and survival of hematopoietic cells through downstream pathways, which is the driving factor in the pathogenesis of CML.1

CML is divided into 3 phases based on the percentage of myeloblasts observed in the blood or bone marrow: chronic, accelerated, and blast. Most cases of CML are diagnosed in the chronic phase (CP), which is marked by proliferation primarily of the myeloid element.

Typical treatment for CML involves lifelong use of oral BCR-ABL tyrosine kinase inhibitors (TKIs). Currently, 5 TKIs have regulatory approval for treatment of this disease. The advent of TKIs, a class of small molecules targeting the tyrosine kinases, particularly the BCR-ABL tyrosine kinase, led to rapid changes in the management of CML and improved survival for patients. Patients diagnosed with chronic-phase CML (CP-CML) now have a life expectancy that is similar to that of the general population, as long as they receive appropriate TKI therapy and adhere to treatment. As such, it is crucial to identify patients with CML; ensure they receive a complete, appropriate diagnostic workup; and select the best therapy for each patient.

Epidemiology

According to SEER estimates, 8430 new cases of CML were diagnosed in the United States in 2018. CML is a disease of older adults, with a median age of 65 years at diagnosis, and there is a slight male predominance. Between 2011 and 2015, the incidence of CML was 1.8 new cases per 100,000 persons per year. The median overall survival (OS) in patients with newly diagnosed CP-CML has not been reached.2 Given the effective treatments available for managing CML, it is estimated that the prevalence of CML in the United States will plateau at 180,000 patients by 2050.3

Diagnosis

Clinical Features

The diagnosis of CML is often suspected based on an incidental finding of leukocytosis and, in some cases, thrombocytosis on routine blood work; however, approximately 50% of patients will present with constitutional symptoms associated with the disease. Characteristic features of the white blood cell differential include left-shifted maturation with neutrophilia and immature circulating myeloid cells. Basophilia and eosinophilia are often present as well. Splenomegaly is a common sign, present in 50% to 90% of patients at diagnosis. In those patients with symptoms related to CML at diagnosis, the most common presentation includes increasing fatigue, fevers, night sweats, early satiety, and weight loss. The diagnosis is confirmed by cytogenetic studies showing the Ph chromosome abnormality, t(9;22)(q34;q11.2), and/or reverse transcriptase polymerase chain reaction (PCR) showing BCR-ABL1 transcripts.

Testing

Bone marrow biopsy. There are 3 distinct phases of CML: CP, accelerated phase (AP), and blast phase (BP). Bone marrow biopsy and aspiration at diagnosis are mandatory in order to determine the phase of the disease at diagnosis. This distinction is based on the percentage of blasts, promyelocytes, and basophils present as well as the platelet count and presence or absence of extramedullary disease.4 The vast majority of patients at diagnosis have CML that is in the chronic phase. The typical appearance in CP-CML is a hypercellular marrow with granulocytic and occasionally megakaryocytic hyperplasia. In many cases, basophilia and/or eosinophilia are noted as well. Dysplasia is not a typical finding in CML.5 Bone marrow fibrosis can be seen in up to one-third of patients at diagnosis, and may indicate a slightly worse prognosis.6 Although a diagnosis of CML can be made without a bone marrow biopsy, complete staging and prognostication are only possible with information gained from this test, including baseline karyotype and confirmation of CP versus a more advanced phase of CML.

Diagnostic criteria. The criteria for diagnosing AP-CML have not been agreed upon by the various classification groups, but the modified MD Anderson Cancer Center (MDACC) criteria are used in the majority of clinical trials evaluating the efficacy of TKIs in preventing progression to advanced phases of CML. The MDACC criteria define AP-CML as the presence of any 1 of the following: 15% to 29% blasts in the peripheral blood or bone marrow; ≥ 30% peripheral blasts plus promyelocytes; ≥ 20% basophils in the blood or bone marrow; platelet count ≤ 100,000/μL unrelated to therapy; or clonal cytogenetic evolution in Ph-positive metaphases (Table).7

Table. Diagnostic Criteria for Chronic Myeloid Leukemia

  • AP-CML (modified MDACC criteria), any 1 of: 15% to 29% blasts in the peripheral blood or bone marrow; ≥ 30% peripheral blasts plus promyelocytes; ≥ 20% basophils in the blood or bone marrow; platelet count ≤ 100,000/μL unrelated to therapy; clonal cytogenetic evolution in Ph-positive metaphases.
  • BP-CML (IBMTR criteria): ≥ 30% blasts in the peripheral blood and/or bone marrow, or the presence of extramedullary disease.
  • BP-CML (revised WHO criteria): ≥ 20% blasts in the peripheral blood or bone marrow, extramedullary blast proliferation, or large foci or clusters of blasts in the bone marrow biopsy sample.


BP-CML is typically defined using the criteria developed by the International Bone Marrow Transplant Registry (IBMTR): ≥ 30% blasts in the peripheral blood and/or the bone marrow or the presence of extramedullary disease.8 Although not typically used in clinical trials, the revised World Health Organization (WHO) criteria for BP-CML include ≥ 20% blasts in the peripheral blood or bone marrow, extramedullary blast proliferation, and large foci or clusters of blasts in the bone marrow biopsy sample (Table).9
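
Because several overlapping definitions are in play, it can help to see them written as a single classification rule. The Python sketch below encodes the MDACC accelerated-phase criteria and the IBMTR and revised WHO blast-phase cutoffs exactly as quoted above; the function and parameter names are hypothetical, and this is a teaching illustration of the published thresholds, not a validated clinical tool.

```python
# Illustrative encoding of the phase definitions quoted above (MDACC for
# AP-CML; IBMTR and revised WHO for BP-CML). Names are hypothetical and
# the sketch is for teaching purposes only, not clinical use.

def classify_cml_phase(blasts_pct, promyelocytes_pct, basophils_pct,
                       platelets_per_ul, clonal_evolution=False,
                       extramedullary_disease=False, bp_definition="IBMTR"):
    """Return 'BP', 'AP', or 'CP' from blood/marrow findings."""
    # Blast phase: IBMTR uses >= 30% blasts or extramedullary disease;
    # the revised WHO definition lowers the blast cutoff to >= 20%.
    bp_blast_cutoff = 30 if bp_definition == "IBMTR" else 20
    if blasts_pct >= bp_blast_cutoff or extramedullary_disease:
        return "BP"

    # Accelerated phase (modified MDACC): any single feature suffices.
    if (15 <= blasts_pct < 30
            or blasts_pct + promyelocytes_pct >= 30
            or basophils_pct >= 20
            or platelets_per_ul <= 100_000  # only if unrelated to therapy
            or clonal_evolution):           # ACA in Ph-positive metaphases
        return "AP"

    return "CP"

# Example: 18% blasts with an otherwise unremarkable picture meets the
# MDACC definition of AP-CML.
print(classify_cml_phase(18, 2, 3, 250_000))  # prints "AP"
```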

The defining feature of CML is the presence of the Ph chromosome abnormality. In a small subset of patients, additional chromosome abnormalities (ACA) in the Ph-positive cells may be identified at diagnosis. Some reports indicate that the presence of “major route” ACA (trisomy 8, isochromosome 17q, a second Ph chromosome, or trisomy 19) at diagnosis may negatively impact prognosis, but other reports contradict these findings.10,11

PCR assay. The typical BCR breakpoint in CML is the major breakpoint cluster region (M-BCR), which results in a 210-kDa protein (p210). Alternate breakpoints that are less frequently identified are the minor BCR (mBCR or p190), which is more commonly found in Ph-positive acute lymphoblastic leukemia (ALL), and the micro BCR (µBCR or p230), which is much less common and is often characterized by chronic neutrophilia.12 Identifying which BCR-ABL1 transcript is present in each patient using qualitative PCR is crucial in order to ensure proper monitoring during treatment.

The most sensitive method for detecting BCR-ABL1 mRNA transcripts is the quantitative real-time PCR (RQ-PCR) assay, which is typically done on peripheral blood. RQ-PCR is capable of detecting a single CML cell in the presence of ≥ 100,000 normal cells. This test should be done during the initial diagnostic workup in order to confirm the presence of BCR-ABL1 transcripts, and it is used as a standard method for monitoring response to TKI therapy.13 The International Scale (IS) is a standardized approach to reporting RQ-PCR results that was developed to allow comparison of results across various laboratories and has become the gold standard for reporting BCR-ABL1 transcript values.14

Determining Risk Scores

Calculating a patient’s Sokal score or EURO risk score at diagnosis remains an important component of the diagnostic workup in CP-CML, as this information has prognostic and therapeutic implications (an online calculator is available through European LeukemiaNet [ELN]). The risk for disease progression to the accelerated or blast phases is higher in patients with intermediate or high risk scores compared to those with a low risk score at diagnosis. The risk of progression in intermediate- or high-risk patients is lower when a second-generation TKI (dasatinib, nilotinib, or bosutinib) is used as frontline therapy compared to imatinib, and therefore, the National Comprehensive Cancer Network (NCCN) CML Panel recommends starting with a second-generation TKI in these patients.15-19
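
The Sokal score referenced above is a simple formula over age, spleen size, platelet count, and peripheral blast percentage. A minimal sketch follows, using the commonly published 1984 coefficients; the EURO (Hasford) score is computed analogously from additional variables, and the ELN online calculator remains the reference tool, so treat this strictly as an illustration.

```python
import math

def sokal_score(age_years, spleen_cm_below_costal_margin,
                platelets_x10e9_per_l, peripheral_blasts_pct):
    """Sokal relative-risk score (coefficients as commonly published
    from Sokal et al., 1984). Risk groups: < 0.8 low, 0.8-1.2
    intermediate, > 1.2 high. Illustrative only; use the ELN online
    calculator for actual risk assignment."""
    exponent = (0.0116 * (age_years - 43.4)
                + 0.0345 * (spleen_cm_below_costal_margin - 7.51)
                + 0.188 * ((platelets_x10e9_per_l / 700) ** 2 - 0.563)
                + 0.0887 * (peripheral_blasts_pct - 2.10))
    return math.exp(exponent)

def sokal_risk_group(score):
    if score < 0.8:
        return "low"
    return "intermediate" if score <= 1.2 else "high"

# Example: a 65-year-old with a 4-cm spleen, platelets 350 x 10^9/L,
# and 1% peripheral blasts.
s = sokal_score(65, 4, 350, 1)
print(round(s, 2), sokal_risk_group(s))
```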

Monitoring Response to Therapy

After confirming a diagnosis of CML and selecting the most appropriate TKI for first-line therapy, the successful management of CML patients relies on close monitoring and follow-up to ensure they are meeting the desired treatment milestones. Responses in CML can be assessed based on hematologic parameters, cytogenetic results, and molecular responses. A complete hematologic response (CHR) implies complete normalization of peripheral blood counts (with the exception of TKI-induced cytopenias) and resolution of any palpable splenomegaly. The majority of patients will achieve a CHR within 4 to 6 weeks after initiating CML-directed therapy.20

Cytogenetic Response

Cytogenetic responses are defined by the decrease in the number of Ph chromosome–positive metaphases when assessed on bone marrow cytogenetics. A partial cytogenetic response (PCyR) is defined as having 1% to 35% Ph-positive metaphases, a major cytogenetic response (MCyR) as having 0% to 35% Ph-positive metaphases, and a complete cytogenetic response (CCyR) implies that no Ph-positive metaphases are identified on bone marrow cytogenetics. An ideal response is the achievement of PCyR after 3 months on a TKI and a CCyR after 12 months on a TKI.21
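
Because MCyR encompasses both CCyR and PCyR, these definitions are easy to misread. The short sketch below (hypothetical names) restates them as a single rule.

```python
def cytogenetic_response(ph_positive_metaphases_pct):
    """Classify the cytogenetic response categories defined above.
    Note that MCyR (0%-35%) subsumes both CCyR (0%) and PCyR (1%-35%)."""
    if ph_positive_metaphases_pct == 0:
        return "CCyR (which is also a MCyR)"
    if ph_positive_metaphases_pct <= 35:
        return "PCyR (which is also a MCyR)"
    return "less than a MCyR"

print(cytogenetic_response(0))   # complete cytogenetic response
print(cytogenetic_response(20))  # partial cytogenetic response
```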

Molecular Response

Once a patient has achieved a CCyR, monitoring their response to therapy can only be done using RQ-PCR to measure BCR-ABL1 transcripts in the peripheral blood. The NCCN and the ELN recommend monitoring RQ-PCR from the peripheral blood every 3 months in order to assess response to TKIs.19,22 As noted, the IS has become the gold standard reporting system for all BCR-ABL1 transcript levels in the majority of laboratories worldwide.14,23 Molecular responses are based on a log reduction in BCR-ABL1 transcripts from a standardized baseline. Many molecular responses can be correlated with cytogenetic responses such that, if reliable RQ-PCR testing is available, monitoring can be done using only peripheral blood RQ-PCR rather than repeat bone marrow biopsies. For example, an early molecular response (EMR) is defined as a RQ-PCR value of ≤ 10% IS, which is approximately equivalent to a PCyR.24 A value of 1% IS is approximately equivalent to a CCyR. A major molecular response (MMR) is a ≥ 3-log reduction in BCR-ABL1 transcripts from baseline and is a value of ≤ 0.1% IS. Deeper levels of molecular response are best described by the log reduction in BCR-ABL1 transcripts, with a 4-log reduction denoted as MR4.0, a 4.5-log reduction as MR4.5, and so forth. Complete molecular response (CMR) is defined by the level of sensitivity of the RQ-PCR assay being used.14
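
All of these categories derive from one quantity: the log reduction of BCR-ABL1 transcripts from the standardized IS baseline of 100%. The sketch below works through that arithmetic and applies the thresholds exactly as given above; names are hypothetical and the mapping is illustrative.

```python
import math

IS_BASELINE_PCT = 100.0  # standardized baseline on the International Scale

def log_reduction(bcr_abl1_is_pct):
    """Log reduction of BCR-ABL1 transcripts from the IS baseline."""
    return math.log10(IS_BASELINE_PCT / bcr_abl1_is_pct)

def molecular_response(bcr_abl1_is_pct):
    """Label a %IS value with the response categories described above."""
    lr = log_reduction(bcr_abl1_is_pct)
    if lr >= 4.5:
        return "MR4.5 or deeper"
    if lr >= 4.0:
        return "MR4.0"
    if lr >= 3.0:
        return "MMR (<= 0.1% IS)"
    if bcr_abl1_is_pct <= 1.0:
        return "approximately CCyR-equivalent (<= 1% IS)"
    if bcr_abl1_is_pct <= 10.0:
        return "EMR-level (<= 10% IS)"
    return "> 10% IS"

# 0.05% IS is a 3.3-log reduction, i.e., a major molecular response.
print(round(log_reduction(0.05), 1), molecular_response(0.05))
```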

The definition of relapsed disease in CML is dependent on the type of response the patient had previously achieved. Relapse could be the loss of a hematologic or cytogenetic response, but fluctuations in BCR-ABL1 transcripts on routine RQ-PCR do not necessarily indicate relapsed CML. A 1-log increase in the level of BCR-ABL1 transcripts with a concurrent loss of MMR should prompt a bone marrow biopsy in order to assess for the loss of CCyR, and thus a cytogenetic relapse; however, this loss of MMR does not define relapse in and of itself. In the setting of relapsed disease, testing should be done to look for possible ABL kinase domain mutations, and alternate therapy should be selected.19

Multiple reports have identified the prognostic relevance of achieving an EMR at 3 and 6 months after starting TKI therapy. Marin and colleagues reported that, in 282 imatinib-treated patients, there was a significant improvement in 8-year OS, progression-free survival (PFS), and cumulative incidence of CCyR and CMR in patients who had BCR-ABL1 transcripts < 9.84% IS after 3 months on treatment.24 These data highlight the importance of early molecular monitoring in ensuring the best outcomes for patients with CP-CML.

The NCCN CML guidelines and ELN recommendations both agree that an ideal response after 3 months on a TKI is BCR-ABL1 transcripts < 10% IS, but treatment is not considered to be failing at this point if the patient marginally misses this milestone. After 6 months on treatment, an ideal response is BCR-ABL1 transcripts in the < 1% to 10% IS range, with the exact cutoff differing slightly between the 2 guidelines. Ideally, patients will have BCR-ABL1 transcripts < 0.1% to 1% IS by the time they complete 12 months of TKI therapy, indicating that they have at least achieved a CCyR.19,22 Even after patients achieve these early milestones, frequent monitoring by RQ-PCR is required to ensure that they are maintaining their response to treatment. This will help to ensure patient compliance with treatment and will also help to identify the select subset of patients who could potentially be considered for an attempt at TKI cessation (not discussed in detail here) after a minimum of 3 years on therapy.19,25
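
As a rough summary, the quoted milestone ranges can be written as a lookup table. Because the NCCN and ELN cutoffs differ slightly, both ends of each quoted range are carried; this is a sketch of the ranges above, not either guideline verbatim.

```python
# Milestone ranges as quoted above, expressed as BCR-ABL1 %IS upper
# bounds: (strict cutoff, lenient cutoff). Illustrative only.
MILESTONES_IS_PCT = {
    3: (10.0, 10.0),   # ideal: < 10% IS at 3 months
    6: (1.0, 10.0),    # ideal: < 1% to 10% IS at 6 months
    12: (0.1, 1.0),    # ideal: < 0.1% to 1% IS at 12 months
}

def milestone_status(months_on_tki, bcr_abl1_is_pct):
    strict, lenient = MILESTONES_IS_PCT[months_on_tki]
    if bcr_abl1_is_pct < strict:
        return "ideal response"
    if bcr_abl1_is_pct < lenient:
        return "between the strict and lenient cutoffs; continue and recheck"
    return "milestone missed; assess adherence and consider mutation testing"

print(milestone_status(6, 0.8))    # ideal response
print(milestone_status(12, 2.0))   # milestone missed
```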

Selecting First-line TKI Therapy

Selection of the most appropriate first-line TKI for newly diagnosed CP-CML patients requires incorporation of many patient-specific factors. These factors include baseline karyotype and confirmation of CP-CML through bone marrow biopsy, Sokal or EURO risk score, and a thorough patient history, including a clear understanding of the patient’s comorbidities. The adverse effect profile of all TKIs must be considered in conjunction with the patient’s ongoing medical issues in order to decrease the likelihood of worsening their current symptoms or causing a severe complication from TKI therapy.

Imatinib

The management of CML was revolutionized by the development and ultimate regulatory approval of imatinib mesylate in 2001. Imatinib was the first small-molecule cancer therapy developed and approved. It acts by binding to the adenosine triphosphate (ATP) binding site in the catalytic domain of BCR-ABL, thus inhibiting the oncoprotein’s tyrosine kinase activity.26

The International Randomized Study of Interferon versus STI571 (IRIS) trial was a randomized phase 3 study that compared imatinib 400 mg daily to interferon alfa (IFNa) plus cytarabine. More than 1000 CP-CML patients were randomly assigned 1:1 to either imatinib or IFNa plus cytarabine and were assessed for event-free survival, hematologic and cytogenetic responses, freedom from progression to AP or BP, and toxicity. Imatinib was superior to the prior standard of care for all these outcomes.21 The long-term follow-up of the IRIS trial reported an 83% estimated 10-year OS and 79% estimated event-free survival for patients on the imatinib arm of this study.15 The cumulative rate of CCyR was 82.8%. Of the 204 imatinib-treated patients who could undergo a molecular response evaluation at 10 years, 93.1% had a MMR and 63.2% had a MR4.5, suggesting durable, deep molecular responses for many patients. The estimated 10-year rate of freedom from progression to AP or BP was 92.1%.

Higher doses of imatinib (600-800 mg daily) have been studied in an attempt to overcome resistance and improve cytogenetic and molecular response rates. The Tyrosine Kinase Inhibitor Optimization and Selectivity (TOPS) trial was a randomized phase 3 study that compared imatinib 800 mg daily to imatinib 400 mg daily. Although the 6-month assessments found increased rates of CCyR and a MMR in the higher-dose imatinib arm, these differences were no longer present at the 12-month assessment. Furthermore, the higher dose of imatinib led to a significantly higher incidence of grade 3/4 hematologic adverse events, and approximately 50% of patients on imatinib 800 mg daily required a dose reduction to less than 600 mg daily because of toxicity.27

The Therapeutic Intensification in De Novo Leukaemia (TIDEL)-II study started patients on imatinib 600 mg daily and used plasma trough levels of imatinib on day 22 to determine whether the dose should be escalated to 800 mg daily. Among patients who did not meet molecular milestones at 3, 6, or 12 months, those in cohort 1 were dose escalated to imatinib 800 mg daily and then switched to nilotinib 400 mg twice daily if they failed the same target 3 months later, while those in cohort 2 were switched directly to nilotinib. At 2 years, 73% of patients achieved MMR and 34% achieved MR4.5, suggesting that initial treatment with higher-dose imatinib, followed by a switch to nilotinib in those failing to achieve desired milestones, could be an effective strategy for managing newly diagnosed CP-CML.28

Toxicity. The standard starting dose of imatinib in CP-CML patients is 400 mg daily. The safety profile of imatinib has been very well established. In the IRIS trial, the most common adverse events (all grades, in decreasing order of frequency) were peripheral and periorbital edema (60%), nausea (50%), muscle cramps (49%), musculoskeletal pain (47%), diarrhea (45%), rash (40%), fatigue (39%), abdominal pain (37%), headache (37%), and joint pain (31%). Grade 3/4 liver enzyme elevation can occur in 5% of patients.29 In the event of severe liver toxicity or fluid retention, imatinib should be held until the event resolves; it can then be restarted if deemed appropriate, depending on the severity of the inciting event. Fluid retention can be managed with supportive care, diuretics, imatinib dose reduction, dose interruption, or imatinib discontinuation if the fluid retention is severe. Muscle cramps can be managed with calcium supplements or tonic water. Rash can be managed with topical or systemic steroids, or in some cases imatinib dose reduction, interruption, or discontinuation.19

Grade 3/4 imatinib-induced hematologic toxicity is not uncommon, with 17% of patients experiencing neutropenia, 9% thrombocytopenia, and 4% anemia. These adverse events occurred most commonly during the first year of therapy, and the frequency decreased over time.15,29 Depending on the degree of cytopenias, imatinib dosing should be interrupted until recovery of the absolute neutrophil count or platelet count, and can often be resumed at 400 mg daily. However, if cytopenias recur, imatinib should be held and subsequently restarted at 300 mg daily.19
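
The dose-adjustment rule just described reduces to a small decision: hold for significant cytopenias, resume at 400 mg daily after the first recovery, and resume at 300 mg daily on recurrence. A minimal sketch (hypothetical names; defer to the NCCN guideline and product labeling):

```python
def imatinib_cytopenia_action(counts_recovered, prior_episodes):
    """Sketch of the rule above: interrupt imatinib for significant
    cytopenias, resume at 400 mg daily after the first recovery, and
    resume at 300 mg daily if cytopenias recur. Illustrative only."""
    if not counts_recovered:
        return "hold imatinib until ANC/platelet recovery"
    if prior_episodes <= 1:
        return "resume imatinib 400 mg daily"
    return "resume imatinib 300 mg daily"

# First episode, counts now recovered -> resume at the full dose.
print(imatinib_cytopenia_action(True, 1))
```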

Dasatinib

Dasatinib is a second-generation TKI that has regulatory approval for treatment of adult patients with newly diagnosed CP-CML or CP-CML in patients with resistance or intolerance to prior TKIs. In addition to dasatinib’s ability to inhibit ABL kinases, it is also known to be a potent inhibitor of Src family kinases. Dasatinib has shown efficacy in patients who have developed imatinib-resistant ABL kinase domain mutations.

Dasatinib was initially approved as second-line therapy in patients with resistance or intolerance to imatinib. This indication was based on the results of the phase 3 CA180-034 trial, which ultimately identified dasatinib 100 mg daily as the optimal dose. In this trial, 74% of patients enrolled had resistance to imatinib and the remainder were intolerant. The 7-year follow-up of patients randomized to dasatinib 100 mg (n = 167) daily indicated that 46% achieved MMR while on study. Of the 124 imatinib-resistant patients on dasatinib 100 mg daily, the 7-year PFS was 39% and OS was 63%. In the 43 imatinib-intolerant patients, the 7-year PFS was 51% and OS was 70%.30

Dasatinib 100 mg daily was compared to imatinib 400 mg daily in newly diagnosed CP-CML patients in the randomized phase 3 DASISION (Dasatinib versus Imatinib Study in Treatment-Naive CML Patients) trial. More patients on the dasatinib arm achieved an EMR of BCR-ABL1 transcripts ≤ 10% IS after 3 months on treatment compared to imatinib (84% versus 64%). Furthermore, the 5-year follow-up reports that the cumulative incidence of MMR and MR4.5 in dasatinib-treated patients was 76% and 42%, and was 64% and 33% with imatinib (P = 0.0022 and P = 0.0251, respectively). Fewer patients treated with dasatinib progressed to AP or BP (4.6%) compared to imatinib (7.3%), but the estimated 5-year OS was similar between the 2 arms (91% for dasatinib versus 90% for imatinib).16 Regulatory approval for dasatinib as first-line therapy in newly diagnosed CML patients was based on results of the DASISION trial.

Toxicity. Most dasatinib-related toxicities are reported as grade 1 or grade 2, but grade 3/4 hematologic adverse events are fairly common. In the DASISION trial, grade 3/4 neutropenia, anemia, and thrombocytopenia occurred in 29%, 13%, and 22% of dasatinib-treated patients, respectively. Cytopenias can generally be managed with temporary dose interruptions or dose reductions.

During the 5-year follow-up of the DASISION trial, pleural effusions were reported in 28% of patients, most of which were grade 1/2. These occurred at a rate of roughly 8% or less per year, suggesting a stable incidence over time, and the effusions appear to be dose-dependent.16 Depending on the severity, pleural effusion may be treated with diuretics, dose interruption, and, in some instances, steroids or thoracentesis. Typically, dasatinib can be restarted at 1 dose level lower than the previous dose once the effusion has resolved.19 Other, less common side effects of dasatinib include pulmonary hypertension (5% of patients), as well as abdominal pain, fluid retention, headaches, fatigue, musculoskeletal pain, rash, nausea, and diarrhea. Pulmonary hypertension is typically reversible after cessation of dasatinib, and thus dasatinib should be permanently discontinued once that diagnosis is confirmed. Fluid retention is often treated with diuretics and supportive care. Nausea and diarrhea are generally manageable and occur less frequently when dasatinib is taken with food and a large glass of water; antiemetics and antidiarrheals can be used as needed. Troublesome rash is best managed with topical or systemic steroids as well as possible dose reduction or dose interruption.16,19 In the DASISION trial, adverse events led to therapy discontinuation more often in the dasatinib group than in the imatinib group (16% versus 7%).16 Bleeding, particularly in the setting of thrombocytopenia, has been reported in dasatinib-treated patients as a result of drug-induced reversible inhibition of platelet aggregation.31

Nilotinib

The structure of nilotinib is similar to that of imatinib; however, it has a markedly increased affinity for the ATP‐binding site on the BCR-ABL1 protein. It was initially given regulatory approval in the setting of imatinib failure. Nilotinib was studied at a dose of 400 mg twice daily in 321 patients who were imatinib-resistant or -intolerant. It proved to be highly effective at inducing cytogenetic remissions in the second-line setting, with 59% of patients achieving a MCyR and 45% achieving a CCyR. With a median follow-up time of 4 years, the OS was 78%.32 

Nilotinib gained regulatory approval for use as a first-line TKI after completion of the randomized phase 3 ENESTnd (Evaluating Nilotinib Efficacy and Safety in Clinical Trials-Newly Diagnosed Patients) trial. ENESTnd was a 3-arm study comparing nilotinib 300 mg twice daily versus nilotinib 400 mg twice daily versus imatinib 400 mg daily in newly diagnosed, previously untreated patients diagnosed with CP-CML. The primary endpoint of this clinical trial was rate of MMR at 12 months.33 Nilotinib surpassed imatinib in this regard, with 44% of patients on nilotinib 300 mg twice daily achieving MMR at 12 months versus 43% of nilotinib 400 mg twice daily patients versus 22% of the imatinib-treated patients (P < 0.001 for both comparisons). Furthermore, the rate of CCyR by 12 months was significantly higher for both nilotinib arms compared with imatinib (80% for nilotinib 300 mg, 78% for nilotinib 400 mg, and 65% for imatinib) (P < 0.001).12 Based on this data, nilotinib 300 mg twice daily was chosen as the standard dose of nilotinib in the first-line setting. After 5 years of follow-up on the ENESTnd study, there were fewer progressions to AP/BP CML in nilotinib-treated patients compared with imatinib. MMR was achieved in 77% of nilotinib 300 mg patients compared with 60.4% of patients on the imatinib arm. MR4.5 was also more common in patients treated with nilotinib 300 mg twice daily, with a rate of 53.5% at 5 years versus 31.4% in the imatinib arm.17 In spite of the deeper cytogenetic and molecular responses achieved with nilotinib, this did not translate into a significant improvement in OS. The 5-year OS rate was 93.7% in nilotinib 300 mg patients versus 91.7% in imatinib-treated patients, and this difference lacked statistical significance.17

Toxicity. Although some similarities exist between the toxicity profiles of nilotinib and imatinib, each drug has some distinct adverse events. In the ENESTnd trial, the rate of any grade 3/4 non-hematologic adverse event was fairly low; however, lower-grade toxicities were not uncommon. Patients treated with nilotinib 300 mg twice daily most commonly experienced rash (31%), headache (14%), pruritus (15%), and fatigue (11%). The most frequently reported laboratory abnormalities included increased total bilirubin (53%), hypophosphatemia (32%), hyperglycemia (36%), elevated lipase (24%), increased alanine aminotransferase (ALT; 66%), and increased aspartate aminotransferase (AST; 40%). Any grade of neutropenia, thrombocytopenia, or anemia occurred at rates of 43%, 48%, and 38%, respectively.33 Although nilotinib carries a black box warning from the US Food and Drug Administration for QT interval prolongation, no patients in the ENESTnd trial experienced a QT interval corrected for heart rate greater than 500 msec.12

More recent concerns have emerged regarding the potential for cardiovascular toxicity after long-term use of nilotinib. The 5-year update of ENESTnd reports cardiovascular events, including ischemic heart disease, ischemic cerebrovascular events, or peripheral arterial disease occurring in 7.5% of patients treated with nilotinib 300 mg twice daily, as compared with a rate of 2.1% in imatinib-treated patients. The frequency of these cardiovascular events increased linearly over time in both arms. Elevations in total cholesterol from baseline occurred in 27.6% of nilotinib patients compared with 3.9% of imatinib patients. Furthermore, clinically meaningful increases in low-density lipoprotein cholesterol and glycated hemoglobin occurred more frequently with nilotinib therapy.33

Nilotinib should be taken on an empty stomach; therefore, patients should be made aware of the need to fast for 2 hours prior to each dose and 1 hour after each dose. Given the potential risk of QT interval prolongation, a baseline electrocardiogram (ECG) is recommended prior to initiating treatment to ensure the QT interval is within a normal range. A repeat ECG should be done approximately 7 days after nilotinib initiation to ensure no prolongation of the QT interval after starting. Close monitoring of potassium and magnesium levels is important to decrease the risk of cardiac arrhythmias, and concomitant use of drugs considered strong CYP3A4 inhibitors should be avoided.19

If the patient experiences any grade 3 or higher laboratory abnormalities, nilotinib should be held until resolution of the toxicity, and then restarted at a lower dose. Similarly, if patients develop significant neutropenia or thrombocytopenia, nilotinib doses should be interrupted until resolution of the cytopenias. At that point, nilotinib can be reinitiated at either the same or a lower dose. Rash can be managed by the use of topical or systemic steroids as well as potential dose reduction, interruption, or discontinuation.

Given the concerns for potential cardiovascular events with long-term use of nilotinib, caution is advised when prescribing it to any patient with a history of cardiovascular disease or peripheral arterial occlusive disease. At the first sign of new occlusive disease, nilotinib should be discontinued.19

Bosutinib

Bosutinib is a second-generation BCR-ABL TKI with activity against the Src family of kinases; it was initially approved to treat patients with CP-, AP-, or BP-CML after resistance or intolerance to imatinib. Long-term data has been reported from the phase 1/2 trial of bosutinib therapy in patients with CP-CML who developed resistance or intolerance to imatinib plus dasatinib and/or nilotinib. A total of 119 patients were included in the 4-year follow-up; 38 were resistant/intolerant to imatinib and resistant to dasatinib, 50 were resistant/intolerant to imatinib and intolerant to dasatinib, 26 were resistant/intolerant to imatinib and resistant to nilotinib, and 5 were resistant/intolerant to imatinib and intolerant to nilotinib or resistant/intolerant to dasatinib and nilotinib. Bosutinib 400 mg daily was studied in this setting. Of the 38 patients with imatinib resistance/intolerance and dasatinib resistance, 39% achieved MCyR, 22% achieved CCyR, and the OS was 67%. Of the 50 patients with imatinib resistance/intolerance and dasatinib intolerance, 42% achieved MCyR, 40% achieved CCyR, and the OS was 80%. Finally, in the 26 patients with imatinib resistance/intolerance and nilotinib resistance, 38% achieved MCyR, 31% achieved CCyR, and the OS was 87%.34

Five-year follow-up from the phase 1/2 clinical trial that studied bosutinib 500 mg daily in CP-CML patients after imatinib failure reported data on 284 patients. By 5 years on study, 60% of patients had achieved MCyR and 50% achieved CCyR with a 71% and 69% probability, respectively, of maintaining these responses at 5 years. The 5-year OS was 84%.35 These data led to the regulatory approval of bosutinib 500 mg daily as second-line or later therapy.

Bosutinib was initially studied in the first-line setting in the randomized phase 3 BELA (Bosutinib Efficacy and Safety in Newly Diagnosed Chronic Myeloid Leukemia) trial. This trial compared bosutinib 500 mg daily to imatinib 400 mg daily in newly diagnosed, previously untreated CP-CML patients. This trial failed to meet its primary endpoint of increased rate of CCyR at 12 months, with 70% of bosutinib patients achieving this response, compared to 68% of imatinib-treated patients (P = 0.601). In spite of this, the rate of MMR at 12 months was significantly higher in the bosutinib arm (41%) compared to the imatinib arm (27%; P = 0.001).36

A second phase 3 trial (BFORE) was designed to study bosutinib 400 mg daily versus imatinib in newly diagnosed, previously untreated CP-CML patients. This study enrolled 536 patients who were randomly assigned 1:1 to bosutinib versus imatinib. The primary endpoint of this trial was rate of MMR at 12 months. A significantly higher number of bosutinib-treated patients achieved this response (47.2%) compared with imatinib-treated patients (36.9%, P = 0.02). Furthermore, by 12 months 77.2% of patients on the bosutinib arm had achieved CCyR compared with 66.4% on the imatinib arm, and this difference did meet statistical significance (P = 0.0075). A lower rate of progression to AP- or BP-CML was noted in bosutinib-treated patients as well (1.6% versus 2.5%). Based on this data, bosutinib gained regulatory approval for first-line therapy in CP-CML at a dose of 400 mg daily.18

Toxicity. On the BFORE trial, the most common treatment-emergent adverse events of any grade reported in the bosutinib-treated patients were diarrhea (70.1%), nausea (35.1%), increased ALT (30.6%), and increased AST (22.8%). Musculoskeletal pain or spasms occurred in 29.5% of patients, rash in 19.8%, fatigue in 19.4%, and headache in 18.7%. Hematologic toxicity was also reported, but most was grade 1/2. Thrombocytopenia was reported in 35.1%, anemia in 18.7%, and neutropenia in 11.2%.18

Cardiovascular events occurred in 5.2% of patients on the bosutinib arm of the BFORE trial, which was similar to the rate observed in imatinib patients. The most common cardiovascular event was QT interval prolongation, which occurred in 1.5% of patients. Pleural effusions were reported in 1.9% of patients treated with bosutinib, and none were grade 3 or higher.18

If liver enzyme elevation occurs at a value greater than 5 times the institutional upper limit of normal, bosutinib should be held until the level recovers to ≤ 2.5 times the upper limit of normal, at which point bosutinib can be restarted at a lower dose. If recovery takes longer than 4 weeks, bosutinib should be permanently discontinued. Liver enzymes elevated greater than 3 times the institutional upper limit of normal and a concurrent elevation in total bilirubin to 2 times the upper limit of normal are consistent with Hy’s law, and bosutinib should be discontinued. Although diarrhea is the most common toxicity associated with bosutinib, it is commonly low grade and transient. Diarrhea occurs most frequently in the first few days after initiating bosutinib. It can often be managed with over-the-counter antidiarrheal medications, but if the diarrhea is grade 3 or higher, bosutinib should be held until recovery to grade 1 or lower. Gastrointestinal side effects may be improved by taking bosutinib with a meal and a large glass of water. Fluid retention can be managed with diuretics and supportive care. Finally, if rash occurs, this can be addressed with topical or systemic steroids as well as bosutinib dose reduction, interruption, or discontinuation.19
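
These hepatic rules can likewise be stated as a short decision function. The sketch below encodes only the thresholds given above (> 5 × ULN hold, restart at a lower dose on recovery to ≤ 2.5 × ULN, the 4-week limit, and the Hy's law combination); it says nothing about intermediate cases, and all names are hypothetical.

```python
def bosutinib_hepatotoxicity_action(alt_x_uln, bili_x_uln, weeks_on_hold=0):
    """Encodes only the thresholds stated above, with labs expressed as
    multiples of the institutional upper limit of normal (ULN).
    Illustrative sketch, not clinical guidance."""
    # Hy's law pattern: transaminases > 3 x ULN with bilirubin > 2 x ULN.
    if alt_x_uln > 3 and bili_x_uln > 2:
        return "discontinue bosutinib permanently (Hy's law)"
    if alt_x_uln > 5:
        return ("discontinue bosutinib (no recovery within 4 weeks)"
                if weeks_on_hold > 4
                else "hold bosutinib until ALT <= 2.5 x ULN")
    if weeks_on_hold > 0 and alt_x_uln <= 2.5:
        return "restart bosutinib at a lower dose"
    return "no action specified by the rules above; continue monitoring"

print(bosutinib_hepatotoxicity_action(6.0, 1.0))      # hold
print(bosutinib_hepatotoxicity_action(2.0, 1.0, 2))   # restart at lower dose
```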

Similar to other TKIs, if bosutinib-induced cytopenias occur, treatment should be held and restarted at the same or a lower dose upon blood count recovery.19

Ponatinib

The most common cause of TKI resistance in CP-CML is the development of ABL kinase domain mutations. The majority of imatinib-resistant mutations can be overcome by the use of second-generation TKIs, including dasatinib, nilotinib, or bosutinib. However, ponatinib is the only BCR-ABL TKI able to overcome a T315I mutation. The phase 2 PACE (Ponatinib Ph-positive ALL and CML Evaluation) trial enrolled patients with CP-, AP-, or BP-CML as well as patients with Ph-positive acute lymphoblastic leukemia who were resistant or intolerant to nilotinib or dasatinib, or who had evidence of a T315I mutation. The starting dose of ponatinib on this trial was 45 mg daily.37 The PACE trial enrolled 267 patients with CP-CML: 203 with resistance or intolerance to nilotinib or dasatinib, and 64 with a T315I mutation. The primary endpoint in the CP cohort was rate of MCyR at any time within 12 months of starting ponatinib. The overall rate of MCyR by 12 months in the CP-CML patients was 56%. In those with a T315I mutation, 70% achieved MCyR, which compared favorably with those with resistance or intolerance to nilotinib or dasatinib, 51% of whom achieved MCyR. CCyR was achieved in 46% of CP-CML patients (40% in the resistant/intolerant cohort and 66% in the T315I cohort). In general, patients with T315I mutations received fewer prior therapies than those in the resistant/intolerant cohort, which likely contributed to the higher response rates in the T315I patients. MR4.5 was achieved in 15% of CP-CML patients by 12 months on the PACE trial.37 The 5-year update to this study reported that 60%, 40%, and 24% of CP-CML patients achieved MCyR, MMR, and MR4.5, respectively. In the patients who achieved MCyR, the probability of maintaining this response for 5 years was 82% and the estimated 5-year OS was 73%.19

Toxicity. In 2013, after the regulatory approval of ponatinib, reports became available that the drug can cause an increase in arterial occlusive events, including fatal myocardial infarctions and cerebrovascular accidents. For this reason, dose reductions were implemented in patients who were deriving clinical benefit from ponatinib. In spite of these dose reductions, ≥ 90% of responders maintained their response for up to 40 months.38 Although the likelihood of developing an arterial occlusive event appears higher in the first year after starting ponatinib than in later years, the cumulative incidence of events continues to increase. The 5-year follow-up to the PACE trial reports 31% of patients experiencing any grade of arterial occlusive event while on ponatinib. Aside from these events, the most common treatment-emergent adverse events in ponatinib-treated patients on the PACE trial included rash (47%), abdominal pain (46%), headache (43%), dry skin (42%), constipation (41%), and hypertension (37%). Hematologic toxicity was also common, with 46% of patients experiencing any grade of thrombocytopenia, 20% experiencing neutropenia, and 20% anemia.38

Patients receiving ponatinib therapy should be monitored closely for any evidence of arterial or venous thrombosis. If an occlusive event occurs, ponatinib should be discontinued. Similarly, in the setting of any new or worsening heart failure symptoms, ponatinib should be promptly discontinued. Management of any underlying cardiovascular risk factors, including hypertension, hyperlipidemia, diabetes, or smoking history, is recommended, and these patients should be referred to a cardiologist for a full evaluation. In the absence of any contraindications to aspirin, low-dose aspirin should be considered as a means of decreasing cardiovascular risks associated with ponatinib. In patients with known risk factors, a ponatinib starting dose of 30 mg daily rather than the standard 45 mg daily may be a safer option, resulting in fewer arterial occlusive events, although the efficacy of this dose is still being studied in comparison to 45 mg daily.19

If ponatinib-induced transaminitis greater than 3 times the upper limit of normal occurs, ponatinib should be held until resolution to less than 3 times the upper limit of normal, at which point it should be resumed at a lower dose. Similarly, in the setting of elevated serum lipase or symptomatic pancreatitis, ponatinib should be held and restarted at a lower dose after resolution of symptoms.19

In the event of neutropenia or thrombocytopenia, ponatinib should be held until blood count recovery and then restarted at the same dose. If cytopenias occur for a second time, the dose of ponatinib should be lowered at the time of treatment reinitiation. If rash occurs, it can be addressed with topical or systemic steroids as well as dose reduction, interruption, or discontinuation.19

Conclusion

With the development of imatinib and the subsequent TKIs dasatinib, nilotinib, bosutinib, and ponatinib, CP-CML has become a chronic disease with a life expectancy that is similar to that of the general population. Given the successful treatments available for these patients, it is crucial to identify patients with this diagnosis; ensure they receive a complete, appropriate diagnostic workup, including a bone marrow biopsy and aspiration with cytogenetic testing; and select the best therapy for each individual patient. Once treatment has begun, the importance of frequent monitoring cannot be overstated: it is the only way to be certain patients are achieving the desired treatment milestones that correlate with the favorable long-term outcomes observed with TKI-based treatment of CP-CML.

Corresponding author: Kendra Sweet, MD, MS, Department of Malignant Hematology, Moffitt Cancer Center, Tampa, FL.

Financial disclosures: Dr. Sweet has served on the Advisory Board and Speakers Bureau of Novartis, Bristol-Myers Squibb, Ariad Pharmaceuticals, and Pfizer, and has served as a consultant to Pfizer.

References

1. Faderl S, Talpaz M, Estrov Z, et al. The biology of chronic myeloid leukemia. N Engl J Med. 1999;341:164-172.

2. Surveillance, Epidemiology, and End Results Program. Cancer Stat Facts: Leukemia - Chronic Myeloid Leukemia (CML). 2018.

3. Huang X, Cortes J, Kantarjian H. Estimations of the increasing prevalence and plateau prevalence of chronic myeloid leukemia in the era of tyrosine kinase inhibitor therapy. Cancer. 2012;118:3123-3127.

4. Savage DG, Szydlo RM, Chase A, et al. Bone marrow transplantation for chronic myeloid leukaemia: the effects of differing criteria for defining chronic phase on probabilities of survival and relapse. Br J Haematol. 1997;99:30-35.

5. Knox WF, Bhavnani M, Davson J, Geary CG. Histological classification of chronic granulocytic leukaemia. Clin Lab Haematol. 1984;6:171-175.

6. Kvasnicka HM, Thiele J, Schmitt-Graeff A, et al. Impact of bone marrow morphology on multivariate risk classification in chronic myelogenous leukemia. Acta Haematol. 2003;109:53-56.

7. Cortes JE, Talpaz M, O’Brien S, et al. Staging of chronic myeloid leukemia in the imatinib era: an evaluation of the World Health Organization proposal. Cancer. 2006;106:1306-1315.

8. Druker BJ. Chronic myeloid leukemia. In: DeVita VT, Lawrence TS, Rosenberg SA, eds. DeVita, Hellman, and Rosenberg’s Cancer Principles & Practice of Oncology. 8th ed. Philadelphia, PA: Lippincott, Williams and Wilkins; 2007:2267-2304.

9. Arber DA, Orazi A, Hasserjian R, et al. The 2016 revision to the World Health Organization classification of myeloid neoplasms and acute leukemia. Blood. 2016;127:2391-2405.

10. Fabarius A, Leitner A, Hochhaus A, et al. Impact of additional cytogenetic aberrations at diagnosis on prognosis of CML: long-term observation of 1151 patients from the randomized CML Study IV. Blood. 2011;118:6760-6768.

11. Alhuraiji A, Kantarjian H, Boddu P, et al. Prognostic significance of additional chromosomal abnormalities at the time of diagnosis in patients with chronic myeloid leukemia treated with frontline tyrosine kinase inhibitors. Am J Hematol. 2018;93:84-90.

12. Melo JV. BCR-ABL gene variants. Baillieres Clin Haematol. 1997;10:203-222.

13. Kantarjian HM, Talpaz M, Cortes J, et al. Quantitative polymerase chain reaction monitoring of BCR-ABL during therapy with imatinib mesylate (STI571; gleevec) in chronic-phase chronic myelogenous leukemia. Clin Cancer Res. 2003;9:160-166.

14. Hughes T, Deininger M, Hochhaus A, et al. Monitoring CML patients responding to treatment with tyrosine kinase inhibitors: review and recommendations for harmonizing current methodology for detecting BCR-ABL transcripts and kinase domain mutations and for expressing results. Blood. 2006;108:28-37.

15. Hochhaus A, Larson RA, Guilhot F, et al. Long-term outcomes of imatinib treatment for chronic myeloid leukemia. N Engl J Med. 2017;376:917-927.

16. Cortes JE, Saglio G, Kantarjian HM, et al. Final 5-year study results of DASISION: the Dasatinib Versus Imatinib Study in Treatment-Naive Chronic Myeloid Leukemia Patients trial. J Clin Oncol. 2016;34:2333-2340.

17. Hochhaus A, Saglio G, Hughes TP, et al. Long-term benefits and risks of frontline nilotinib vs imatinib for chronic myeloid leukemia in chronic phase: 5-year update of the randomized ENESTnd trial. Leukemia. 2016;30:1044-1054.

18. Cortes JE, Gambacorti-Passerini C, Deininger MW, et al. Bosutinib versus imatinib for newly diagnosed chronic myeloid leukemia: results from the randomized BFORE trial. J Clin Oncol. 2018;36:231-237.

19. Radich JP, Deininger M, Abboud CN, et al. Chronic Myeloid Leukemia, Version 1.2019, NCCN Clinical Practice Guidelines in Oncology. J Natl Compr Canc Netw. 2018;16:1108-1135.

20. Faderl S, Talpaz M, Estrov Z, Kantarjian HM. Chronic myelogenous leukemia: biology and therapy. Ann Intern Med. 1999;131:207-219.

21. O’Brien SG, Guilhot F, Larson RA, et al. Imatinib compared with interferon and low-dose cytarabine for newly diagnosed chronic-phase chronic myeloid leukemia. N Engl J Med. 2003;348:994-1004.

22. Baccarani M, Deininger MW, Rosti G, et al. European LeukemiaNet recommendations for the management of chronic myeloid leukemia: 2013. Blood. 2013;122:872-884.

23. Larripa I, Ruiz MS, Gutierrez M, Bianchini M. [Guidelines for molecular monitoring of BCR-ABL1 in chronic myeloid leukemia patients by RT-qPCR]. Medicina (B Aires). 2017;77:61-72.

24. Marin D, Ibrahim AR, Lucas C, et al. Assessment of BCR-ABL1 transcript levels at 3 months is the only requirement for predicting outcome for patients with chronic myeloid leukemia treated with tyrosine kinase inhibitors. J Clin Oncol. 2012;30:232-238.

25. Hughes TP, Ross DM. Moving treatment-free remission into mainstream clinical practice in CML. Blood. 2016;128:17-23.

26. Druker BJ, Talpaz M, Resta DJ, et al. Efficacy and safety of a specific inhibitor of the BCR-ABL tyrosine kinase in chronic myeloid leukemia. N Engl J Med. 2001;344:1031-1037.

27. Baccarani M, Druker BJ, Branford S, et al. Long-term response to imatinib is not affected by the initial dose in patients with Philadelphia chromosome-positive chronic myeloid leukemia in chronic phase: final update from the Tyrosine Kinase Inhibitor Optimization and Selectivity (TOPS) study. Int J Hematol. 2014;99:616-624.

28. Yeung DT, Osborn MP, White DL, et al. TIDEL-II: first-line use of imatinib in CML with early switch to nilotinib for failure to achieve time-dependent molecular targets. Blood. 2015;125:915-923.

29. Druker BJ, Guilhot F, O’Brien SG, et al. Five-year follow-up of patients receiving imatinib for chronic myeloid leukemia. N Engl J Med. 2006;355:2408-2417.

30. Shah NP, Rousselot P, Schiffer C, et al. Dasatinib in imatinib-resistant or -intolerant chronic-phase, chronic myeloid leukemia patients: 7-year follow-up of study CA180-034. Am J Hematol. 2016;91:869-874.

31. Quintas-Cardama A, Han X, Kantarjian H, Cortes J. Tyrosine kinase inhibitor-induced platelet dysfunction in patients with chronic myeloid leukemia. Blood. 2009;114:261-263.

32. Giles FJ, le Coutre PD, Pinilla-Ibarz J, et al. Nilotinib in imatinib-resistant or imatinib-intolerant patients with chronic myeloid leukemia in chronic phase: 48-month follow-up results of a phase II study. Leukemia. 2013;27:107-112.

33. Saglio G, Kim DW, Issaragrisil S, et al. Nilotinib versus imatinib for newly diagnosed chronic myeloid leukemia. N Engl J Med. 2010;362:2251-2259.

34. Cortes JE, Khoury HJ, Kantarjian HM, et al. Long-term bosutinib for chronic phase chronic myeloid leukemia after failure of imatinib plus dasatinib and/or nilotinib. Am J Hematol. 2016;91:1206-1214.

35. Gambacorti-Passerini C, Cortes JE, Lipton JH, et al. Safety and efficacy of second-line bosutinib for chronic phase chronic myeloid leukemia over a five-year period: final results of a phase I/II study. Haematologica. 2018;103:1298-1307.

36. Cortes JE, Kim DW, Kantarjian HM, et al. Bosutinib versus imatinib in newly diagnosed chronic-phase chronic myeloid leukemia: results from the BELA trial. J Clin Oncol. 2012;30:3486-3492.

37. Cortes JE, Kim DW, Pinilla-Ibarz J, et al. A phase 2 trial of ponatinib in Philadelphia chromosome-positive leukemias. N Engl J Med. 2013;369:1783-1796.

38. Cortes JE, Kim DW, Pinilla-Ibarz J, et al. Ponatinib efficacy and safety in Philadelphia chromosome-positive leukemia: final 5-year results of the phase 2 PACE trial. Blood. 2018;132:393-404.

Article PDF
Issue
Journal of Clinical Outcomes Management - 26(3)
Publications
Topics
Page Number
131-141
Sections
Article PDF
Article PDF

From the Moffitt Cancer Center, Tampa, FL.

Abstract

  • Objective: To outline the approach to selecting a tyrosine kinase inhibitor (TKI) for initial treatment of chronic myeloid leukemia (CML) and monitoring patients following initiation of therapy.
  • Methods: Review of the literature and evidence-based guidelines.
  • Results: The development and availability of TKIs has improved survival for patients diagnosed with CML. The life expectancy of patients diagnosed with chronic-phase CML (CP-CML) is similar to that of the general population, provided they receive appropriate TKI therapy and adhere to treatment. Selection of the most appropriate first-line TKI for newly diagnosed CP-CML requires incorporation of the patient’s baseline karyotype and Sokal or EURO risk score, and a clear understanding of the patient’s comorbidities. The adverse effect profile of all TKIs must be considered in conjunction with the patient’s ongoing medical issues to decrease the likelihood of worsening their current symptoms or causing a severe complication from TKI therapy. After confirming a diagnosis of CML and selecting the most appropriate TKI for first-line therapy, close monitoring and follow-up are necessary to ensure patients are meeting the desired treatment milestones. Responses in CML can be assessed based on hematologic parameters, cytogenetic results, and molecular responses.
  • Conclusion: Given the successful treatments available for patients with CML, it is crucial to identify patients with this diagnosis; ensure they receive a complete, appropriate diagnostic workup including a bone marrow biopsy and aspiration with cytogenetic testing; and select the best therapy for each individual patient.

Keywords: chronic myeloid leukemia; CML; tyrosine kinase inhibitor; TKI; cancer; BCR-ABL protein.

Chronic myeloid leukemia (CML) is a rare myeloproliferative neoplasm that is characterized by the presence of the Philadelphia (Ph) chromosome and uninhibited expansion of bone marrow stem cells. The Ph chromosome arises from a reciprocal translocation between the Abelson (ABL) region on chromosome 9 and the breakpoint cluster region (BCR) of chromosome 22 (t(9;22)(q34;q11.2)), resulting in the BCR-ABL1 fusion gene and its protein product, BCR-ABL tyrosine kinase.1 BCR-ABL has constitutive tyrosine kinase activity that promotes growth, replication, and survival of hematopoietic cells through downstream pathways, which is the driving factor in the pathogenesis of CML.1

CML is divided into 3 phases based on the percentage of myeloblasts observed in the blood or bone marrow: chronic, accelerated, and blast. Most cases of CML are diagnosed in the chronic phase (CP), which is marked by proliferation primarily of the myeloid element.

Typical treatment for CML involves lifelong use of oral BCR-ABL tyrosine kinase inhibitors (TKIs). Currently, 5 TKIs have regulatory approval for treatment of this disease. The advent of TKIs, a class of small molecules targeting the tyrosine kinases, particularly the BCR-ABL tyrosine kinase, led to rapid changes in the management of CML and improved survival for patients. Patients diagnosed with chronic-phase CML (CP-CML) now have a life expectancy that is similar to that of the general population, as long as they receive appropriate TKI therapy and adhere to treatment. As such, it is crucial to identify patients with CML; ensure they receive a complete, appropriate diagnostic workup; and select the best therapy for each patient.

Epidemiology

According to SEER estimates, 8430 new cases of CML were diagnosed in the United States in 2018. CML is a disease of older adults, with a median age of 65 years at diagnosis, and there is a slight male predominance. Between 2011 and 2015, the incidence of CML was 1.8 new cases per 100,000 persons per year. The median overall survival (OS) in patients with newly diagnosed CP-CML has not been reached.2 Given the effective treatments available for managing CML, it is estimated that the prevalence of CML in the United States will plateau at 180,000 patients by 2050.3


Diagnosis

Clinical Features

The diagnosis of CML is often suspected based on an incidental finding of leukocytosis and, in some cases, thrombocytosis on routine blood work, although approximately 50% of patients will present with constitutional symptoms associated with the disease. Characteristic features of the white blood cell differential include left-shifted maturation with neutrophilia and immature circulating myeloid cells. Basophilia and eosinophilia are often present as well. Splenomegaly is a common sign, present in 50% to 90% of patients at diagnosis. In patients with symptoms related to CML at diagnosis, the most common presentation includes increasing fatigue, fevers, night sweats, early satiety, and weight loss. The diagnosis is confirmed by cytogenetic studies showing the Ph chromosome abnormality, t(9;22)(q34;q11.2), and/or reverse transcriptase polymerase chain reaction (PCR) showing BCR-ABL1 transcripts.

Testing

Bone marrow biopsy. There are 3 distinct phases of CML: CP, accelerated phase (AP), and blast phase (BP). Bone marrow biopsy and aspiration at diagnosis are mandatory in order to determine the phase of the disease at diagnosis. This distinction is based on the percentage of blasts, promyelocytes, and basophils present as well as the platelet count and presence or absence of extramedullary disease.4 The vast majority of patients at diagnosis have CML that is in the chronic phase. The typical appearance in CP-CML is a hypercellular marrow with granulocytic and occasionally megakaryocytic hyperplasia. In many cases, basophilia and/or eosinophilia are noted as well. Dysplasia is not a typical finding in CML.5 Bone marrow fibrosis can be seen in up to one-third of patients at diagnosis, and may indicate a slightly worse prognosis.6 Although a diagnosis of CML can be made without a bone marrow biopsy, complete staging and prognostication are only possible with information gained from this test, including baseline karyotype and confirmation of CP versus a more advanced phase of CML.

Diagnostic criteria. The criteria for diagnosing AP-CML have not been agreed upon by the various groups, but the modified MD Anderson Cancer Center (MDACC) criteria are used in the majority of clinical trials evaluating the efficacy of TKIs in preventing progression to advanced phases of CML. The MDACC criteria define AP-CML as the presence of any 1 of the following: 15% to 29% blasts in the peripheral blood or bone marrow, ≥ 30% peripheral blasts plus promyelocytes, ≥ 20% basophils in the blood or bone marrow, platelet count ≤ 100,000/μL unrelated to therapy, or clonal cytogenetic evolution in Ph-positive metaphases (Table).7

Diagnostic Criteria for Chronic Myeloid Leukemia


BP-CML is typically defined using the criteria developed by the International Bone Marrow Transplant Registry (IBMTR): ≥ 30% blasts in the peripheral blood and/or the bone marrow or the presence of extramedullary disease.8 Although not typically used in clinical trials, the revised World Health Organization (WHO) criteria for BP-CML include ≥ 20% blasts in the peripheral blood or bone marrow, extramedullary blast proliferation, and large foci or clusters of blasts in the bone marrow biopsy sample (Table).9
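
The phase criteria above reduce to a handful of threshold checks, which can be made concrete in code. The sketch below (Python) transcribes the MDACC accelerated-phase and IBMTR blast-phase cutoffs exactly as stated in this section; the function and field names are hypothetical, and this is an illustration of the rules rather than a validated clinical tool.

```python
from dataclasses import dataclass

@dataclass
class Findings:
    """Hypothetical container for the values the criteria reference."""
    blast_pct: float                      # blasts in blood or marrow (%)
    blasts_plus_promyelocytes_pct: float  # peripheral blasts + promyelocytes (%)
    basophil_pct: float                   # basophils in blood or marrow (%)
    platelets_per_ul: float               # platelet count (/µL), unrelated to therapy
    clonal_evolution: bool                # clonal cytogenetic evolution in Ph+ metaphases
    extramedullary_disease: bool

def classify_phase(f: Findings) -> str:
    """Classify CML phase per the IBMTR (BP) and modified MDACC (AP)
    criteria summarized in the text."""
    # Blast phase (IBMTR): >= 30% blasts or extramedullary disease.
    if f.blast_pct >= 30 or f.extramedullary_disease:
        return "blast phase"
    # Accelerated phase (modified MDACC): any single criterion suffices.
    if (15 <= f.blast_pct <= 29
            or f.blasts_plus_promyelocytes_pct >= 30
            or f.basophil_pct >= 20
            or f.platelets_per_ul <= 100_000
            or f.clonal_evolution):
        return "accelerated phase"
    return "chronic phase"

# Example: 5% blasts and no other high-risk findings -> chronic phase.
print(classify_phase(Findings(5, 8, 3, 350_000, False, False)))
```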

The defining feature of CML is the presence of the Ph chromosome abnormality. In a small subset of patients, additional chromosome abnormalities (ACA) in the Ph-positive cells may be identified at diagnosis. Some reports indicate that the presence of “major route” ACA (trisomy 8, isochromosome 17q, a second Ph chromosome, or trisomy 19) at diagnosis may negatively impact prognosis, but other reports contradict these findings.10,11


PCR assay. The typical BCR breakpoint in CML is the major breakpoint cluster region (M-BCR), which results in a 210-kDa protein (p210). Alternate breakpoints that are less frequently identified are the minor BCR (mBCR or p190), which is more commonly found in Ph-positive acute lymphoblastic leukemia (ALL), and the micro BCR (µBCR or p230), which is much less common and is often characterized by chronic neutrophilia.12 Identifying which BCR-ABL1 transcript is present in each patient using qualitative PCR is crucial in order to ensure proper monitoring during treatment.

The most sensitive method for detecting BCR-ABL1 mRNA transcripts is the quantitative real-time PCR (RQ-PCR) assay, which is typically done on peripheral blood. RQ-PCR is capable of detecting a single CML cell in the presence of ≥ 100,000 normal cells. This test should be done during the initial diagnostic workup in order to confirm the presence of BCR-ABL1 transcripts, and it is used as a standard method for monitoring response to TKI therapy.13 The International Scale (IS) is a standardized approach to reporting RQ-PCR results that was developed to allow comparison of results across various laboratories and has become the gold standard for reporting BCR-ABL1 transcript values.14
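
Because each laboratory's RQ-PCR assay differs, raw BCR-ABL1 to control-gene ratios are converted to the IS by multiplying by a laboratory-specific conversion factor. A minimal sketch of that arithmetic follows (Python); the function name and the conversion factor in the example are illustrative assumptions, since each laboratory derives its own factor against reference standards.

```python
def percent_is(bcr_abl1_copies: float,
               control_gene_copies: float,
               lab_conversion_factor: float) -> float:
    """Express a raw RQ-PCR ratio on the International Scale:
    %IS = (BCR-ABL1 / control gene) x 100 x conversion factor."""
    return bcr_abl1_copies / control_gene_copies * 100 * lab_conversion_factor

# Illustrative only: a hypothetical lab with conversion factor 0.8.
print(percent_is(120, 100_000, 0.8))  # -> 0.096 (%IS)
```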

Determining Risk Scores

Calculating a patient’s Sokal score or EURO risk score at diagnosis remains an important component of the diagnostic workup in CP-CML, as this information has prognostic and therapeutic implications (an online calculator is available through European LeukemiaNet [ELN]). The risk for disease progression to the accelerated or blast phases is higher in patients with intermediate or high risk scores compared to those with a low risk score at diagnosis. The risk of progression in intermediate- or high-risk patients is lower when a second-generation TKI (dasatinib, nilotinib, or bosutinib) is used as frontline therapy compared to imatinib, and therefore, the National Comprehensive Cancer Network (NCCN) CML Panel recommends starting with a second-generation TKI in these patients.15-19
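
For orientation, the Sokal score itself is a simple formula over age, spleen size, platelet count, and peripheral blast percentage. The sketch below (Python) uses the commonly cited Sokal (1984) coefficients and the conventional cutoffs (low < 0.8, intermediate 0.8 to 1.2, high > 1.2); it is provided for illustration only, and the ELN online calculator should be used in practice.

```python
import math

def sokal_score(age_years: float, spleen_cm: float,
                platelets_10e9_per_l: float, blast_pct: float) -> float:
    """Sokal relative risk; spleen size is cm below the costal margin
    and platelets are in units of 10^9/L. Coefficients as commonly cited."""
    return math.exp(
        0.0116 * (age_years - 43.4)
        + 0.0345 * (spleen_cm - 7.51)
        + 0.188 * ((platelets_10e9_per_l / 700) ** 2 - 0.563)
        + 0.0887 * (blast_pct - 2.10)
    )

def sokal_risk_group(score: float) -> str:
    if score < 0.8:
        return "low"
    return "intermediate" if score <= 1.2 else "high"

# Example: age 55, 4 cm spleen, platelets 450 x 10^9/L, 1% blasts.
s = sokal_score(55, 4, 450, 1)
print(f"{s:.2f} -> {sokal_risk_group(s)} risk")  # ~0.89 -> intermediate risk
```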


Monitoring Response to Therapy

After confirming a diagnosis of CML and selecting the most appropriate TKI for first-line therapy, the successful management of CML patients relies on close monitoring and follow-up to ensure they are meeting the desired treatment milestones. Responses in CML can be assessed based on hematologic parameters, cytogenetic results, and molecular responses. A complete hematologic response (CHR) implies complete normalization of peripheral blood counts (with the exception of TKI-induced cytopenias) and resolution of any palpable splenomegaly. The majority of patients will achieve a CHR within 4 to 6 weeks after initiating CML-directed therapy.20

Cytogenetic Response

Cytogenetic responses are defined by the decrease in the number of Ph chromosome–positive metaphases on bone marrow cytogenetics. A partial cytogenetic response (PCyR) is defined as 1% to 35% Ph-positive metaphases, a major cytogenetic response (MCyR) as 0% to 35% Ph-positive metaphases (ie, MCyR encompasses both partial and complete responses), and a complete cytogenetic response (CCyR) means that no Ph-positive metaphases are identified on bone marrow cytogenetics. An ideal response is the achievement of a PCyR after 3 months on a TKI and a CCyR after 12 months on a TKI.21
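
Because these categories are plain ranges over the percentage of Ph-positive metaphases, they can be expressed as a small lookup. A minimal sketch (Python), with a hypothetical function name:

```python
def cytogenetic_response(ph_positive_pct: float) -> str:
    """Map % Ph-positive metaphases on marrow cytogenetics to the
    response categories defined above; CCyR and PCyR are both MCyR."""
    if ph_positive_pct == 0:
        return "CCyR (and MCyR)"
    if ph_positive_pct <= 35:
        return "PCyR (and MCyR)"
    return "no MCyR"

print(cytogenetic_response(0))   # CCyR (and MCyR)
print(cytogenetic_response(20))  # PCyR (and MCyR)
print(cytogenetic_response(60))  # no MCyR
```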


Molecular Response

Once a patient has achieved a CCyR, monitoring their response to therapy can only be done using RQ-PCR to measure BCR-ABL1 transcripts in the peripheral blood. The NCCN and the ELN recommend monitoring RQ-PCR from the peripheral blood every 3 months in order to assess response to TKIs.19,22 As noted, the IS has become the gold standard reporting system for BCR-ABL1 transcript levels in the majority of laboratories worldwide.14,23 Molecular responses are based on a log reduction in BCR-ABL1 transcripts from a standardized baseline. Many molecular responses correlate with cytogenetic responses such that, if reliable RQ-PCR testing is available, monitoring can be done using only peripheral blood RQ-PCR rather than repeat bone marrow biopsies. For example, an early molecular response (EMR) is defined as an RQ-PCR value of ≤ 10% IS, which is approximately equivalent to a PCyR.24 A value of 1% IS is approximately equivalent to a CCyR. A major molecular response (MMR) is a ≥ 3-log reduction in BCR-ABL1 transcripts from baseline, corresponding to a value of ≤ 0.1% IS. Deeper levels of molecular response are best described by the log reduction in BCR-ABL1 transcripts, with a 4-log reduction denoted as MR4.0, a 4.5-log reduction as MR4.5, and so forth. Complete molecular response (CMR) is defined by the level of sensitivity of the RQ-PCR assay being used.14
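
The relationship between %IS values and named molecular responses is straightforward log arithmetic against the standardized 100% baseline, which the following sketch (Python) makes explicit. The labels mirror the definitions above, and undetectable transcripts are left to the assay's stated sensitivity (CMR).

```python
import math

def molecular_response(percent_is: float) -> str:
    """Label a %IS value per the definitions in the text, using the
    log10 reduction from the standardized 100% IS baseline."""
    if percent_is <= 0:
        return "undetectable (CMR, per assay sensitivity)"
    log_reduction = math.log10(100 / percent_is)
    if log_reduction >= 4.5:
        return "MR4.5"
    if log_reduction >= 4.0:
        return "MR4.0"
    if log_reduction >= 3.0:
        return "MMR (<= 0.1% IS)"
    return f"{log_reduction:.1f}-log reduction"

print(molecular_response(0.1))    # MMR (<= 0.1% IS): 3-log reduction
print(molecular_response(0.003))  # MR4.5
```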

The definition of relapsed disease in CML is dependent on the type of response the patient had previously achieved. Relapse could be the loss of a hematologic or cytogenetic response, but fluctuations in BCR-ABL1 transcripts on routine RQ-PCR do not necessarily indicate relapsed CML. A 1-log increase in the level of BCR-ABL1 transcripts with a concurrent loss of MMR should prompt a bone marrow biopsy in order to assess for the loss of CCyR, and thus a cytogenetic relapse; however, this loss of MMR does not define relapse in and of itself. In the setting of relapsed disease, testing should be done to look for possible ABL kinase domain mutations, and alternate therapy should be selected.19

Multiple reports have identified the prognostic relevance of achieving an EMR at 3 and 6 months after starting TKI therapy. Marin and colleagues reported that among 282 imatinib-treated patients, there was a significant improvement in 8-year OS, progression-free survival (PFS), and cumulative incidence of CCyR and CMR in patients who had BCR-ABL1 transcripts < 9.84% IS after 3 months on treatment.24 These data highlight the importance of early molecular monitoring in ensuring the best outcomes for patients with CP-CML.

The NCCN CML guidelines and ELN recommendations both agree that an ideal response after 3 months on a TKI is BCR-ABL1 transcripts < 10% IS, but treatment is not considered to be failing at this point if the patient marginally misses this milestone. After 6 months on treatment, an ideal response is BCR-ABL1 transcripts < 1% IS, with values between 1% and 10% IS still considered acceptable. Ideally, patients will have BCR-ABL1 transcripts < 0.1% IS by the time they complete 12 months of TKI therapy; a value < 1% IS at this point suggests that the patient has at least achieved a CCyR.19,22 Even after patients achieve these early milestones, frequent monitoring by RQ-PCR is required to ensure that they are maintaining their response to treatment. This helps to ensure patient compliance with treatment and also helps to identify the select subset of patients who could potentially be considered for an attempt at TKI cessation (not discussed in detail here) after a minimum of 3 years on therapy.19,25
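
Taken together, these milestones amount to a lookup keyed on time on therapy. The sketch below (Python) encodes the thresholds as summarized in this section; it is a restatement of the text for illustration, not a substitute for the NCCN or ELN guidelines, and the function name is hypothetical.

```python
def milestone_check(months_on_tki: int, percent_is: float) -> str:
    """Compare an RQ-PCR result (%IS) against the approximate
    NCCN/ELN milestones described in the text."""
    if months_on_tki <= 3:
        return "ideal" if percent_is < 10 else "milestone missed"
    if months_on_tki <= 6:
        if percent_is < 1:
            return "ideal"
        return "acceptable" if percent_is <= 10 else "milestone missed"
    # 12 months and beyond.
    if percent_is < 0.1:
        return "ideal (MMR)"
    return "suggests at least CCyR" if percent_is < 1 else "milestone missed"

print(milestone_check(3, 8.0))    # ideal
print(milestone_check(12, 0.05))  # ideal (MMR)
```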

Selecting First-line TKI Therapy

Selection of the most appropriate first-line TKI for newly diagnosed CP-CML patients requires incorporation of many patient-specific factors. These factors include baseline karyotype and confirmation of CP-CML through bone marrow biopsy, Sokal or EURO risk score, and a thorough patient history, including a clear understanding of the patient’s comorbidities. The adverse effect profile of all TKIs must be considered in conjunction with the patient’s ongoing medical issues in order to decrease the likelihood of worsening their current symptoms or causing a severe complication from TKI therapy.


Imatinib

The management of CML was revolutionized by the development and ultimate regulatory approval of imatinib mesylate in 2001. Imatinib was the first small-molecule cancer therapy developed and approved. It acts by binding to the adenosine triphosphate (ATP) binding site in the catalytic domain of BCR-ABL, thus inhibiting the oncoprotein’s tyrosine kinase activity.26

The International Randomized Study of Interferon versus STI571 (IRIS) trial was a randomized phase 3 study that compared imatinib 400 mg daily to interferon alfa (IFNa) plus cytarabine. More than 1000 CP-CML patients were randomly assigned 1:1 to either imatinib or IFNa plus cytarabine and were assessed for event-free survival, hematologic and cytogenetic responses, freedom from progression to AP or BP, and toxicity. Imatinib was superior to the prior standard of care for all of these outcomes.21 The long-term follow-up of the IRIS trial reported an 83% estimated 10-year OS and 79% estimated event-free survival for patients on the imatinib arm of this study.15 The cumulative rate of CCyR was 82.8%. Of the 204 imatinib-treated patients who could undergo a molecular response evaluation at 10 years, 93.1% had an MMR and 63.2% had an MR4.5, suggesting durable, deep molecular responses for many patients. The estimated 10-year rate of freedom from progression to AP or BP was 92.1%.

Higher doses of imatinib (600-800 mg daily) have been studied in an attempt to overcome resistance and improve cytogenetic and molecular response rates. The Tyrosine Kinase Inhibitor Optimization and Selectivity (TOPS) trial was a randomized phase 3 study that compared imatinib 800 mg daily to imatinib 400 mg daily. Although the 6-month assessments found increased rates of CCyR and MMR in the higher-dose imatinib arm, these differences were no longer present at the 12-month assessment. Furthermore, the higher dose of imatinib led to a significantly higher incidence of grade 3/4 hematologic adverse events, and approximately 50% of patients on imatinib 800 mg daily required a dose reduction to less than 600 mg daily because of toxicity.27

The Therapeutic Intensification in De Novo Leukaemia (TIDEL)-II study used plasma trough levels of imatinib on day 22 of treatment with imatinib 600 mg daily to determine whether patients should escalate the imatinib dose to 800 mg daily. Patients who did not meet molecular milestones at 3, 6, or 12 months were then managed by cohort: those in cohort 1 were dose escalated to imatinib 800 mg daily and subsequently switched to nilotinib 400 mg twice daily if they failed the same target 3 months later, while those in cohort 2 were switched directly to nilotinib. At 2 years, 73% of patients achieved MMR and 34% achieved MR4.5, suggesting that initial treatment with higher-dose imatinib, followed by a switch to nilotinib in those failing to achieve desired milestones, could be an effective strategy for managing newly diagnosed CP-CML.28

Toxicity. The standard starting dose of imatinib in CP-CML patients is 400 mg daily. The safety profile of imatinib is well established. In the IRIS trial, the most common adverse events (all grades, in decreasing order of frequency) were peripheral and periorbital edema (60%), nausea (50%), muscle cramps (49%), musculoskeletal pain (47%), diarrhea (45%), rash (40%), fatigue (39%), abdominal pain (37%), headache (37%), and joint pain (31%). Grade 3/4 liver enzyme elevation can occur in 5% of patients.29 In the event of severe liver toxicity or fluid retention, imatinib should be held until the event resolves. At that time, imatinib can be restarted if deemed appropriate, depending on the severity of the inciting event. Fluid retention can be managed with supportive care, diuretics, imatinib dose reduction, dose interruption, or imatinib discontinuation if the fluid retention is severe. Muscle cramps can be managed with calcium supplements or tonic water. Management of rash can include topical or systemic steroids, or in some cases imatinib dose reduction, interruption, or discontinuation.19


Grade 3/4 imatinib-induced hematologic toxicity is not uncommon, with 17% of patients experiencing neutropenia, 9% thrombocytopenia, and 4% anemia. These adverse events occurred most commonly during the first year of therapy, and the frequency decreased over time.15,29 Depending on the degree of cytopenias, imatinib dosing should be interrupted until recovery of the absolute neutrophil count or platelet count, and can often be resumed at 400 mg daily. However, if cytopenias recur, imatinib should be held and subsequently restarted at 300 mg daily.19
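
The cytopenia rule just described is mechanical enough to state as a short sketch (Python). The count thresholds used for "recovery" below are illustrative assumptions and are not specified in this article; the resumed doses are those given in the text.

```python
def manage_imatinib_cytopenia(anc_per_ul: float, platelets_per_ul: float,
                              recurrent: bool) -> str:
    """Sketch of the imatinib cytopenia rule summarized above. The ANC
    and platelet recovery thresholds (1000/µL and 50,000/µL) are
    assumptions for illustration only."""
    if anc_per_ul < 1000 or platelets_per_ul < 50_000:
        return "hold imatinib until count recovery"
    return ("resume imatinib 300 mg daily" if recurrent
            else "resume imatinib 400 mg daily")

print(manage_imatinib_cytopenia(800, 120_000, False))  # hold imatinib
print(manage_imatinib_cytopenia(1500, 150_000, True))  # resume 300 mg daily
```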

Dasatinib

Dasatinib is a second-generation TKI with regulatory approval for the treatment of adults with newly diagnosed CP-CML and for patients with CP-CML with resistance or intolerance to prior TKIs. In addition to inhibiting ABL kinases, dasatinib is a potent inhibitor of Src family kinases, and it has shown efficacy in patients who have developed imatinib-resistant ABL kinase domain mutations.

Dasatinib was initially approved as second-line therapy in patients with resistance or intolerance to imatinib. This indication was based on the results of the phase 3 CA180-034 trial, which ultimately identified dasatinib 100 mg daily as the optimal dose. In this trial, 74% of enrolled patients had resistance to imatinib and the remainder were intolerant. The 7-year follow-up of patients randomized to dasatinib 100 mg daily (n = 167) indicated that 46% achieved MMR while on study. Of the 124 imatinib-resistant patients on dasatinib 100 mg daily, the 7-year PFS was 39% and OS was 63%. In the 43 imatinib-intolerant patients, the 7-year PFS was 51% and OS was 70%.30

Dasatinib 100 mg daily was compared to imatinib 400 mg daily in newly diagnosed CP-CML patients in the randomized phase 3 DASISION (Dasatinib versus Imatinib Study in Treatment-Naive CML Patients) trial. More patients on the dasatinib arm achieved an EMR of BCR-ABL1 transcripts ≤ 10% IS after 3 months on treatment compared to imatinib (84% versus 64%). Furthermore, the 5-year follow-up reports that the cumulative incidence of MMR and MR4.5 in dasatinib-treated patients was 76% and 42%, and was 64% and 33% with imatinib (P = 0.0022 and P = 0.0251, respectively). Fewer patients treated with dasatinib progressed to AP or BP (4.6%) compared to imatinib (7.3%), but the estimated 5-year OS was similar between the 2 arms (91% for dasatinib versus 90% for imatinib).16 Regulatory approval for dasatinib as first-line therapy in newly diagnosed CML patients was based on results of the DASISION trial.

Toxicity. Most dasatinib-related toxicities are reported as grade 1 or grade 2, but grade 3/4 hematologic adverse events are fairly common. In the DASISION trial, grade 3/4 neutropenia, anemia, and thrombocytopenia occurred in 29%, 13%, and 22% of dasatinib-treated patients, respectively. Cytopenias can generally be managed with temporary dose interruptions or dose reductions.


During the 5-year follow-up of the DASISION trial, pleural effusions were reported in 28% of patients, most of which were grade 1/2. Effusions occurred at a rate of ≤ 8% per year, suggesting a stable incidence over time, and they appear to be dose-dependent.16 Depending on the severity, pleural effusion may be treated with diuretics, dose interruption, and, in some instances, steroids or thoracentesis. Typically, dasatinib can be restarted at 1 dose level lower than the previous dose once the effusion has resolved.19 Other, less common side effects of dasatinib include pulmonary hypertension (5% of patients), as well as abdominal pain, fluid retention, headaches, fatigue, musculoskeletal pain, rash, nausea, and diarrhea. Pulmonary hypertension is typically reversible after cessation of dasatinib, and thus dasatinib should be permanently discontinued once the diagnosis is confirmed. Fluid retention is often treated with diuretics and supportive care. Nausea and diarrhea are generally manageable and occur less frequently when dasatinib is taken with food and a large glass of water. Antiemetics and antidiarrheals can be used as needed. Troublesome rash is best managed with topical or systemic steroids as well as possible dose reduction or dose interruption.16,19 In the DASISION trial, adverse events led to therapy discontinuation more often in the dasatinib group than in the imatinib group (16% versus 7%).16 Bleeding, particularly in the setting of thrombocytopenia, has been reported in patients treated with dasatinib as a result of the drug-induced reversible inhibition of platelet aggregation.31

Nilotinib

The structure of nilotinib is similar to that of imatinib; however, it has a markedly increased affinity for the ATP‐binding site on the BCR-ABL1 protein. It was initially given regulatory approval in the setting of imatinib failure. Nilotinib was studied at a dose of 400 mg twice daily in 321 patients who were imatinib-resistant or -intolerant. It proved to be highly effective at inducing cytogenetic remissions in the second-line setting, with 59% of patients achieving a MCyR and 45% achieving a CCyR. With a median follow-up time of 4 years, the OS was 78%.32 

Nilotinib gained regulatory approval for use as a first-line TKI after completion of the randomized phase 3 ENESTnd (Evaluating Nilotinib Efficacy and Safety in Clinical Trials-Newly Diagnosed Patients) trial. ENESTnd was a 3-arm study comparing nilotinib 300 mg twice daily, nilotinib 400 mg twice daily, and imatinib 400 mg daily in newly diagnosed, previously untreated patients with CP-CML. The primary endpoint of this clinical trial was the rate of MMR at 12 months.33 Nilotinib surpassed imatinib in this regard, with MMR at 12 months achieved by 44% of patients on nilotinib 300 mg twice daily and 43% of those on nilotinib 400 mg twice daily, versus 22% of imatinib-treated patients (P < 0.001 for both comparisons). Furthermore, the rate of CCyR by 12 months was significantly higher for both nilotinib arms compared with imatinib (80% for nilotinib 300 mg, 78% for nilotinib 400 mg, and 65% for imatinib) (P < 0.001).12 Based on these data, nilotinib 300 mg twice daily was chosen as the standard dose of nilotinib in the first-line setting. After 5 years of follow-up on the ENESTnd study, there were fewer progressions to AP- or BP-CML in nilotinib-treated patients compared with imatinib. MMR was achieved in 77% of nilotinib 300 mg patients compared with 60.4% of patients on the imatinib arm. MR4.5 was also more common in patients treated with nilotinib 300 mg twice daily, with a rate of 53.5% at 5 years versus 31.4% in the imatinib arm.17 In spite of the deeper cytogenetic and molecular responses achieved with nilotinib, this did not translate into a significant improvement in OS. The 5-year OS rate was 93.7% in nilotinib 300 mg patients versus 91.7% in imatinib-treated patients, and this difference lacked statistical significance.17

Toxicity. Although some similarities exist between the toxicity profiles of nilotinib and imatinib, each drug has some distinct adverse events. On the ENESTnd trial, the rate of any grade 3/4 non-hematologic adverse event was fairly low; however, lower-grade toxicities were not uncommon. Patients treated with nilotinib 300 mg twice daily experienced rash (31%), headache (14%), pruritus (15%), and fatigue (11%) most commonly. The most frequently reported laboratory abnormalities included increased total bilirubin (53%), hypophosphatemia (32%), hyperglycemia (36%), elevated lipase (24%), increased alanine aminotransferase (ALT; 66%), and increased aspartate aminotransferase (AST; 40%). Any grade of neutropenia, thrombocytopenia, or anemia occurred at rates of 43%, 48%, and 38%, respectively.33 Although nilotinib has a Black Box Warning from the US Food and Drug Administration for QT interval prolongation, no patients on the ENESTnd trial experienced a corrected QT (QTc) interval greater than 500 msec.12

More recent concerns have emerged regarding the potential for cardiovascular toxicity after long-term use of nilotinib. The 5-year update of ENESTnd reported cardiovascular events, including ischemic heart disease, ischemic cerebrovascular events, and peripheral arterial disease, occurring in 7.5% of patients treated with nilotinib 300 mg twice daily, compared with a rate of 2.1% in imatinib-treated patients. The frequency of these cardiovascular events increased linearly over time in both arms. Elevations in total cholesterol from baseline occurred in 27.6% of nilotinib patients compared with 3.9% of imatinib patients. Furthermore, clinically meaningful increases in low-density lipoprotein cholesterol and glycated hemoglobin occurred more frequently with nilotinib therapy.17


Nilotinib should be taken on an empty stomach; therefore, patients should be made aware of the need to fast for 2 hours prior to each dose and 1 hour after each dose. Given the potential risk of QT interval prolongation, a baseline electrocardiogram (ECG) is recommended prior to initiating treatment to ensure the QT interval is within a normal range. A repeat ECG should be done approximately 7 days after nilotinib initiation to ensure no prolongation of the QT interval after starting. Close monitoring of potassium and magnesium levels is important to decrease the risk of cardiac arrhythmias, and concomitant use of drugs considered strong CYP3A4 inhibitors should be avoided.19

If the patient experiences any grade 3 or higher laboratory abnormalities, nilotinib should be held until resolution of the toxicity, and then restarted at a lower dose. Similarly, if patients develop significant neutropenia or thrombocytopenia, nilotinib doses should be interrupted until resolution of the cytopenias. At that point, nilotinib can be reinitiated at either the same or a lower dose. Rash can be managed by the use of topical or systemic steroids as well as potential dose reduction, interruption, or discontinuation.

Given the concerns for potential cardiovascular events with long-term use of nilotinib, caution is advised when prescribing it to any patient with a history of cardiovascular disease or peripheral arterial occlusive disease. At the first sign of new occlusive disease, nilotinib should be discontinued.19


Bosutinib

Bosutinib is a second-generation BCR-ABL TKI with activity against the Src family of kinases; it was initially approved to treat patients with CP-, AP-, or BP-CML after resistance or intolerance to imatinib. Long-term data have been reported from the phase 1/2 trial of bosutinib in patients with CP-CML who developed resistance or intolerance to imatinib plus dasatinib and/or nilotinib. A total of 119 patients were included in the 4-year follow-up: 38 were resistant/intolerant to imatinib and resistant to dasatinib, 50 were resistant/intolerant to imatinib and intolerant to dasatinib, 26 were resistant/intolerant to imatinib and resistant to nilotinib, and 5 were resistant/intolerant to imatinib and intolerant to nilotinib or resistant/intolerant to dasatinib and nilotinib. Bosutinib 400 mg daily was studied in this setting. Of the 38 patients with imatinib resistance/intolerance and dasatinib resistance, 39% achieved MCyR, 22% achieved CCyR, and the OS was 67%. Of the 50 patients with imatinib resistance/intolerance and dasatinib intolerance, 42% achieved MCyR, 40% achieved CCyR, and the OS was 80%. Finally, in the 26 patients with imatinib resistance/intolerance and nilotinib resistance, 38% achieved MCyR, 31% achieved CCyR, and the OS was 87%.34

Five-year follow-up from the phase 1/2 clinical trial that studied bosutinib 500 mg daily in CP-CML patients after imatinib failure reported data on 284 patients. By 5 years on study, 60% of patients had achieved MCyR and 50% achieved CCyR with a 71% and 69% probability, respectively, of maintaining these responses at 5 years. The 5-year OS was 84%.35 These data led to the regulatory approval of bosutinib 500 mg daily as second-line or later therapy.


Bosutinib was initially studied in the first-line setting in the randomized phase 3 BELA (Bosutinib Efficacy and Safety in Newly Diagnosed Chronic Myeloid Leukemia) trial. This trial compared bosutinib 500 mg daily to imatinib 400 mg daily in newly diagnosed, previously untreated CP-CML patients. This trial failed to meet its primary endpoint of increased rate of CCyR at 12 months, with 70% of bosutinib patients achieving this response, compared to 68% of imatinib-treated patients (P = 0.601). In spite of this, the rate of MMR at 12 months was significantly higher in the bosutinib arm (41%) compared to the imatinib arm (27%; P = 0.001).36

A second phase 3 trial (BFORE) was designed to study bosutinib 400 mg daily versus imatinib in newly diagnosed, previously untreated CP-CML patients. This study enrolled 536 patients who were randomly assigned 1:1 to bosutinib or imatinib. The primary endpoint of this trial was the rate of MMR at 12 months. A significantly higher proportion of bosutinib-treated patients achieved this response (47.2%) compared with imatinib-treated patients (36.9%; P = 0.02). Furthermore, by 12 months, 77.2% of patients on the bosutinib arm had achieved CCyR compared with 66.4% on the imatinib arm, and this difference met statistical significance (P = 0.0075). A lower rate of progression to AP- or BP-CML was noted in bosutinib-treated patients as well (1.6% versus 2.5%). Based on these data, bosutinib gained regulatory approval for first-line therapy in CP-CML at a dose of 400 mg daily.18

Toxicity. On the BFORE trial, the most common treatment-emergent adverse events of any grade reported in the bosutinib-treated patients were diarrhea (70.1%), nausea (35.1%), increased ALT (30.6%), and increased AST (22.8%). Musculoskeletal pain or spasms occurred in 29.5% of patients, rash in 19.8%, fatigue in 19.4%, and headache in 18.7%. Hematologic toxicity was also reported, but most was grade 1/2. Thrombocytopenia was reported in 35.1%, anemia in 18.7%, and neutropenia in 11.2%.18

Cardiovascular events occurred in 5.2% of patients on the bosutinib arm of the BFORE trial, which was similar to the rate observed in imatinib patients. The most common cardiovascular event was QT interval prolongation, which occurred in 1.5% of patients. Pleural effusions were reported in 1.9% of patients treated with bosutinib, and none were grade 3 or higher.18

If liver enzyme elevation occurs at a value greater than 5 times the institutional upper limit of normal (ULN), bosutinib should be held until the level recovers to ≤ 2.5 times the ULN, at which point bosutinib can be restarted at a lower dose. If recovery takes longer than 4 weeks, bosutinib should be permanently discontinued. Liver enzymes elevated greater than 3 times the ULN with a concurrent elevation in total bilirubin to more than 2 times the ULN are consistent with Hy’s law, and bosutinib should be discontinued. Although diarrhea is the most common toxicity associated with bosutinib, it is commonly low grade and transient. Diarrhea occurs most frequently in the first few days after initiating bosutinib. It can often be managed with over-the-counter antidiarrheal medications, but if the diarrhea is grade 3 or higher, bosutinib should be held until recovery to grade 1 or lower. Gastrointestinal side effects may be improved by taking bosutinib with a meal and a large glass of water. Fluid retention can be managed with diuretics and supportive care. Finally, if rash occurs, this can be addressed with topical or systemic steroids as well as bosutinib dose reduction, interruption, or discontinuation.19
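
The hepatotoxicity rules in this paragraph likewise reduce to a few comparisons against the institutional ULN. The sketch below (Python) transcribes them for illustration; inputs are expressed as multiples of the ULN, and the function name is hypothetical.

```python
from typing import Optional

def manage_bosutinib_hepatotoxicity(transaminase_x_uln: float,
                                    bilirubin_x_uln: float,
                                    weeks_to_recovery: Optional[float]) -> str:
    """Transcription of the bosutinib liver-enzyme rules in the text."""
    # Hy's law: transaminases > 3x ULN with bilirubin > 2x ULN.
    if transaminase_x_uln > 3 and bilirubin_x_uln > 2:
        return "discontinue bosutinib (consistent with Hy's law)"
    if transaminase_x_uln > 5:
        if weeks_to_recovery is not None and weeks_to_recovery > 4:
            return "permanently discontinue bosutinib"
        return "hold until <= 2.5x ULN, then restart at a lower dose"
    return "continue bosutinib with monitoring"

print(manage_bosutinib_hepatotoxicity(6.2, 1.0, None))  # hold, restart lower
```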


Similar to other TKIs, if bosutinib-induced cytopenias occur, treatment should be held and restarted at the same or a lower dose upon blood count recovery.19

Ponatinib

The most common cause of TKI resistance in CP-CML is the development of ABL kinase domain mutations. The majority of imatinib-resistant mutations can be overcome by the use of second-generation TKIs, including dasatinib, nilotinib, or bosutinib. However, ponatinib is the only BCR-ABL TKI able to overcome a T315I mutation. The phase 2 PACE (Ponatinib Ph-positive ALL and CML Evaluation) trial enrolled patients with CP-, AP-, or BP-CML, as well as patients with Ph-positive ALL, who were resistant or intolerant to nilotinib or dasatinib, or who had evidence of a T315I mutation. The starting dose of ponatinib on this trial was 45 mg daily.37 The PACE trial enrolled 267 patients with CP-CML: 203 with resistance or intolerance to nilotinib or dasatinib, and 64 with a T315I mutation. The primary endpoint in the CP cohort was the rate of MCyR at any time within 12 months of starting ponatinib. The overall rate of MCyR by 12 months in the CP-CML patients was 56%. In those with a T315I mutation, 70% achieved MCyR, which compared favorably with those with resistance or intolerance to nilotinib or dasatinib, 51% of whom achieved MCyR. CCyR was achieved in 46% of CP-CML patients (40% in the resistant/intolerant cohort and 66% in the T315I cohort). In general, patients with T315I mutations had received fewer prior therapies than those in the resistant/intolerant cohort, which likely contributed to the higher response rates in the T315I patients. MR4.5 was achieved in 15% of CP-CML patients by 12 months on the PACE trial.37 The 5-year update to this study reported that 60%, 40%, and 24% of CP-CML patients achieved MCyR, MMR, and MR4.5, respectively. In patients who achieved MCyR, the probability of maintaining this response for 5 years was 82% and the estimated 5-year OS was 73%.38

Toxicity. In 2013, after the regulatory approval of ponatinib, reports became available that the drug can cause an increase in arterial occlusive events, including fatal myocardial infarctions and cerebrovascular accidents. For this reason, dose reductions were implemented in patients who were deriving clinical benefit from ponatinib. In spite of these dose reductions, ≥ 90% of responders maintained their response for up to 40 months.38 Although the likelihood of developing an arterial occlusive event appears higher in the first year after starting ponatinib than in later years, the cumulative incidence of events continues to increase. The 5-year follow-up to the PACE trial reports 31% of patients experiencing any grade of arterial occlusive event while on ponatinib. Aside from these events, the most common treatment-emergent adverse events in ponatinib-treated patients on the PACE trial included rash (47%), abdominal pain (46%), headache (43%), dry skin (42%), constipation (41%), and hypertension (37%). Hematologic toxicity was also common, with 46% of patients experiencing any grade of thrombocytopenia, 20% experiencing neutropenia, and 20% anemia.38

Patients receiving ponatinib therapy should be monitored closely for any evidence of arterial or venous thrombosis. If an occlusive event occurs, ponatinib should be discontinued. Similarly, in the setting of any new or worsening heart failure symptoms, ponatinib should be promptly discontinued. Management of any underlying cardiovascular risk factors, including hypertension, hyperlipidemia, diabetes, or smoking history, is recommended, and these patients should be referred to a cardiologist for a full evaluation. In the absence of any contraindications to aspirin, low-dose aspirin should be considered as a means of decreasing cardiovascular risks associated with ponatinib. In patients with known risk factors, a ponatinib starting dose of 30 mg daily rather than the standard 45 mg daily may be a safer option, resulting in fewer arterial occlusive events, although the efficacy of this dose is still being studied in comparison to 45 mg daily.19

If ponatinib-induced transaminitis greater than 3 times the upper limit of normal occurs, ponatinib should be held until resolution to less than 3 times the upper limit of normal, at which point it should be resumed at a lower dose. Similarly, in the setting of elevated serum lipase or symptomatic pancreatitis, ponatinib should be held and restarted at a lower dose after resolution of symptoms.19


In the event of neutropenia or thrombocytopenia, ponatinib should be held until blood count recovery and then restarted at the same dose. If cytopenias occur for a second time, the dose of ponatinib should be lowered at the time of treatment reinitiation. If rash occurs, it can be addressed with topical or systemic steroids as well as dose reduction, interruption, or discontinuation.19

Conclusion

With the development of imatinib and the subsequent TKIs (dasatinib, nilotinib, bosutinib, and ponatinib), CP-CML has become a chronic disease with a life expectancy that is similar to that of the general population. Given the successful treatments available for these patients, it is crucial to identify patients with this diagnosis; ensure they receive a complete, appropriate diagnostic workup, including a bone marrow biopsy and aspiration with cytogenetic testing; and select the best therapy for each individual patient. Once on treatment, the importance of frequent monitoring cannot be overstated. This is the only way to be certain that patients are achieving the desired treatment milestones that correlate with the favorable long-term outcomes observed with TKI-based treatment of CP-CML.

Corresponding author: Kendra Sweet, MD, MS, Department of Malignant Hematology, Moffitt Cancer Center, Tampa, FL.

Financial disclosures: Dr. Sweet has served on the Advisory Board and Speakers Bureau of Novartis, Bristol-Myers Squibb, Ariad Pharmaceuticals, and Pfizer, and has served as a consultant to Pfizer.

From the Moffitt Cancer Center, Tampa, FL.

Abstract

  • Objective: To outline the approach to selecting a tyrosine kinase inhibitor (TKI) for initial treatment of chronic myeloid leukemia (CML) and monitoring patients following initiation of therapy.
  • Methods: Review of the literature and evidence-based guidelines.
  • Results: The development and availability of TKIs has improved survival for patients diagnosed with CML. The life expectancy of patients diagnosed with chronic-phase CML (CP-CML) is similar to that of the general population, provided they receive appropriate TKI therapy and adhere to treatment. Selection of the most appropriate first-line TKI for newly diagnosed CP-CML requires incorporation of the patient’s baseline karyotype and Sokal or EURO risk score, and a clear understanding of the patient’s comorbidities. The adverse effect profile of all TKIs must be considered in conjunction with the patient’s ongoing medical issues to decrease the likelihood of worsening their current symptoms or causing a severe complication from TKI therapy. After confirming a diagnosis of CML and selecting the most appropriate TKI for first-line therapy, close monitoring and follow-up are necessary to ensure patients are meeting the desired treatment milestones. Responses in CML can be assessed based on hematologic parameters, cytogenetic results, and molecular responses.
  • Conclusion: Given the successful treatments available for patients with CML, it is crucial to identify patients with this diagnosis; ensure they receive a complete, appropriate diagnostic workup including a bone marrow biopsy and aspiration with cytogenetic testing; and select the best therapy for each individual patient.

Keywords: chronic myeloid leukemia; CML; tyrosine kinase inhibitor; TKI; cancer; BCR-ABL protein.

Chronic myeloid leukemia (CML) is a rare myeloproliferative neoplasm that is characterized by the presence of the Philadelphia (Ph) chromosome and uninhibited expansion of bone marrow stem cells. The Ph chromosome arises from a reciprocal translocation between the Abelson (ABL) region on chromosome 9 and the breakpoint cluster region (BCR) of chromosome 22 (t(9;22)(q34;q11.2)), resulting in the BCR-ABL1 fusion gene and its protein product, BCR-ABL tyrosine kinase.1 BCR-ABL has constitutive tyrosine kinase activity that promotes growth, replication, and survival of hematopoietic cells through downstream pathways, which is the driving factor in the pathogenesis of CML.1

CML is divided into 3 phases based on the number of myeloblasts observed in the blood or bone marrow: chronic, accelerated, and blast. Most cases of CML are diagnosed in the chronic phase (CP), which is marked by proliferation of primarily the myeloid element.

Typical treatment for CML involves lifelong use of oral BCR-ABL tyrosine kinase inhibitors (TKIs). Currently, 5 TKIs have regulatory approval for treatment of this disease. The advent of TKIs, a class of small molecules targeting the tyrosine kinases, particularly the BCR-ABL tyrosine kinase, led to rapid changes in the management of CML and improved survival for patients. Patients diagnosed with chronic-phase CML (CP-CML) now have a life expectancy that is similar to that of the general population, as long as they receive appropriate TKI therapy and adhere to treatment. As such, it is crucial to identify patients with CML; ensure they receive a complete, appropriate diagnostic workup; and select the best therapy for each patient.

Epidemiology

According to SEER data estimates, 8430 new cases of CML were diagnosed in the United States in 2018. CML is a disease of older adults, with a median age of 65 years at diagnosis, and there is a slight male predominance. Between 2011 and 2015, the number of new CML cases was 1.8 per 100,000 persons. The median overall survival (OS) in patients with newly diagnosed CP-CML has not been reached.2 Given the effective treatments available for managing CML, it is estimated that the prevalence of CML in the United States will plateau at 180,000 patients by 2050.3

 

 

Diagnosis

Clinical Features

The diagnosis of CML is often suspected based on an incidental finding of leukocytosis and, in some cases, thrombocytosis. In many cases, this is an incidental finding on routine blood work, but approximately 50% of patients will present with constitutional symptoms associated with the disease. Characteristic features of the white blood cell differential include left-shifted maturation with neutrophilia and immature circulating myeloid cells. Basophilia and eosinophilia are often present as well. Splenomegaly is a common sign, present in 50% to 90% of patients at diagnosis. In those patients with symptoms related to CML at diagnosis, the most common presentation includes increasing fatigue, fevers, night sweats, early satiety, and weight loss. The diagnosis is confirmed by cytogenetic studies showing the Ph chromosome abnormality, t(9; 22)(q3.4;q1.1), and/or reverse transcriptase polymerase chain reaction (PCR) showing BCR-ABL1 transcripts.

Testing

Bone marrow biopsy. There are 3 distinct phases of CML: CP, accelerated phase (AP), and blast phase (BP). Bone marrow biopsy and aspiration at diagnosis are mandatory in order to determine the phase of the disease at diagnosis. This distinction is based on the percentage of blasts, promyelocytes, and basophils present as well as the platelet count and presence or absence of extramedullary disease.4 The vast majority of patients at diagnosis have CML that is in the chronic phase. The typical appearance in CP-CML is a hypercellular marrow with granulocytic and occasionally megakaryocytic hyperplasia. In many cases, basophilia and/or eosinophilia are noted as well. Dysplasia is not a typical finding in CML.5 Bone marrow fibrosis can be seen in up to one-third of patients at diagnosis, and may indicate a slightly worse prognosis.6 Although a diagnosis of CML can be made without a bone marrow biopsy, complete staging and prognostication are only possible with information gained from this test, including baseline karyotype and confirmation of CP versus a more advanced phase of CML.

Diagnostic criteria. The criteria for diagnosing AP-CML has not been agreed upon by various groups, but the modified MD Anderson Cancer Center (MDACC) criteria are used in the majority of clinical trials evaluating the efficacy of TKIs in preventing progression to advanced phases of CML. MDACC criteria define AP-CML as the presence of 1 of the following: 15% to 29% blasts in the peripheral blood or bone marrow, ≥ 30% peripheral blasts plus promyelocytes, ≥ 20% basophils in the blood or bone marrow, platelet count ≤ 100,000/μL unrelated to therapy, and clonal cytogenetic evolution in Ph-positive metaphases (Table).7

Diagnostic Criteria for Chronic Myeloid Leukemia


BP-CML is typically defined using the criteria developed by the International Bone Marrow Transplant Registry (IBMTR): ≥ 30% blasts in the peripheral blood and/or the bone marrow or the presence of extramedullary disease.8 Although not typically used in clinical trials, the revised World Health Organization (WHO) criteria for BP-CML include ≥ 20% blasts in the peripheral blood or bone marrow, extramedullary blast proliferation, and large foci or clusters of blasts in the bone marrow biopsy sample (Table).9

The defining feature of CML is the presence of the Ph chromosome abnormality. In a small subset of patients, additional chromosome abnormalities (ACA) in the Ph-positive cells may be identified at diagnosis. Some reports indicate that the presence of “major route” ACA (trisomy 8, isochromosome 17q, a second Ph chromosome, or trisomy 19) at diagnosis may negatively impact prognosis, but other reports contradict these findings.10,11

 

 

PCR assay. The typical BCR breakpoint in CML is the major breakpoint cluster region (M-BCR), which results in a 210-kDa protein (p210). Alternate breakpoints that are less frequently identified are the minor BCR (mBCR or p190), which is more commonly found in Ph-positive acute lymphoblastic leukemia (ALL), and the micro BCR (µBCR or p230), which is much less common and is often characterized by chronic neutrophilia.12 Identifying which BCR-ABL1 transcript is present in each patient using qualitative PCR is crucial in order to ensure proper monitoring during treatment.

The most sensitive method for detecting BCR-ABL1 mRNA transcripts is the quantitative real-time PCR (RQ-PCR) assay, which is typically done on peripheral blood. RQ-PCR is capable of detecting a single CML cell in the presence of ≥ 100,000 normal cells. This test should be done during the initial diagnostic workup in order to confirm the presence of BCR-ABL1 transcripts, and it is used as a standard method for monitoring response to TKI therapy.13 The International Scale (IS) is a standardized approach to reporting RQ-PCR results that was developed to allow comparison of results across various laboratories and has become the gold standard for reporting BCR-ABL1 transcript values.14

Determining Risk Scores

Calculating a patient’s Sokal score or EURO risk score at diagnosis remains an important component of the diagnostic workup in CP-CML, as this information has prognostic and therapeutic implications (an online calculator is available through European LeukemiaNet [ELN]). The risk for disease progression to the accelerated or blast phases is higher in patients with intermediate or high risk scores compared to those with a low risk score at diagnosis. The risk of progression in intermediate- or high-risk patients is lower when a second-generation TKI (dasatinib, nilotinib, or bosutinib) is used as frontline therapy compared to imatinib, and therefore, the National Comprehensive Cancer Network (NCCN) CML Panel recommends starting with a second-generation TKI in these patients.15-19

 

Monitoring Response to Therapy

After confirming a diagnosis of CML and selecting the most appropriate TKI for first-line therapy, the successful management of CML patients relies on close monitoring and follow-up to ensure they are meeting the desired treatment milestones. Responses in CML can be assessed based on hematologic parameters, cytogenetic results, and molecular responses. A complete hematologic response (CHR) implies complete normalization of peripheral blood counts (with the exception of TKI-induced cytopenias) and resolution of any palpable splenomegaly. The majority of patients will achieve a CHR within 4 to 6 weeks after initiating CML-directed therapy.20

Cytogenetic Response

Cytogenetic responses are defined by the decrease in the number of Ph chromosome–positive metaphases when assessed on bone marrow cytogenetics. A partial cytogenetic response (PCyR) is defined as having 1% to 35% Ph-positive metaphases, a major cytogenetic response (MCyR) as having 0% to 35% Ph-positive metaphases, and a complete cytogenetic response (CCyR) implies that no Ph-positive metaphases are identified on bone marrow cytogenetics. An ideal response is the achievement of PCyR after 3 months on a TKI and a CCyR after 12 months on a TKI.21

 

 

Molecular Response

Once a patient has achieved a CCyR, monitoring their response to therapy can only be done using RQ-PCR to measure BCR-ABL1 transcripts in the peripheral blood. The NCCN and the ELN recommend monitoring RQ-PCR from the peripheral blood every 3 months in order to assess response to TKIs.19,22 As noted, the IS has become the gold standard reporting system for all BCR-ABL1 transcript levels in the majority of laboratories worldwide.14,23 Molecular responses are based on a log reduction in BCR-ABL1 transcripts from a standardized baseline. Many molecular responses can be correlated with cytogenetic responses such that, if reliable RQ-PCR testing is available, monitoring can be done using only peripheral blood RQ-PCR rather than repeat bone marrow biopsies. For example, an early molecular response (EMR) is defined as a RQ-PCR value of ≤ 10% IS, which is approximately equivalent to a PCyR.24 A value of 1% IS is approximately equivalent to a CCyR. A major molecular response (MMR) is a ≥ 3-log reduction in BCR-ABL1 transcripts from baseline and is a value of ≤ 0.1% IS. Deeper levels of molecular response are best described by the log reduction in BCR-ABL1 transcripts, with a 4-log reduction denoted as MR4.0, a 4.5-log reduction as MR4.5, and so forth. Complete molecular response (CMR) is defined by the level of sensitivity of the RQ-PCR assay being used.14

The definition of relapsed disease in CML is dependent on the type of response the patient had previously achieved. Relapse could be the loss of a hematologic or cytogenetic response, but fluctuations in BCR-ABL1 transcripts on routine RQ-PCR do not necessarily indicate relapsed CML. A 1-log increase in the level of BCR-ABL1 transcripts with a concurrent loss of MMR should prompt a bone marrow biopsy in order to assess for the loss of CCyR, and thus a cytogenetic relapse; however, this loss of MMR does not define relapse in and of itself. In the setting of relapsed disease, testing should be done to look for possible ABL kinase domain mutations, and alternate therapy should be selected.19

Multiple reports have identified the prognostic relevance of achieving an EMR at 3 and 6 months after starting TKI therapy. Marin and colleagues reported that in 282 imatinib-treated patients, there was a significant improvement in 8-year OS, progression-free survival (PFS), and cumulative incidence of CCyR and CMR in patients who had BCR-ABL1 transcripts < 9.84% IS after 3 months on treatment.24 This data highlights the importance of early molecular monitoring in order to ensure the best outcomes for patients with CP-CML.

The NCCN CML guidelines and ELN recommendations both agree that an ideal response after 3 months on a TKI is BCR-ABL1 transcripts < 10% IS, but treatment is not considered to be failing at this point if the patient marginally misses this milestone. After 6 months on treatment, an ideal response is considered BCR-ABL1 transcripts < 1%–10% IS. Ideally, patients will have BCR-ABL1 transcripts < 0.1%–1% IS by the time they complete 12 months of TKI therapy, suggesting that these patients have at least achieved a CCyR.19,22 Even after patients achieve these early milestones, frequent monitoring by RQ-PCR is required to ensure that they are maintaining their response to treatment. This will help to ensure patient compliance with treatment and will also help to identify a select subset of patients who could potentially be considered for an attempt at TKI cessation (not discussed in detail here) after a minimum of 3 years on therapy.19,25

Selecting First-line TKI Therapy

Selection of the most appropriate first-line TKI for newly diagnosed CP-CML patients requires incorporation of many patient-specific factors. These factors include baseline karyotype and confirmation of CP-CML through bone marrow biopsy, Sokal or EURO risk score, and a thorough patient history, including a clear understanding of the patient’s comorbidities. The adverse effect profile of all TKIs must be considered in conjunction with the patient’s ongoing medical issues in order to decrease the likelihood of worsening their current symptoms or causing a severe complication from TKI therapy.

 

 

Imatinib

The management of CML was revolutionized by the development and ultimate regulatory approval of imatinib mesylate in 2001. Imatinib was the first small-molecule cancer therapy developed and approved. It acts by binding to the adenosine triphosphate (ATP) binding site in the catalytic domain of BCR-ABL, thus inhibiting the oncoprotein’s tyrosine kinase activity.26

The International Randomized Study of Interferon versus STI571 (IRIS) trial was a randomized phase 3 study that compared imatinib 400 mg daily to interferon alfa (IFNa) plus cytarabine. More than 1000 CP-CML patients were randomly assigned 1:1 to either imatinib or IFNa plus cytarabine and were assessed for event-free survival, hematologic and cytogenetic responses, freedom from progression to AP or BP, and toxicity. Imatinib was superior to the prior standard of care for all these outcomes.21 The long-term follow-up of the IRIS trial reported an 83% estimated 10-year OS and 79% estimated event-free survival for patients on the imatinib arm of this study.15 The cumulative rate of CCyR was 82.8%. Of the 204 imatinib-treated patients who could undergo a molecular response evaluation at 10 years, 93.1% had a MMR and 63.2% had a MR4.5, suggesting durable, deep molecular responses for many patients. The estimated 10-year rate of freedom from progression to AP or BP was 92.1%.

Higher doses of imatinib (600-800 mg daily) have been studied in an attempt to overcome resistance and improve cytogenetic and molecular response rates. The Tyrosine Kinase Inhibitor Optimization and Selectivity (TOPS) trial was a randomized phase 3 study that compared imatinib 800 mg daily to imatinib 400 mg daily. Although the 6-month assessments found increased rates of CCyR and a MMR in the higher-dose imatinib arm, these differences were no longer present at the 12-month assessment. Furthermore, the higher dose of imatinib led to a significantly higher incidence of grade 3/4 hematologic adverse events, and approximately 50% of patients on imatinib 800 mg daily required a dose reduction to less than 600 mg daily because of toxicity.27

The Therapeutic Intensification in De Novo Leukaemia (TIDEL)-II study used plasma trough levels of imatinib on day 22 of treatment with imatinib 600 mg daily to determine whether patients should escalate to imatinib 800 mg daily. Patients in cohort 1 who did not meet molecular milestones at 3, 6, or 12 months were dose escalated to imatinib 800 mg daily and were subsequently switched to nilotinib 400 mg twice daily if they failed the same target 3 months later; patients in cohort 2 who missed a milestone were switched directly to nilotinib. At 2 years, 73% of patients achieved MMR and 34% achieved MR4.5, suggesting that initial treatment with higher-dose imatinib, followed by a switch to nilotinib in those failing to achieve desired milestones, could be an effective strategy for managing newly diagnosed CP-CML.28

Toxicity. The standard starting dose of imatinib in CP-CML patients is 400 mg daily. The safety profile of imatinib has been very well established. In the IRIS trial, the most common adverse events (all grades in decreasing order of frequency) were peripheral and periorbital edema (60%), nausea (50%), muscle cramps (49%), musculoskeletal pain (47%), diarrhea (45%), rash (40%), fatigue (39%), abdominal pain (37%), headache (37%), and joint pain (31%). Grade 3/4 liver enzyme elevation can occur in 5% of patients.29 In the event of severe liver toxicity or fluid retention, imatinib should be held until the event resolves. At that time, imatinib can be restarted if deemed appropriate, but this is dependent on the severity of the inciting event. Fluid retention can be managed by the use of supportive care, diuretics, imatinib dose reduction, dose interruption, or imatinib discontinuation if the fluid retention is severe. Muscle cramps can be managed by the use of calcium supplements or tonic water. Management of rash can include topical or systemic steroids, or in some cases imatinib dose reduction, interruption, or discontinuation.19


Grade 3/4 imatinib-induced hematologic toxicity is not uncommon, with 17% of patients experiencing neutropenia, 9% thrombocytopenia, and 4% anemia. These adverse events occurred most commonly during the first year of therapy, and the frequency decreased over time.15,29 Depending on the degree of cytopenias, imatinib dosing should be interrupted until recovery of the absolute neutrophil count or platelet count, and can often be resumed at 400 mg daily. However, if cytopenias recur, imatinib should be held and subsequently restarted at 300 mg daily.19
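
As a rough illustration, the sketch below encodes the hold-and-restart rule just described. The function name is hypothetical, and real decisions depend on the specific ANC and platelet cutoffs in the NCCN guidelines, which are abstracted here into a single "counts recovered" flag.

```python
def imatinib_dose_after_cytopenia(counts_recovered: bool,
                                  cytopenia_episodes: int) -> str:
    """Return the next step per the hold/restart rule described above."""
    if not counts_recovered:
        return "hold imatinib until ANC/platelet recovery"
    if cytopenia_episodes >= 2:  # recurrent cytopenia
        return "restart at 300 mg daily"
    return "resume 400 mg daily"

print(imatinib_dose_after_cytopenia(counts_recovered=True, cytopenia_episodes=1))
```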

Dasatinib

Dasatinib is a second-generation TKI that has regulatory approval for treatment of adult patients with newly diagnosed CP-CML or CP-CML in patients with resistance or intolerance to prior TKIs. In addition to inhibiting ABL kinases, dasatinib is a potent inhibitor of Src family kinases. Dasatinib has shown efficacy in patients who have developed imatinib-resistant ABL kinase domain mutations.

Dasatinib was initially approved as second-line therapy in patients with resistance or intolerance to imatinib. This indication was based on the results of the phase 3 CA180-034 trial, which ultimately identified dasatinib 100 mg daily as the optimal dose. In this trial, 74% of patients enrolled had resistance to imatinib and the remainder were intolerant. The 7-year follow-up of patients randomized to dasatinib 100 mg daily (n = 167) indicated that 46% achieved MMR while on study. Of the 124 imatinib-resistant patients on dasatinib 100 mg daily, the 7-year PFS was 39% and OS was 63%. In the 43 imatinib-intolerant patients, the 7-year PFS was 51% and OS was 70%.30

Dasatinib 100 mg daily was compared to imatinib 400 mg daily in newly diagnosed CP-CML patients in the randomized phase 3 DASISION (Dasatinib versus Imatinib Study in Treatment-Naive CML Patients) trial. More patients on the dasatinib arm achieved an EMR of BCR-ABL1 transcripts ≤ 10% IS after 3 months on treatment compared to imatinib (84% versus 64%). Furthermore, the 5-year follow-up reports that the cumulative incidence of MMR and MR4.5 in dasatinib-treated patients was 76% and 42%, and was 64% and 33% with imatinib (P = 0.0022 and P = 0.0251, respectively). Fewer patients treated with dasatinib progressed to AP or BP (4.6%) compared to imatinib (7.3%), but the estimated 5-year OS was similar between the 2 arms (91% for dasatinib versus 90% for imatinib).16 Regulatory approval for dasatinib as first-line therapy in newly diagnosed CML patients was based on results of the DASISION trial.

Toxicity. Most dasatinib-related toxicities are reported as grade 1 or grade 2, but grade 3/4 hematologic adverse events are fairly common. In the DASISION trial, grade 3/4 neutropenia, anemia, and thrombocytopenia occurred in 29%, 13%, and 22% of dasatinib-treated patients, respectively. Cytopenias can generally be managed with temporary dose interruptions or dose reductions.


During the 5-year follow-up of the DASISION trial, pleural effusions were reported in 28% of patients, most of which were grade 1/2. Effusions occurred at a rate of up to approximately 8% per year, suggesting a stable incidence over time, and appear to be dose-dependent.16 Depending on the severity, pleural effusion may be treated with diuretics, dose interruption, and, in some instances, steroids or thoracentesis. Typically, dasatinib can be restarted at 1 dose level lower than the previous dose once the effusion has resolved.19 Other, less common side effects of dasatinib include pulmonary hypertension (5% of patients), as well as abdominal pain, fluid retention, headaches, fatigue, musculoskeletal pain, rash, nausea, and diarrhea. Pulmonary hypertension is typically reversible after cessation of dasatinib, and thus dasatinib should be permanently discontinued once the diagnosis is confirmed. Fluid retention is often treated with diuretics and supportive care. Nausea and diarrhea are generally manageable and occur less frequently when dasatinib is taken with food and a large glass of water. Antiemetics and antidiarrheals can be used as needed. Troublesome rash is best managed with topical or systemic steroids as well as possible dose reduction or dose interruption.16,19 In the DASISION trial, adverse events led to therapy discontinuation more often in the dasatinib group than in the imatinib group (16% versus 7%).16 Bleeding, particularly in the setting of thrombocytopenia, has been reported in patients treated with dasatinib as a result of the drug-induced reversible inhibition of platelet aggregation.31

Nilotinib

The structure of nilotinib is similar to that of imatinib; however, it has a markedly increased affinity for the ATP-binding site on the BCR-ABL1 protein. It was initially given regulatory approval in the setting of imatinib failure. Nilotinib was studied at a dose of 400 mg twice daily in 321 patients who were imatinib-resistant or -intolerant. It proved to be highly effective at inducing cytogenetic remissions in the second-line setting, with 59% of patients achieving a MCyR and 45% achieving a CCyR. With a median follow-up time of 4 years, the OS was 78%.32

Nilotinib gained regulatory approval for use as a first-line TKI after completion of the randomized phase 3 ENESTnd (Evaluating Nilotinib Efficacy and Safety in Clinical Trials-Newly Diagnosed Patients) trial. ENESTnd was a 3-arm study comparing nilotinib 300 mg twice daily versus nilotinib 400 mg twice daily versus imatinib 400 mg daily in newly diagnosed, previously untreated patients with CP-CML. The primary endpoint of this clinical trial was rate of MMR at 12 months.33 Nilotinib surpassed imatinib in this regard, with 44% of patients on nilotinib 300 mg twice daily achieving MMR at 12 months versus 43% of nilotinib 400 mg twice daily patients versus 22% of the imatinib-treated patients (P < 0.001 for both comparisons). Furthermore, the rate of CCyR by 12 months was significantly higher for both nilotinib arms compared with imatinib (80% for nilotinib 300 mg, 78% for nilotinib 400 mg, and 65% for imatinib) (P < 0.001).33 Based on these data, nilotinib 300 mg twice daily was chosen as the standard dose of nilotinib in the first-line setting. After 5 years of follow-up on the ENESTnd study, there were fewer progressions to AP/BP CML in nilotinib-treated patients compared with imatinib. MMR was achieved in 77% of nilotinib 300 mg patients compared with 60.4% of patients on the imatinib arm. MR4.5 was also more common in patients treated with nilotinib 300 mg twice daily, with a rate of 53.5% at 5 years versus 31.4% in the imatinib arm.17 In spite of the deeper cytogenetic and molecular responses achieved with nilotinib, this did not translate into a significant improvement in OS. The 5-year OS rate was 93.7% in nilotinib 300 mg patients versus 91.7% in imatinib-treated patients, a difference that lacked statistical significance.17

Toxicity. Although some similarities exist between the toxicity profiles of nilotinib and imatinib, each drug has some distinct adverse events. On the ENESTnd trial, the rate of any grade 3/4 non-hematologic adverse event was fairly low; however, lower-grade toxicities were not uncommon. Patients treated with nilotinib 300 mg twice daily most commonly experienced rash (31%), pruritus (15%), headache (14%), and fatigue (11%). The most frequently reported laboratory abnormalities included increased alanine aminotransferase (ALT; 66%), increased total bilirubin (53%), increased aspartate aminotransferase (AST; 40%), hyperglycemia (36%), hypophosphatemia (32%), and elevated lipase (24%). Any grade of neutropenia, thrombocytopenia, or anemia occurred at rates of 43%, 48%, and 38%, respectively.33 Although nilotinib carries a boxed warning from the US Food and Drug Administration for QT interval prolongation, no patients on the ENESTnd trial experienced a QT interval corrected for heart rate greater than 500 msec.33

More recent concerns have emerged regarding the potential for cardiovascular toxicity after long-term use of nilotinib. The 5-year update of ENESTnd reported cardiovascular events, including ischemic heart disease, ischemic cerebrovascular events, and peripheral arterial disease, occurring in 7.5% of patients treated with nilotinib 300 mg twice daily, as compared with a rate of 2.1% in imatinib-treated patients. The frequency of these cardiovascular events increased linearly over time in both arms. Elevations in total cholesterol from baseline occurred in 27.6% of nilotinib patients compared with 3.9% of imatinib patients. Furthermore, clinically meaningful increases in low-density lipoprotein cholesterol and glycated hemoglobin occurred more frequently with nilotinib therapy.17


Nilotinib should be taken on an empty stomach; therefore, patients should be made aware of the need to fast for 2 hours prior to each dose and 1 hour after each dose. Given the potential risk of QT interval prolongation, a baseline electrocardiogram (ECG) is recommended prior to initiating treatment to ensure the QT interval is within a normal range. A repeat ECG should be done approximately 7 days after nilotinib initiation to ensure no prolongation of the QT interval after starting. Close monitoring of potassium and magnesium levels is important to decrease the risk of cardiac arrhythmias, and concomitant use of drugs considered strong CYP3A4 inhibitors should be avoided.19

If the patient experiences any grade 3 or higher laboratory abnormalities, nilotinib should be held until resolution of the toxicity, and then restarted at a lower dose. Similarly, if patients develop significant neutropenia or thrombocytopenia, nilotinib doses should be interrupted until resolution of the cytopenias. At that point, nilotinib can be reinitiated at either the same or a lower dose. Rash can be managed by the use of topical or systemic steroids as well as potential dose reduction, interruption, or discontinuation.

Given the concerns for potential cardiovascular events with long-term use of nilotinib, caution is advised when prescribing it to any patient with a history of cardiovascular disease or peripheral arterial occlusive disease. At the first sign of new occlusive disease, nilotinib should be discontinued.19


Bosutinib

Bosutinib is a second-generation BCR-ABL TKI with activity against the Src family of kinases; it was initially approved to treat patients with CP-, AP-, or BP-CML after resistance or intolerance to imatinib. Long-term data have been reported from the phase 1/2 trial of bosutinib therapy in patients with CP-CML who developed resistance or intolerance to imatinib plus dasatinib and/or nilotinib. A total of 119 patients were included in the 4-year follow-up; 38 were resistant/intolerant to imatinib and resistant to dasatinib, 50 were resistant/intolerant to imatinib and intolerant to dasatinib, 26 were resistant/intolerant to imatinib and resistant to nilotinib, and 5 were resistant/intolerant to imatinib and intolerant to nilotinib or resistant/intolerant to dasatinib and nilotinib. Bosutinib 400 mg daily was studied in this setting. Of the 38 patients with imatinib resistance/intolerance and dasatinib resistance, 39% achieved MCyR, 22% achieved CCyR, and the OS was 67%. Of the 50 patients with imatinib resistance/intolerance and dasatinib intolerance, 42% achieved MCyR, 40% achieved CCyR, and the OS was 80%. Finally, in the 26 patients with imatinib resistance/intolerance and nilotinib resistance, 38% achieved MCyR, 31% achieved CCyR, and the OS was 87%.34

Five-year follow-up from the phase 1/2 clinical trial that studied bosutinib 500 mg daily in CP-CML patients after imatinib failure reported data on 284 patients. By 5 years on study, 60% of patients had achieved MCyR and 50% achieved CCyR with a 71% and 69% probability, respectively, of maintaining these responses at 5 years. The 5-year OS was 84%.35 These data led to the regulatory approval of bosutinib 500 mg daily as second-line or later therapy.


Bosutinib was initially studied in the first-line setting in the randomized phase 3 BELA (Bosutinib Efficacy and Safety in Newly Diagnosed Chronic Myeloid Leukemia) trial. This trial compared bosutinib 500 mg daily to imatinib 400 mg daily in newly diagnosed, previously untreated CP-CML patients. This trial failed to meet its primary endpoint of increased rate of CCyR at 12 months, with 70% of bosutinib patients achieving this response, compared to 68% of imatinib-treated patients (P = 0.601). In spite of this, the rate of MMR at 12 months was significantly higher in the bosutinib arm (41%) compared to the imatinib arm (27%; P = 0.001).36

A second phase 3 trial (BFORE) was designed to study bosutinib 400 mg daily versus imatinib in newly diagnosed, previously untreated CP-CML patients. This study enrolled 536 patients who were randomly assigned 1:1 to bosutinib versus imatinib. The primary endpoint of this trial was rate of MMR at 12 months. A significantly higher proportion of bosutinib-treated patients achieved this response (47.2%) compared with imatinib-treated patients (36.9%, P = 0.02). Furthermore, by 12 months 77.2% of patients on the bosutinib arm had achieved CCyR compared with 66.4% on the imatinib arm, and this difference did meet statistical significance (P = 0.0075). A lower rate of progression to AP- or BP-CML was noted in bosutinib-treated patients as well (1.6% versus 2.5%). Based on these data, bosutinib gained regulatory approval for first-line therapy in CP-CML at a dose of 400 mg daily.18

Toxicity. On the BFORE trial, the most common treatment-emergent adverse events of any grade reported in the bosutinib-treated patients were diarrhea (70.1%), nausea (35.1%), increased ALT (30.6%), and increased AST (22.8%). Musculoskeletal pain or spasms occurred in 29.5% of patients, rash in 19.8%, fatigue in 19.4%, and headache in 18.7%. Hematologic toxicity was also reported, but most was grade 1/2. Thrombocytopenia was reported in 35.1%, anemia in 18.7%, and neutropenia in 11.2%.18

Cardiovascular events occurred in 5.2% of patients on the bosutinib arm of the BFORE trial, which was similar to the rate observed in imatinib patients. The most common cardiovascular event was QT interval prolongation, which occurred in 1.5% of patients. Pleural effusions were reported in 1.9% of patients treated with bosutinib, and none were grade 3 or higher.18

If liver enzyme elevation occurs at a value greater than 5 times the institutional upper limit of normal, bosutinib should be held until the level recovers to ≤ 2.5 times the upper limit of normal, at which point bosutinib can be restarted at a lower dose. If recovery takes longer than 4 weeks, bosutinib should be permanently discontinued. Liver enzymes elevated greater than 3 times the institutional upper limit of normal and a concurrent elevation in total bilirubin to 2 times the upper limit of normal are consistent with Hy’s law, and bosutinib should be discontinued. Although diarrhea is the most common toxicity associated with bosutinib, it is commonly low grade and transient. Diarrhea occurs most frequently in the first few days after initiating bosutinib. It can often be managed with over-the-counter antidiarrheal medications, but if the diarrhea is grade 3 or higher, bosutinib should be held until recovery to grade 1 or lower. Gastrointestinal side effects may be improved by taking bosutinib with a meal and a large glass of water. Fluid retention can be managed with diuretics and supportive care. Finally, if rash occurs, this can be addressed with topical or systemic steroids as well as bosutinib dose reduction, interruption, or discontinuation.19
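
The liver-enzyme rules above reduce to threshold logic on multiples of the institutional upper limit of normal (ULN). A minimal sketch follows, with hypothetical function and parameter names; it is illustrative only and omits the clinical context that would drive a real decision.

```python
def bosutinib_hepatotoxicity_action(transaminase_x_uln: float,
                                    bilirubin_x_uln: float,
                                    weeks_without_recovery: float) -> str:
    """Apply the bosutinib liver-enzyme thresholds described in the text."""
    # Hy's law: transaminases > 3x ULN with concurrent bilirubin > 2x ULN
    if transaminase_x_uln > 3 and bilirubin_x_uln > 2:
        return "discontinue bosutinib (consistent with Hy's law)"
    if transaminase_x_uln > 5:
        if weeks_without_recovery > 4:
            return "discontinue permanently (no recovery within 4 weeks)"
        return "hold until <= 2.5x ULN, then restart at a lower dose"
    return "continue with routine monitoring"

print(bosutinib_hepatotoxicity_action(6.0, 1.0, 2))  # hold, then dose-reduce
```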


Similar to other TKIs, if bosutinib-induced cytopenias occur, treatment should be held and restarted at the same or a lower dose upon blood count recovery.19

Ponatinib

The most common cause of TKI resistance in CP-CML is the development of ABL kinase domain mutations. The majority of imatinib-resistant mutations can be overcome by the use of second-generation TKIs, including dasatinib, nilotinib, or bosutinib. However, ponatinib is the only BCR-ABL TKI able to overcome a T315I mutation. The phase 2 PACE (Ponatinib Ph-positive ALL and CML Evaluation) trial enrolled patients with CP-, AP-, or BP-CML as well as patients with Ph-positive acute lymphoblastic leukemia who were resistant or intolerant to nilotinib or dasatinib, or who had evidence of a T315I mutation. The starting dose of ponatinib on this trial was 45 mg daily.37 The PACE trial enrolled 267 patients with CP-CML: 203 with resistance or intolerance to nilotinib or dasatinib, and 64 with a T315I mutation. The primary endpoint in the CP cohort was rate of MCyR at any time within 12 months of starting ponatinib. The overall rate of MCyR by 12 months in the CP-CML patients was 56%. In those with a T315I mutation, 70% achieved MCyR, which compared favorably with those with resistance or intolerance to nilotinib or dasatinib, 51% of whom achieved MCyR. CCyR was achieved in 46% of CP-CML patients (40% in the resistant/intolerant cohort and 66% in the T315I cohort). In general, patients with T315I mutations received fewer prior therapies than those in the resistant/intolerant cohort, which likely contributed to the higher response rates in the T315I patients. MR4.5 was achieved in 15% of CP-CML patients by 12 months on the PACE trial.37 The 5-year update to this study reported that 60%, 40%, and 24% of CP-CML patients achieved MCyR, MMR, and MR4.5, respectively. In the patients who achieved MCyR, the probability of maintaining this response for 5 years was 82% and the estimated 5-year OS was 73%.38

Toxicity. In 2013, after the regulatory approval of ponatinib, reports became available that the drug can cause an increase in arterial occlusive events, including fatal myocardial infarctions and cerebrovascular accidents. For this reason, dose reductions were implemented in patients who were deriving clinical benefit from ponatinib. In spite of these dose reductions, ≥ 90% of responders maintained their response for up to 40 months.38 Although the likelihood of developing an arterial occlusive event appears higher in the first year after starting ponatinib than in later years, the cumulative incidence of events continues to increase. The 5-year follow-up to the PACE trial reports 31% of patients experiencing any grade of arterial occlusive event while on ponatinib. Aside from these events, the most common treatment-emergent adverse events in ponatinib-treated patients on the PACE trial included rash (47%), abdominal pain (46%), headache (43%), dry skin (42%), constipation (41%), and hypertension (37%). Hematologic toxicity was also common, with 46% of patients experiencing any grade of thrombocytopenia, 20% experiencing neutropenia, and 20% anemia.38

Patients receiving ponatinib therapy should be monitored closely for any evidence of arterial or venous thrombosis. If an occlusive event occurs, ponatinib should be discontinued. Similarly, in the setting of any new or worsening heart failure symptoms, ponatinib should be promptly discontinued. Management of any underlying cardiovascular risk factors, including hypertension, hyperlipidemia, diabetes, or smoking history, is recommended, and these patients should be referred to a cardiologist for a full evaluation. In the absence of any contraindications to aspirin, low-dose aspirin should be considered as a means of decreasing cardiovascular risks associated with ponatinib. In patients with known risk factors, a ponatinib starting dose of 30 mg daily rather than the standard 45 mg daily may be a safer option, resulting in fewer arterial occlusive events, although the efficacy of this dose is still being studied in comparison to 45 mg daily.19

If ponatinib-induced transaminitis greater than 3 times the upper limit of normal occurs, ponatinib should be held until resolution to less than 3 times the upper limit of normal, at which point it should be resumed at a lower dose. Similarly, in the setting of elevated serum lipase or symptomatic pancreatitis, ponatinib should be held and restarted at a lower dose after resolution of symptoms.19


In the event of neutropenia or thrombocytopenia, ponatinib should be held until blood count recovery and then restarted at the same dose. If cytopenias occur for a second time, the dose of ponatinib should be lowered at the time of treatment reinitiation. If rash occurs, it can be addressed with topical or systemic steroids as well as dose reduction, interruption, or discontinuation.19

Conclusion

With the development of imatinib and the subsequent TKIs, dasatinib, nilotinib, bosutinib, and ponatinib, CP-CML has become a chronic disease with a life expectancy that is similar to that of the general population. Given the successful treatments available for these patients, it is crucial to identify patients with this diagnosis, ensure they receive a complete, appropriate diagnostic workup including a bone marrow biopsy and aspiration with cytogenetic testing, and select the best therapy for each individual patient. Once on treatment, the importance of frequent monitoring cannot be overstated. This is the only way to be certain patients are achieving the desired treatment milestones that correlate with the favorable long-term outcomes that have been observed with TKI-based treatment of CP-CML. 

Corresponding author: Kendra Sweet, MD, MS, Department of Malignant Hematology, Moffitt Cancer Center, Tampa, FL.

Financial disclosures: Dr. Sweet has served on the Advisory Board and Speakers Bureau of Novartis, Bristol-Myers Squibb, Ariad Pharmaceuticals, and Pfizer, and has served as a consultant to Pfizer.

References

1. Faderl S, Talpaz M, Estrov Z, et al. The biology of chronic myeloid leukemia. N Engl J Med. 1999;341:164-172.

2. Surveillance, Epidemiology, and End Results Program. Cancer Stat Facts: Leukemia - Chronic Myeloid Leukemia (CML). 2018.

3. Huang X, Cortes J, Kantarjian H. Estimations of the increasing prevalence and plateau prevalence of chronic myeloid leukemia in the era of tyrosine kinase inhibitor therapy. Cancer. 2012;118:3123-3127.

4. Savage DG, Szydlo RM, Chase A, et al. Bone marrow transplantation for chronic myeloid leukaemia: the effects of differing criteria for defining chronic phase on probabilities of survival and relapse. Br J Haematol. 1997;99:30-35.

5. Knox WF, Bhavnani M, Davson J, Geary CG. Histological classification of chronic granulocytic leukaemia. Clin Lab Haematol. 1984;6:171-175.

6. Kvasnicka HM, Thiele J, Schmitt-Graeff A, et al. Impact of bone marrow morphology on multivariate risk classification in chronic myelogenous leukemia. Acta Haematol. 2003;109:53-56.

7. Cortes JE, Talpaz M, O’Brien S, et al. Staging of chronic myeloid leukemia in the imatinib era: an evaluation of the World Health Organization proposal. Cancer. 2006;106:1306-1315.

8. Druker BJ. Chronic myeloid leukemia. In: DeVita VT, Lawrence TS, Rosenberg SA, eds. DeVita, Hellman, and Rosenberg’s Cancer Principles & Practice of Oncology. 8th ed. Philadelphia, PA: Lippincott, Williams and Wilkins; 2007:2267-2304.

9. Arber DA, Orazi A, Hasserjian R, et al. The 2016 revision to the World Health Organization classification of myeloid neoplasms and acute leukemia. Blood. 2016;127:2391-2405.

10. Fabarius A, Leitner A, Hochhaus A, et al. Impact of additional cytogenetic aberrations at diagnosis on prognosis of CML: long-term observation of 1151 patients from the randomized CML Study IV. Blood. 2011;118:6760-6768.

11. Alhuraiji A, Kantarjian H, Boddu P, et al. Prognostic significance of additional chromosomal abnormalities at the time of diagnosis in patients with chronic myeloid leukemia treated with frontline tyrosine kinase inhibitors. Am J Hematol. 2018;93:84-90.

12. Melo JV. BCR-ABL gene variants. Baillieres Clin Haematol. 1997;10:203-222.

13. Kantarjian HM, Talpaz M, Cortes J, et al. Quantitative polymerase chain reaction monitoring of BCR-ABL during therapy with imatinib mesylate (STI571; gleevec) in chronic-phase chronic myelogenous leukemia. Clin Cancer Res. 2003;9:160-166.

14. Hughes T, Deininger M, Hochhaus A, et al. Monitoring CML patients responding to treatment with tyrosine kinase inhibitors: review and recommendations for harmonizing current methodology for detecting BCR-ABL transcripts and kinase domain mutations and for expressing results. Blood. 2006;108:28-37.

15. Hochhaus A, Larson RA, Guilhot F, et al. Long-term outcomes of imatinib treatment for chronic myeloid leukemia. N Engl J Med. 2017;376:917-927.

16. Cortes JE, Saglio G, Kantarjian HM, et al. Final 5-year study results of DASISION: the Dasatinib Versus Imatinib Study in Treatment-Naive Chronic Myeloid Leukemia Patients trial. J Clin Oncol. 2016;34:2333-2340.

17. Hochhaus A, Saglio G, Hughes TP, et al. Long-term benefits and risks of frontline nilotinib vs imatinib for chronic myeloid leukemia in chronic phase: 5-year update of the randomized ENESTnd trial. Leukemia. 2016;30:1044-1054.

18. Cortes JE, Gambacorti-Passerini C, Deininger MW, et al. Bosutinib versus imatinib for newly diagnosed chronic myeloid leukemia: results from the randomized BFORE trial. J Clin Oncol. 2018;36:231-237.

19. Radich JP, Deininger M, Abboud CN, et al. Chronic Myeloid Leukemia, Version 1.2019, NCCN Clinical Practice Guidelines in Oncology. J Natl Compr Canc Netw. 2018;16:1108-1135.

20. Faderl S, Talpaz M, Estrov Z, Kantarjian HM. Chronic myelogenous leukemia: biology and therapy. Ann Intern Med. 1999;131:207-219.

21. O’Brien SG, Guilhot F, Larson RA, et al. Imatinib compared with interferon and low-dose cytarabine for newly diagnosed chronic-phase chronic myeloid leukemia. N Engl J Med. 2003;348:994-1004.

22. Baccarani M, Deininger MW, Rosti G, et al. European LeukemiaNet recommendations for the management of chronic myeloid leukemia: 2013. Blood. 2013;122:872-884.

23. Larripa I, Ruiz MS, Gutierrez M, Bianchini M. [Guidelines for molecular monitoring of BCR-ABL1 in chronic myeloid leukemia patients by RT-qPCR]. Medicina (B Aires). 2017;77:61-72.

24. Marin D, Ibrahim AR, Lucas C, et al. Assessment of BCR-ABL1 transcript levels at 3 months is the only requirement for predicting outcome for patients with chronic myeloid leukemia treated with tyrosine kinase inhibitors. J Clin Oncol. 2012;30:232-238.

25. Hughes TP, Ross DM. Moving treatment-free remission into mainstream clinical practice in CML. Blood. 2016;128:17-23.

26. Druker BJ, Talpaz M, Resta DJ, et al. Efficacy and safety of a specific inhibitor of the BCR-ABL tyrosine kinase in chronic myeloid leukemia. N Engl J Med. 2001;344:1031-1037.

27. Baccarani M, Druker BJ, Branford S, et al. Long-term response to imatinib is not affected by the initial dose in patients with Philadelphia chromosome-positive chronic myeloid leukemia in chronic phase: final update from the Tyrosine Kinase Inhibitor Optimization and Selectivity (TOPS) study. Int J Hematol. 2014;99:616-624.

28. Yeung DT, Osborn MP, White DL, et al. TIDEL-II: first-line use of imatinib in CML with early switch to nilotinib for failure to achieve time-dependent molecular targets. Blood. 2015;125:915-923.

29. Druker BJ, Guilhot F, O’Brien SG, et al. Five-year follow-up of patients receiving imatinib for chronic myeloid leukemia. N Engl J Med. 2006;355:2408-2417.

30. Shah NP, Rousselot P, Schiffer C, et al. Dasatinib in imatinib-resistant or -intolerant chronic-phase, chronic myeloid leukemia patients: 7-year follow-up of study CA180-034. Am J Hematol. 2016;91:869-874.

31. Quintas-Cardama A, Han X, Kantarjian H, Cortes J. Tyrosine kinase inhibitor-induced platelet dysfunction in patients with chronic myeloid leukemia. Blood. 2009;114:261-263.

32. Giles FJ, le Coutre PD, Pinilla-Ibarz J, et al. Nilotinib in imatinib-resistant or imatinib-intolerant patients with chronic myeloid leukemia in chronic phase: 48-month follow-up results of a phase II study. Leukemia. 2013;27:107-112.

33. Saglio G, Kim DW, Issaragrisil S, et al. Nilotinib versus imatinib for newly diagnosed chronic myeloid leukemia. N Engl J Med. 2010;362:2251-2259.

34. Cortes JE, Khoury HJ, Kantarjian HM, et al. Long-term bosutinib for chronic phase chronic myeloid leukemia after failure of imatinib plus dasatinib and/or nilotinib. Am J Hematol. 2016;91:1206-1214.

35. Gambacorti-Passerini C, Cortes JE, Lipton JH, et al. Safety and efficacy of second-line bosutinib for chronic phase chronic myeloid leukemia over a five-year period: final results of a phase I/II study. Haematologica. 2018;103:1298-1307.

36. Cortes JE, Kim DW, Kantarjian HM, et al. Bosutinib versus imatinib in newly diagnosed chronic-phase chronic myeloid leukemia: results from the BELA trial. J Clin Oncol. 2012;30:3486-3492.

37. Cortes JE, Kim DW, Pinilla-Ibarz J, et al. A phase 2 trial of ponatinib in Philadelphia chromosome-positive leukemias. N Engl J Med. 2013;369:1783-1796.

38. Cortes JE, Kim DW, Pinilla-Ibarz J, et al. Ponatinib efficacy and safety in Philadelphia chromosome-positive leukemia: final 5-year results of the phase 2 PACE trial. Blood. 2018;132:393-404.


Calculating Risk for Poor Outcomes After Transcatheter Aortic Valve Replacement


From Saint Luke’s Mid America Heart Institute/University of Missouri–Kansas City, Kansas City, MO.

Abstract

  • Objective: To outline the tools available to help understand the risk of transcatheter aortic valve replacement (TAVR) and the gaps in knowledge regarding TAVR risk estimation.
  • Methods: Review of the literature.
  • Results: Two models developed and validated by the American College of Cardiology can be used to estimate the risk of short-term mortality, a 6-variable in-hospital model designed for clinical use and a 41-variable 30-day model designed primarily for site comparisons and quality improvement. Importantly, neither model should be used to inform the choice of TAVR versus surgical aortic valve replacement. Regarding long-term outcomes, a risk model to estimate risk of dying or having a persistently poor quality of life at 1 year after TAVR has been developed and validated. Factors that most significantly increase a patient’s risk for poor outcomes are very poor functional status prior to TAVR, requiring home oxygen, chronic renal insufficiency, atrial fibrillation, dependencies in activities of daily living, and dementia. If a patient has ≥ 2 or 3 major risk factors for a poor outcome, this risk and the uncertainty about the degree of recovery expected after TAVR should be discussed with the patient (and family).
  • Conclusion: It is important to understand the patient factors that most strongly drive risk of poor outcomes after TAVR and use this information to set appropriate expectations for recovery.

Keywords: aortic valve stenosis; risk factors; postoperative complications; TAVR.

Among patients with severe aortic stenosis, transcatheter aortic valve replacement (TAVR) has emerged as a less invasive option for aortic valve replacement. This procedure offers substantial reductions in mortality and improvement in quality of life compared with medical therapy1,2 and at least similar long-term outcomes compared to surgical aortic valve replacement (SAVR).3-9

As with any emerging technology, selecting the appropriate patients for TAVR—a procedure with high initial costs10—has been an area of active investigation. As TAVR was first introduced in patients who were considered inoperable, initial efforts focused on trying to identify the patients who did not improve functionally or live longer following TAVR. Termed Cohort C, these patients were thought to have too many comorbidities, be too sick, and have too little reserve to recover from TAVR, and in the early trials they represented a substantial minority of patients. For example, in pivotal clinical trials of patients at high or extreme surgical risk, approximately 1 in 4 patients who were treated with TAVR were dead at 1 year.1,3,11 Furthermore, a number of patients who received TAVR were alive at 1 year but continued to have significant heart failure symptoms and functional limitations.2,4 Practitioners,12,13 regulators,14 and third-party payers15 have recommended that TAVR should not be offered to patients in whom valve replacement would not be expected to positively impact either their survival or quality of life, but how best to identify these patients has been less clear.

More recently, as the use of TAVR has moved down the risk spectrum, patient selection for TAVR has shifted to understanding which patients should be preferentially treated with TAVR versus SAVR. While patients often prefer a less invasive treatment option with faster recovery—which is what TAVR offers—there are lingering questions about valve longevity, need for a pacemaker (and the associated long-term implications), and the ability to treat other cardiovascular conditions (eg, Maze, mitral valve repair) that potentially make a patient a more appropriate candidate for valve surgery. This review outlines the tools currently available to help understand the risk of TAVR and the gaps in knowledge.

Short-Term Outcomes

When TAVR was initially introduced, the 30-day mortality rate was 5% to 8%.1,11,16 This high mortality rate was a function both of treating very ill patients and of more invasive procedures, with larger sheath sizes and routine use of general anesthesia, transesophageal echocardiography, and pulmonary artery catheterization. Over time, however, this rate has gone down substantially, with the 30-day mortality rate in intermediate- and low-risk patients now ranging from 0.5% to 1%.8,17-19 Although this low mortality rate indicates that the vast majority of patients will survive to discharge from the hospital, 2 models can be used to estimate the risk of short-term mortality: an in-hospital20 and a 30-day model,21 both developed and validated by the American College of Cardiology. The in-hospital model was developed for clinical use, as it includes only 6 variables (age, renal function, severe lung disease, non-femoral access, New York Heart Association class IV, and acuity of the procedure [elective versus urgent versus shock versus emergent])20 and has an online calculator (http://tools.acc.org/tavrrisk/). The 30-day model was developed for risk adjustment (primarily for site comparisons and quality improvement) and includes 41 variables (including pre-TAVR patient health status and gait speed).21
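
To make concrete how a clinician-facing model like the 6-variable in-hospital calculator is applied, the sketch below evaluates a logistic model over those variables. The intercept and coefficients are placeholders invented for illustration; the real values belong to the published ACC model and its online calculator, so this code must not be used to estimate actual risk.

```python
import math

# Placeholder intercept and coefficients (log-odds scale), invented for
# illustration only; the real values belong to the published ACC model.
HYPOTHETICAL_INTERCEPT = -4.5
HYPOTHETICAL_COEFS = {
    "age_decades_over_70": 0.20,
    "renal_impairment":    0.60,
    "severe_lung_disease": 0.40,
    "nonfemoral_access":   0.55,
    "nyha_class_iv":       0.45,
    "nonelective_case":    0.80,  # urgent/emergent/shock rather than elective
}

def estimated_in_hospital_mortality(patient: dict) -> float:
    """Logistic model: risk = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    logit = HYPOTHETICAL_INTERCEPT + sum(
        coef * patient.get(name, 0) for name, coef in HYPOTHETICAL_COEFS.items())
    return 1.0 / (1.0 + math.exp(-logit))

# Elective transfemoral case in an 80-year-old with severe lung disease
print(round(estimated_in_hospital_mortality(
    {"age_decades_over_70": 1, "severe_lung_disease": 1}), 3))  # ~0.02
```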

While 30 days is a better time frame for assessment because outcome is less impacted by differences in local post-acute care facilities, we explicitly did not create a parsimonious 30-day mortality model for clinical use due to concern that having such a model would allow for indirect comparisons with estimated risk of SAVR using the Society of Thoracic Surgeons risk model (http://riskcalc.sts.org/stswebriskcalc). It would be tempting to estimate a patient’s risk of mortality with the TAVR calculator and the SAVR calculator and use those risk estimates to inform the choice of treatment; however, these risk estimates should not be directly compared to make treatment selections, as they were built on entirely different patient populations. In real-world practice, there is minimal overlap in the characteristics of patients who are treated with TAVR and SAVR. For example, in an analysis that merged surgical and transcatheter databases, less than 25% of patients treated with TAVR could be matched to a clinically similar patient treated with SAVR.22 As such, these TAVR models should be used to estimate a patient’s risk for short-term mortality, but should not be used to contribute to the decision on TAVR versus SAVR.


The decision to select SAVR over TAVR is typically driven by factors other than short- or long-term mortality (eg, whether TAVR will be covered by insurance, very young age and concern about durability, need to treat concomitant mitral regurgitation or aortopathy), as clinical trials have shown that survival and quality of life outcomes are at least as good with TAVR as with SAVR.6,7,9,23 In fact, in an analysis that compared similar patients treated with TAVR versus SAVR and specifically looked for patient factors that might make one treatment preferable to the other, patients who had a prior cardiac operation and those on home oxygen were more likely to do better with TAVR, whereas no patient factors that favored SAVR were found.24 The majority of patients, however, were expected to have similar long-term outcomes regardless of treatment choice, and as such, the benefit of TAVR appears mostly to be an earlier and easier recovery.

Long-Term Outcomes: Estimating the Risk for Failure to Recover

While many patients who undergo TAVR are quite ill prior to the procedure, with substantial limitations due to the fatigue and shortness of breath associated with severe aortic stenosis, most patients recover well after the procedure, with marked improvement in symptoms and functional capacity. However, approximately 25% to 35% of patients currently treated with TAVR commercially (ie, intermediate- and high-surgical-risk patients) either die or do not recover a reasonable quality of life after the procedure. Identifying those patients prior to the procedure can be challenging. We have previously developed and externally validated a risk model to estimate risk of dying or having a persistently poor quality of life at 1 year after TAVR.25,26 The factors that most significantly increase a patient’s risk for poor outcomes are very poor functional status prior to TAVR, requiring home oxygen, chronic renal insufficiency, atrial fibrillation, and dementia. For example, a patient who is short of breath at rest, is on home oxygen, has a serum creatinine of 2.5 mg/dL, and has atrial fibrillation has an estimated risk of poor outcome at 1 year of ~70%. However, it should be noted that ~25% of patients with no risk factors for poor outcomes (ie, those considered “low risk”) still have a poor outcome at 1 year after TAVR, as the patients who undergo TAVR are typically at an advanced age with at least some comorbidities. Therefore, a 1-year mortality rate of 10% to 15% would not be unexpected in this population independent of the TAVR, although this will likely change over time as TAVR expands to patients at low surgical risk.

Beyond clinical factors, frailty negatively impacts both survival and quality of life after TAVR. Frailty is a geriatric syndrome of impaired physiologic reserve and decreased resistance to stressors27 that is characterized by weakness, slowness, exhaustion, wasting, and low activity level. Across a wide variety of clinical situations (eg, pneumonia,28 myocardial infarction,29 general30,31 and cardiac surgery32,33), frailty increases the risk of morbidity and mortality after nearly any intervention34 or clinical insult, independent of traditional demographic and clinical risk factors. Frail patients often do better with less invasive interventions such as TAVR compared with traditional surgery, but nonetheless remain at increased risk for death35-37 or failure to recover quality of life and functional status25,37 after TAVR. However, there are unique challenges in both assessing and managing frailty in patients who are considered potential candidates for TAVR. One challenge is the lack of a laboratory or radiologic test for frailty; instead, the impaired physiologic reserve of frailty is identified through a combination of factors, such as slow gait speed, weak grip strength, and unintentional weight loss. While these factors readily identify frail patients in general elderly populations, in patients with severe symptomatic aortic stenosis, these metrics can be impacted by the disease process itself. This distinction is important, as slow gait speed that is due to aortic stenosis will be “fixed” by TAVR, but slow gait speed from frailty would identify a patient who will have a difficult time recovering from the procedure. For example, in the CoreValve High Risk Pivotal Trial, 80% of patients had a slow gait speed and 67% had a weak grip strength,5 and yet 58% of patients in this trial were alive and with a reasonable quality of life 1 year after TAVR.6 A number of studies have attempted to define true frailty within the pre-TAVR population (ie, decreased physiologic reserve and an impaired ability to recover from an insult); the factors that appear to be most prognostically important are malnutrition38 or unintentional weight loss25 and the inability to be independent in activities of daily living (eg, dressing, feeding, transferring).25,37

Even with frailty assessments, the ability to predict who is or is not going to have a poor outcome after TAVR (ie, to use pre-procedural factors to identify patients who perhaps should not be offered TAVR because they will not recover from the procedure) is exceedingly difficult. The Table shows how to grossly estimate risk using the major factors that impact risk, based on the more precise estimates from our models.25,26

Table. Estimation of Risk for Poor Outcome

The model shown in the Table can be used to estimate a patient’s risk for a poor outcome, but it should be noted that even at the extreme high end of risk, there will be some patients who still do well after TAVR. Furthermore, being high risk for a poor outcome after TAVR does not imply anything about how the patient would do without TAVR, as many of these patients would likely die even sooner or have worse quality of life with medical therapy only. However, if a patient has ≥ 2 or 3 major risk factors for a poor outcome, it may be worthwhile to have a serious conversation with the patient (and family) about this risk and the uncertainty about the degree of recovery expected after TAVR.
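
As a crude stand-in for the Table, the sketch below tallies the major risk factors named above and flags patients at the ≥ 2 threshold for a shared decision-making conversation. The published models25,26 are weighted and multivariable; this unweighted count and the function name are only a hypothetical illustration of the heuristic described in the text.

```python
MAJOR_RISK_FACTORS = {
    "very poor baseline functional status",
    "home oxygen",
    "chronic renal insufficiency",
    "atrial fibrillation",
    "dependence in activities of daily living",
    "dementia",
}

def warrants_expectation_setting_conversation(patient_factors: set) -> bool:
    """Flag patients with >= 2 major risk factors for a poor 1-year outcome."""
    return len(patient_factors & MAJOR_RISK_FACTORS) >= 2

print(warrants_expectation_setting_conversation(
    {"home oxygen", "atrial fibrillation", "dementia"}))  # True
```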

Conclusion

Calculating the risk of TAVR can be complicated. In patients who are electively treated using transfemoral access and a less invasive approach, the short-term risk of mortality is very low. Risk calculators can be used to estimate short-term risk, but the patients who are high risk for in-hospital mortality are often fairly easy to recognize, as the factors that drive that risk are not subtle (eg, the patient is in shock at the time of the procedure). The true risk of TAVR lies in the inability to recover from the procedure—being chronically ill, frail, or debilitated to a degree that the patient either dies or fails to recover a reasonable quality of life. Given the overlap of symptomatic aortic stenosis with true frailty, it is often difficult to identify these patients who will not thrive after TAVR. Understanding the patient factors that most strongly drive risk of poor outcomes after TAVR, and allowing this information to guide the conversation prior to TAVR so as to set appropriate expectations for recovery, can be a good place to start.

Corresponding author: Suzanne V. Arnold, MD, MHA, 4401 Wornall Rd., Kansas City, MO 64111.

Financial disclosures: This work was funded in part by grant K23HL116799 from the National Institutes of Health.

References

1. Leon MB, Smith CR, Mack M, et al. Transcatheter aortic-valve implantation for aortic stenosis in patients who cannot undergo surgery. N Engl J Med. 2010;363:1597-1607.

2. Reynolds MR, Magnuson EA, Lei Y, et al. Health-related quality of life after transcatheter aortic valve replacement in inoperable patients with severe aortic stenosis. Circulation. 2011;124:1964-1972.

3. Smith CR, Leon MB, Mack MJ, et al. Transcatheter versus surgical aortic-valve replacement in high-risk patients. N Engl J Med. 2011;364:2187-2198.

4. Reynolds MR, Magnuson EA, Wang K, et al. Health-related quality of life after transcatheter or surgical aortic valve replacement in high-risk patients with severe aortic stenosis: results from the PARTNER (Placement of AoRTic TraNscathetER Valve) trial (Cohort A). J Am Coll Cardiol. 2012;60:548-558.

5. Adams DH, Popma JJ, Reardon MJ, et al. Transcatheter aortic-valve replacement with a self-expanding prosthesis. N Engl J Med. 2014;370:1790-1798.

6. Arnold SV, Reynolds MR, Wang K, et al. Health status after transcatheter or surgical aortic valve replacement in patients with severe aortic stenosis at increased surgical risk: results from the CoreValve US Pivotal trial. JACC Cardiovasc Interv. 2015;8:1207-1217.

7. Leon MB, Smith CR, Mack MJ, et al. Transcatheter or surgical aortic-valve replacement in intermediate-risk patients. N Engl J Med. 2016;374:1609-1620.

8. Reardon MJ, Van Mieghem NM, Popma JJ, et al. Surgical or transcatheter aortic-valve replacement in intermediate-risk patients. N Engl J Med. 2017;376:1321-1331.

9. Baron SJ, Arnold SV, Wang K, et al. Health status benefits of transcatheter vs surgical aortic valve replacement in patients with severe aortic stenosis at intermediate surgical risk: results from the PARTNER 2 randomized clinical trial. JAMA Cardiol. 2017;2:837-845.

10. Reynolds MR, Magnuson EA, Wang K, et al. Cost-effectiveness of transcatheter aortic valve replacement compared with standard care among inoperable patients with severe aortic stenosis: results from the placement of aortic transcatheter valves (PARTNER) trial (Cohort B). Circulation. 2012;125:1102-1109.

11. Popma JJ, Adams DH, Reardon MJ, et al. Transcatheter aortic valve replacement using a self-expanding bioprosthesis in patients with severe aortic stenosis at extreme risk for surgery. J Am Coll Cardiol. 2014;63:1972-1981.

12. Vahanian A, Alfieri O, Al-Attar N, et al. Transcatheter valve implantation for patients with aortic stenosis: a position statement from the European Association of Cardio-Thoracic Surgery (EACTS) and the European Society of Cardiology (ESC), in collaboration with the European Association of Percutaneous Cardiovascular Interventions (EAPCI). Eur Heart J. 2008;29:1463-1470.

13. Holmes DR Jr, Mack MJ, Kaul S, et al. 2012 ACCF/AATS/SCAI/STS expert consensus document on transcatheter aortic valve replacement. J Am Coll Cardiol. 2012;59:1200-1254.

14. US Food and Drug Administration. FDA Executive Summary: Edwards SAPIEN™ Transcatheter Heart Valve. Presented July 20, 2011, Gaithersburg, MD.

15. Centers for Medicare & Medicaid Services. Decision Memo for Transcatheter Aortic Valve Replacement (TAVR) (CAG-00430N). May 5, 2012.

16. Mack MJ, Brennan JM, Brindis R, et al. Outcomes following transcatheter aortic valve replacement in the United States. JAMA. 2013;310:2069-2077.

17. Thourani VH, Kodali S, Makkar RR, et al. Transcatheter aortic valve replacement versus surgical valve replacement in intermediate-risk patients: a propensity score analysis. Lancet. 2016;387:2218-2225.

18. Mack MJ, Leon MB, Thourani VH, et al. Transcatheter aortic-valve replacement with a balloon-expandable valve in low-risk patients. N Engl J Med. 2019;380:1695-1705.

19. Popma JJ, Deeb GM, Yakubov SJ, et al. Transcatheter aortic-valve replacement with a self-expanding valve in low-risk patients. N Engl J Med. 2019;380:1706-1715.

20. Edwards FH, Peterson ED, Coombs LP, et al. Prediction of operative mortality after valve replacement surgery. J Am Coll Cardiol. 2001;37:885-892.

21. Arnold SV, O’Brien SM, Vemulapalli S, et al. Inclusion of functional status measures in the risk adjustment of 30-day mortality after transcatheter aortic valve replacement: a report from the Society of Thoracic Surgeons/American College of Cardiology TVT Registry. JACC Cardiovasc Interv. 2018;11:581-589.

22. Brennan JM, Thomas L, Cohen DJ, et al. Transcatheter versus surgical aortic valve replacement: propensity-matched comparison. J Am Coll Cardiol. 2017;70:439-450.

23. Reardon MJ, Adams DH, Kleiman NS, et al. 2-year outcomes in patients undergoing surgical or self-expanding transcatheter aortic valve replacement. J Am Coll Cardiol. 2015;66:113-121.

24. Baron SJ, Cohen DJ, Suchindran S, et al. Development of a risk prediction model for 1-year mortality after surgical vs. transcatheter aortic valve replacement in patients with severe aortic stenosis. Circulation. 2016;134(A20166).

25. Arnold SV, Afilalo J, Spertus JA, et al. Prediction of poor outcome after transcatheter aortic valve replacement. J Am Coll Cardiol. 2016;68:1868-1877.

26. Arnold SV, Reynolds MR, Lei Y, et al. Predictors of poor outcomes after transcatheter aortic valve replacement: results from the PARTNER (Placement of Aortic Transcatheter Valve) trial. Circulation. 2014;129:2682-2690.

27. Fried LP, Hadley EC, Walston JD, et al. From bedside to bench: research agenda for frailty. Sci Aging Knowledge Environ. 2005;2005:pe24.

28. Torres OH, Munoz J, Ruiz D, et al. Outcome predictors of pneumonia in elderly patients: importance of functional assessment. J Am Geriatr Soc. 2004;52:1603-1609.

29. Ekerstad N, Swahn E, Janzon M, et al. Frailty is independently associated with short-term outcomes for elderly patients with non-ST-segment elevation myocardial infarction. Circulation. 2011;124:2397-2404.

30. Makary MA, Segev DL, Pronovost PJ, et al. Frailty as a predictor of surgical outcomes in older patients. J Am Coll Surg. 2010;210:901-908.

31. Hewitt J, Moug SJ, Middleton M, et al. Prevalence of frailty and its association with mortality in general surgery. Am J Surg. 2015;209:254-259.

32. Sundermann S, Dademasch A, Praetorius J, et al. Comprehensive assessment of frailty for elderly high-risk patients undergoing cardiac surgery. Eur J Cardiothorac Surg. 2011;39:33-37.

33. Afilalo J, Mottillo S, Eisenberg MJ, et al. Addition of frailty and disability to cardiac surgery risk scores identifies elderly patients at high risk of mortality or major morbidity. Circ Cardiovasc Qual Outcomes. 2012;5:222-228.

34. Lin HS, Watts JN, Peel NM, Hubbard RE. Frailty and post-operative outcomes in older surgical patients: a systematic review. BMC Geriatr. 2016;16:157.

35. Stortecky S, Schoenenberger AW, Moser A, et al. Evaluation of multidimensional geriatric assessment as a predictor of mortality and cardiovascular events after transcatheter aortic valve implantation. JACC Cardiovasc Interv. 2012;5:489-496.

36. Schoenenberger AW, Stortecky S, Neumann S, et al. Predictors of functional decline in elderly patients undergoing transcatheter aortic valve implantation (TAVI). Eur Heart J. 2013;34:684-689.

37. Green P, Arnold SV, Cohen DJ, et al. Relation of frailty to outcomes after transcatheter aortic valve replacement (from the PARTNER trial). Am J Cardiol. 2015;116:264-269.

38. Goldfarb M, Lauck S, Webb J, et al. Malnutrition and mortality in frail and non-frail older adults undergoing aortic valve replacement. Circulation. 2018;138:2202-2211.

Article PDF
Issue
Journal of Clinical Outcomes Management - 26(3)
Publications
Topics
Page Number
125-129
Sections
Article PDF
Article PDF

From Saint Luke’s Mid America Heart Institute/University of Missouri–Kansas City, Kansas City, MO.

Abstract

  • Objective: To outline the tools available to help understand the risk of transcatheter aortic valve replacement (TAVR) and the gaps in knowledge regarding TAVR risk estimation.
  • Methods: Review of the literature.
  • Results: Two models developed and validated by the American College of Cardiology can be used to estimate the risk of short-term mortality: a 6-variable in-hospital model designed for clinical use and a 41-variable 30-day model designed primarily for site comparisons and quality improvement. Importantly, neither model should be used to inform the choice of TAVR versus surgical aortic valve replacement. Regarding long-term outcomes, a risk model to estimate risk of dying or having a persistently poor quality of life at 1 year after TAVR has been developed and validated. Factors that most significantly increase a patient’s risk for poor outcomes are very poor functional status prior to TAVR, requiring home oxygen, chronic renal insufficiency, atrial fibrillation, dependencies in activities of daily living, and dementia. If a patient has ≥ 2 or 3 major risk factors for a poor outcome, this risk and the uncertainty about the degree of recovery expected after TAVR should be discussed with the patient (and family).
  • Conclusion: It is important to understand the patient factors that most strongly drive risk of poor outcomes after TAVR and use this information to set appropriate expectations for recovery.

Keywords: aortic valve stenosis; risk factors; postoperative complications; TAVR.

Among patients with severe aortic stenosis, transcatheter aortic valve replacement (TAVR) has emerged as a less invasive option for aortic valve replacement. This procedure offers substantial reductions in mortality and improvement in quality of life compared with medical therapy1,2 and at least similar long-term outcomes compared with surgical aortic valve replacement (SAVR).3-9

As with any emerging technology, selecting the appropriate patients for TAVR—a procedure with high initial costs10—has been an area of active investigation. As TAVR was first introduced in patients who were considered inoperable, initial efforts focused on identifying the patients who did not improve functionally or live longer following TAVR. Termed Cohort C, these patients were thought to have too many comorbidities, to be too sick, and to have too little reserve to recover from TAVR, and in the early trials they represented a substantial minority of patients. For example, in pivotal clinical trials of patients at high or extreme surgical risk, approximately 1 in 4 patients treated with TAVR were dead at 1 year.1,3,11 Furthermore, a number of patients who received TAVR were alive at 1 year but continued to have significant heart failure symptoms and functional limitations.2,4 Practitioners,12,13 regulators,14 and third-party payers15 have recommended that TAVR not be offered to patients in whom valve replacement would not be expected to positively impact either survival or quality of life, but how best to identify these patients has been less clear.

More recently, as the use of TAVR has moved down the risk spectrum, patient selection has shifted to understanding which patients should be preferentially treated with TAVR versus SAVR. While patients often prefer a less invasive treatment option with faster recovery—which is what TAVR offers—there are lingering questions about valve longevity, the need for a pacemaker (and its long-term implications), and the opportunity to treat other cardiovascular conditions concurrently (eg, Maze procedure, mitral valve repair), all of which can make a patient a more appropriate candidate for valve surgery. This review outlines the tools currently available to help understand the risk of TAVR and the gaps in knowledge.

Short-Term Outcomes

When TAVR was initially introduced, the 30-day mortality rate was 5% to 8%.1,11,16 This high mortality rate reflected both the treatment of very ill patients and more invasive procedures, with larger sheath sizes and routine use of general anesthesia, transesophageal echocardiography, and pulmonary artery catheterization. Over time, however, this rate has declined substantially, with the 30-day mortality rate in intermediate- and low-risk patients now ranging from 0.5% to 1%.8,17-19 Although this low mortality rate indicates that the vast majority of patients will survive to discharge from the hospital, 2 models can be used to estimate the risk of short-term mortality: an in-hospital20 and a 30-day model,21 both developed and validated by the American College of Cardiology. The in-hospital model was developed for clinical use, as it includes only 6 variables (age, renal function, severe lung disease, non-femoral access, New York Heart Association class IV, and acuity of the procedure [elective versus urgent versus shock versus emergent])20 and has an online calculator (http://tools.acc.org/tavrrisk/). The 30-day model was developed for risk adjustment (primarily for site comparisons and quality improvement) and includes 41 variables (including pre-TAVR patient health status and gait speed).21
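
To make the structure of such a parsimonious logistic model concrete, the sketch below shows how the 6 patient-level inputs named above could be mapped to a predicted probability. Every coefficient, threshold, and the intercept is invented for illustration; this is not the published ACC model, and the online calculator above should be used for any real estimate.

```python
import math

# Illustrative-only log-odds weights; NOT the published ACC coefficients.
COEFS = {
    "age_per_5yr_over_80": 0.15,
    "renal_dysfunction": 0.60,      # eg, severely reduced eGFR or dialysis
    "severe_lung_disease": 0.45,
    "nonfemoral_access": 0.55,
    "nyha_class_iv": 0.50,
}
ACUITY = {"elective": 0.0, "urgent": 0.7, "shock": 1.8, "emergent": 1.3}
INTERCEPT = -4.5  # invented; anchors baseline risk near 1%

def in_hospital_mortality_risk(age, renal_dysfunction, severe_lung_disease,
                               nonfemoral_access, nyha_class_iv,
                               acuity="elective"):
    """Predicted probability of in-hospital death (hypothetical sketch)."""
    log_odds = INTERCEPT
    log_odds += max(0.0, (age - 80) / 5.0) * COEFS["age_per_5yr_over_80"]
    for name, present in [("renal_dysfunction", renal_dysfunction),
                          ("severe_lung_disease", severe_lung_disease),
                          ("nonfemoral_access", nonfemoral_access),
                          ("nyha_class_iv", nyha_class_iv)]:
        if present:
            log_odds += COEFS[name]
    log_odds += ACUITY[acuity]
    return 1.0 / (1.0 + math.exp(-log_odds))  # inverse logit

# Example: 84-year-old, elective transfemoral case with severe lung disease
print(f"{in_hospital_mortality_risk(84, False, True, False, False):.1%}")
```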

While 30 days is a better assessment window than in-hospital status, because the outcome is less affected by differences in local post-acute care facilities, we explicitly did not create a parsimonious 30-day mortality model for clinical use, out of concern that such a model would invite indirect comparisons with the estimated risk of SAVR from the Society of Thoracic Surgeons risk model (http://riskcalc.sts.org/stswebriskcalc). It would be tempting to estimate a patient’s risk of mortality with the TAVR calculator and the SAVR calculator and use those estimates to inform the choice of treatment; however, these risk estimates should not be directly compared to make treatment selections, as the models were built on entirely different patient populations. In real-world practice, there is minimal overlap in the characteristics of patients who are treated with TAVR and SAVR. For example, in an analysis that merged surgical and transcatheter databases, less than 25% of patients treated with TAVR could be matched to a clinically similar patient treated with SAVR.22 As such, these TAVR models should be used to estimate a patient’s risk of short-term mortality, but should not contribute to the decision between TAVR and SAVR.

The decision to select SAVR over TAVR is typically driven by factors other than short- or long-term mortality (eg, whether TAVR will be covered by insurance, very young age and concern about valve durability, the need to treat concomitant mitral regurgitation or aortopathy), as clinical trials have shown that survival and quality-of-life outcomes are at least as good with TAVR as with SAVR.6,7,9,23 In fact, in an analysis that compared similar patients treated with TAVR versus SAVR and specifically looked for patient factors that might make one treatment preferable to the other, patients who had a prior cardiac operation and those on home oxygen were more likely to do better with TAVR, whereas no patient factors favoring SAVR were found.24 The majority of patients, however, were expected to have similar long-term outcomes regardless of treatment choice; as such, the benefit of TAVR appears to be mostly an earlier and easier recovery.

Long-Term Outcomes: Estimating the Risk for Failure to Recover

While many patients who undergo TAVR are quite ill prior to the procedure, with substantial limitations due to the fatigue and shortness of breath associated with severe aortic stenosis, most patients recover well after the procedure, with marked improvement in symptoms and functional capacity. Nonetheless, approximately 25% to 35% of patients currently treated with TAVR commercially (ie, intermediate- and high-surgical-risk patients) either die or do not recover a reasonable quality of life after the procedure, and identifying those patients prior to the procedure can be challenging. We have previously developed and externally validated a risk model to estimate the risk of dying or having a persistently poor quality of life at 1 year after TAVR.25,26 The factors that most significantly increase a patient’s risk for poor outcomes are very poor functional status prior to TAVR, requiring home oxygen, chronic renal insufficiency, atrial fibrillation, dependency in activities of daily living, and dementia. For example, a patient who is short of breath at rest, is on home oxygen, has a serum creatinine of 2.5 mg/dL, and has atrial fibrillation has an estimated risk of poor outcome at 1 year of ~70%. However, ~25% of patients with no risk factors for poor outcomes (ie, those considered “low risk”) still have a poor outcome at 1 year after TAVR, as patients who undergo TAVR are typically of advanced age with at least some comorbidities. Therefore, a 1-year mortality rate of 10% to 15% would not be unexpected in this population independent of TAVR, although this will likely change over time as TAVR expands to patients at low surgical risk.

Beyond clinical factors, frailty negatively impacts both survival and quality of life after TAVR. Frailty is a geriatric syndrome of impaired physiologic reserve and decreased resistance to stressors27 that is characterized by weakness, slowness, exhaustion, wasting, and low activity level. Across a wide variety of clinical situations (eg, pneumonia,28 myocardial infarction,29 general30,31 and cardiac surgery32,33), frailty increases the risk of morbidity and mortality after nearly any intervention34 or clinical insult, independent of traditional demographic and clinical risk factors. Frail patients often do better with less invasive interventions such as TAVR than with traditional surgery, but nonetheless remain at increased risk for death35-37 or failure to recover quality of life and functional status25,37 after TAVR. However, there are unique challenges in both assessing and managing frailty in patients who are considered potential candidates for TAVR. One challenge is the lack of a laboratory or radiologic test for frailty; instead, the impaired physiologic reserve of frailty is identified through a combination of findings, such as slow gait speed, weak grip strength, and unintentional weight loss. While these findings readily identify frail patients in general elderly populations, in patients with severe symptomatic aortic stenosis they can be driven by the disease process itself. This distinction is important because slow gait speed that is due to aortic stenosis will be “fixed” by TAVR, whereas slow gait speed from frailty identifies a patient who will have a difficult time recovering from the procedure. For example, in the CoreValve High Risk Pivotal Trial, 80% of patients had slow gait speed and 67% had weak grip strength,5 and yet 58% of patients in this trial were alive with a reasonable quality of life 1 year after TAVR.6 A number of studies have attempted to define true frailty within the pre-TAVR population (ie, decreased physiologic reserve and an impaired ability to recover from an insult), and the factors that appear to be most prognostically important are malnutrition38 or unintentional weight loss25 and the inability to be independent in activities of daily living (eg, dressing, feeding, transferring).25,37

Even with frailty assessments, predicting who is or is not going to have a poor outcome after TAVR (ie, using pre-procedural factors to identify patients who perhaps should not be offered TAVR because they will not recover from the procedure) is exceedingly difficult. The Table shows how to grossly estimate risk using the major factors that drive it, based on the more precise estimates from our models.25,26

Table. Estimation of Risk for Poor Outcome

The model shown in the Table can be used to estimate a patient’s risk for a poor outcome, but it should be noted that even at the extreme high end of risk, there will be some patients who still do well after TAVR. Furthermore, being high risk for a poor outcome after TAVR does not imply anything about how the patient would do without TAVR, as many of these patients would likely die even sooner or have worse quality of life with medical therapy only. However, if a patient has ≥ 2 or 3 major risk factors for a poor outcome, it may be worthwhile to have a serious conversation with the patient (and family) about this risk and the uncertainty about the degree of recovery expected after TAVR.
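
As a rough companion to the Table, the sketch below tallies the major risk factors named above and maps the count to a qualitative band. The factor list follows the text; the bands and cut points themselves are hypothetical placeholders, not the published estimates from our models.

```python
# Major risk factors for death or persistently poor quality of life at
# 1 year after TAVR, per the text above. The bands below are illustrative
# placeholders, not the published model's estimates.

MAJOR_RISK_FACTORS = (
    "very_poor_functional_status",
    "home_oxygen",
    "chronic_renal_insufficiency",
    "atrial_fibrillation",
    "adl_dependency",
    "dementia",
)

def gross_risk_band(patient: dict) -> str:
    """Map the number of major risk factors present to a qualitative band."""
    count = sum(bool(patient.get(factor)) for factor in MAJOR_RISK_FACTORS)
    if count == 0:
        # Even "low-risk" patients have ~25% poor outcomes at 1 year
        return "baseline risk"
    if count == 1:
        return "elevated risk"
    return "high risk: discuss expected recovery with patient and family"

print(gross_risk_band({"home_oxygen": True, "atrial_fibrillation": True}))
# -> high risk: discuss expected recovery with patient and family
```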

Conclusion

Calculating the risk of TAVR can be complicated. In patients who are electively treated using transfemoral access and a less invasive approach, the short-term risk of mortality is very low. Risk calculators can be used to estimate short-term risk, but the patients at high risk for in-hospital mortality are often fairly easy to recognize, as the factors that drive that risk are not subtle (eg, the patient is in shock at the time of the procedure). The true risk of TAVR lies in the inability to recover from the procedure—being chronically ill, frail, or debilitated to a degree that the patient either dies or fails to recover a reasonable quality of life. Given the overlap of symptomatic aortic stenosis with true frailty, it is often difficult to identify the patients who will not thrive after TAVR. Understanding the patient factors that most strongly drive the risk of poor outcomes after TAVR, and allowing this information to guide the conversation prior to TAVR so as to set appropriate expectations for recovery, is a good place to start.

Corresponding author: Suzanne V. Arnold, MD, MHA, 4401 Wornall Rd., Kansas City, MO 64111.

Financial disclosures: This work was funded in part by grant K23HL116799 from the National Institutes of Health.


References

1. Leon MB, Smith CR, Mack M, et al. Transcatheter aortic-valve implantation for aortic stenosis in patients who cannot undergo surgery. N Engl J Med. 2010;363:1597-1607.

2. Reynolds MR, Magnuson EA, Lei Y, et al. Health-related quality of life after transcatheter aortic valve replacement in inoperable patients with severe aortic stenosis. Circulation. 2011;124:1964-1972.

3. Smith CR, Leon MB, Mack MJ, et al. Transcatheter versus surgical aortic-valve replacement in high-risk patients. N Engl J Med. 2011;364:2187-2198.

4. Reynolds MR, Magnuson EA, Wang K, et al. Health-related quality of life after transcatheter or surgical aortic valve replacement in high-risk patients with severe aortic stenosis: results from the PARTNER (Placement of AoRTic TraNscathetER Valve) trial (Cohort A). J Am Coll Cardiol. 2012;60:548-558.

5. Adams DH, Popma JJ, Reardon MJ, et al. Transcatheter aortic-valve replacement with a self-expanding prosthesis. N Engl J Med. 2014;370:1790-1798.

6. Arnold SV, Reynolds MR, Wang K, et al. Health status after transcatheter or surgical aortic valve replacement in patients with severe aortic stenosis at increased surgical risk: results from the CoreValve US Pivotal trial. JACC Cardiovasc Interv. 2015;8:1207-1217.

7. Leon MB, Smith CR, Mack MJ, et al. Transcatheter or surgical aortic-valve replacement in intermediate-risk patients. N Engl J Med. 2016;374:1609-1620.

8. Reardon MJ, Van Mieghem NM, Popma JJ, et al. Surgical or transcatheter aortic-valve replacement in intermediate-risk patients. N Engl J Med. 2017;376:1321-1331.

9. Baron SJ, Arnold SV, Wang K, et al. Health status benefits of transcatheter vs surgical aortic valve replacement in patients with severe aortic stenosis at intermediate surgical risk: results from the PARTNER 2 randomized clinical trial. JAMA Cardiol. 2017;2:837-845.

10. Reynolds MR, Magnuson EA, Wang K, et al. Cost-effectiveness of transcatheter aortic valve replacement compared with standard care among inoperable patients with severe aortic stenosis: results from the placement of aortic transcatheter valves (PARTNER) trial (Cohort B). Circulation. 2012;125:1102-1109.

11. Popma JJ, Adams DH, Reardon MJ, et al. Transcatheter aortic valve replacement using a self-expanding bioprosthesis in patients with severe aortic stenosis at extreme risk for surgery. J Am Coll Cardiol. 2014;63:1972-1981.

12. Vahanian A, Alfieri O, Al-Attar N, et al. Transcatheter valve implantation for patients with aortic stenosis: a position statement from the European Association of Cardio-Thoracic Surgery (EACTS) and the European Society of Cardiology (ESC), in collaboration with the European Association of Percutaneous Cardiovascular Interventions (EAPCI). Eur Heart J. 2008;29:1463-1470.

13. Holmes DR Jr, Mack MJ, Kaul S, et al. 2012 ACCF/AATS/SCAI/STS expert consensus document on transcatheter aortic valve replacement. J Am Coll Cardiol. 2012;59:1200-1254.

14. US Food and Drug Administration. FDA Executive Summary: Edwards SAPIEN™ Transcatheter Heart Valve. Presented July 20, 2011, Gaithersburg, MD.

15. Centers for Medicare & Medicaid Services. Decision Memo for Transcatheter Aortic Valve Replacement (TAVR) (CAG-00430N). May 5, 2012.

16. Mack MJ, Brennan JM, Brindis R, et al. Outcomes following transcatheter aortic valve replacement in the United States. JAMA. 2013;310:2069-2077.

17. Thourani VH, Kodali S, Makkar RR, et al. Transcatheter aortic valve replacement versus surgical valve replacement in intermediate-risk patients: a propensity score analysis. Lancet. 2016;387:2218-2225.

18. Mack MJ, Leon MB, Thourani VH, et al. Transcatheter aortic-valve replacement with a balloon-expandable valve in low-risk patients. N Engl J Med. 2019;380:1695-1705.

19. Popma JJ, Deeb GM, Yakubov SJ, et al. Transcatheter aortic-valve replacement with a self-expanding valve in low-risk patients. N Engl J Med. 2019;380:1706-1715.

20. Edwards FH, Peterson ED, Coombs LP, et al. Prediction of operative mortality after valve replacement surgery. J Am Coll Cardiol. 2001;37:885-892.

21. Arnold SV, O’Brien SM, Vemulapalli S, et al. Inclusion of functional status measures in the risk adjustment of 30-day mortality after transcatheter aortic valve replacement: a report from the Society of Thoracic Surgeons/American College of Cardiology TVT Registry. JACC Cardiovasc Interv. 2018;11:581-589.

22. Brennan JM, Thomas L, Cohen DJ, et al. Transcatheter versus surgical aortic valve replacement: propensity-matched comparison. J Am Coll Cardiol. 2017;70:439-450.

23. Reardon MJ, Adams DH, Kleiman NS, et al. 2-year outcomes in patients undergoing surgical or self-expanding transcatheter aortic valve replacement. J Am Coll Cardiol. 2015;66:113-121.

24. Baron SJ, Cohen DJ, Suchindran S, et al. Development of a risk prediction model for 1-year mortality after surgical vs. transcatheter aortic valve replacement in patients with severe aortic stenosis. Circulation. 2016;134:A20166.

25. Arnold SV, Afilalo J, Spertus JA, et al. Prediction of poor outcome after transcatheter aortic valve replacement. J Am Coll Cardiol. 2016;68:1868-1877.

26. Arnold SV, Reynolds MR, Lei Y, et al. Predictors of poor outcomes after transcatheter aortic valve replacement: results from the PARTNER (Placement of Aortic Transcatheter Valve) trial. Circulation. 2014;129:2682-2690.

27. Fried LP, Hadley EC, Walston JD, et al. From bedside to bench: research agenda for frailty. Sci Aging Knowledge Environ. 2005;2005:pe24.

28. Torres OH, Munoz J, Ruiz D, et al. Outcome predictors of pneumonia in elderly patients: importance of functional assessment. J Am Geriatr Soc. 2004;52:1603-1609.

29. Ekerstad N, Swahn E, Janzon M, et al. Frailty is independently associated with short-term outcomes for elderly patients with non-ST-segment elevation myocardial infarction. Circulation. 2011;124:2397-2404.

30. Makary MA, Segev DL, Pronovost PJ, et al. Frailty as a predictor of surgical outcomes in older patients. J Am Coll Surg. 2010;210:901-908.

31. Hewitt J, Moug SJ, Middleton M, et al. Prevalence of frailty and its association with mortality in general surgery. Am J Surg. 2015;209:254-259.

32. Sundermann S, Dademasch A, Praetorius J, et al. Comprehensive assessment of frailty for elderly high-risk patients undergoing cardiac surgery. Eur J Cardiothorac Surg. 2011;39:33-37.

33. Afilalo J, Mottillo S, Eisenberg MJ, et al. Addition of frailty and disability to cardiac surgery risk scores identifies elderly patients at high risk of mortality or major morbidity. Circ Cardiovasc Qual Outcomes. 2012;5:222-228.

34. Lin HS, Watts JN, Peel NM, Hubbard RE. Frailty and post-operative outcomes in older surgical patients: a systematic review. BMC Geriatr. 2016;16:157.

35. Stortecky S, Schoenenberger AW, Moser A, et al. Evaluation of multidimensional geriatric assessment as a predictor of mortality and cardiovascular events after transcatheter aortic valve implantation. JACC Cardiovasc Interv. 2012;5:489-496.

36. Schoenenberger AW, Stortecky S, Neumann S, et al. Predictors of functional decline in elderly patients undergoing transcatheter aortic valve implantation (TAVI). Eur Heart J. 2013;34:684-689.

37. Green P, Arnold SV, Cohen DJ, et al. Relation of frailty to outcomes after transcatheter aortic valve replacement (from the PARTNER trial). Am J Cardiol. 2015;116:264-269.

38. Goldfarb M, Lauck S, Webb J, et al. Malnutrition and mortality in frail and non-frail older adults undergoing aortic valve replacement. Circulation. 2018;138:2202-2211.


Use of Hybrid Coronary Revascularization in Patients with Multivessel Coronary Artery Disease


Study Overview

Objective. To investigate the 5-year clinical outcome of patients undergoing hybrid revascularization for multivessel coronary artery disease (CAD).

Design. Multicenter, open-label, prospective randomized controlled trial.

Setting and participants. A total of 200 patients with multivessel CAD referred for conventional surgical revascularization were randomly assigned to undergo hybrid coronary revascularization (HCR) or coronary artery bypass grafting (CABG).

Main outcome measures. The primary endpoint was all-cause mortality at 5 years.

Main results. After excluding 9 patients who were lost to follow-up before 5 years, 191 patients (94 in the HCR group and 97 in the CABG group) formed the basis of the study. All-cause mortality at 5-year follow-up was similar in the 2 groups (6.4% versus 9.2%, P = 0.69). The rates of myocardial infarction (4.3% versus 7.2%, P = 0.30), repeat revascularization (37.2% versus 45.4%, P = 0.38), stroke (2.1% versus 4.1%, P = 0.35), and major adverse cardiac and cerebrovascular events (45.2% versus 53.4%, P = 0.39) were also similar in the 2 groups. These findings were consistent across all levels of risk for surgical complications (EuroScore) and complexity of revascularization (SYNTAX score).

Conclusion. HCR has similar 5-year all-cause mortality when compared with conventional CABG.

Commentary

HCR has been proposed as a less invasive, effective alternative revascularization strategy to conventional CABG for patients with multivessel CAD. The hybrid approach typically combines the long-term durability of left internal mammary artery grafting of the left anterior descending artery (LAD) with percutaneous coronary intervention (PCI) for non-LAD stenoses; PCI has been shown to offer similar or perhaps even better long-term patency than saphenous vein grafts.1,2 Previous studies have demonstrated the feasibility of HCR by comparing HCR to conventional CABG at 1 year.2 However, the long-term outcome of HCR compared with conventional CABG has not been previously reported.


In this context, Tajstra et al reported the 5-year follow-up from their prospective randomized pilot study. Among the 200 patients with multivessel coronary disease randomly assigned to either HCR or CABG, all-cause mortality at 5-year follow-up was similar in the 2 groups (6.4% versus 9.2%, P = 0.69). The rates of myocardial infarction, repeat revascularization, stroke, and major adverse cardiac and cerebrovascular events (MACCE) were also similar in the 2 groups.

This is an important study because it is the first to compare the long-term outcome of HCR with that of conventional CABG; previous studies have been limited by their short- to mid-term follow-up.2 However, because this study was not powered to assess the superiority of HCR over conventional CABG, larger randomized controlled trials are needed.

Future studies must address some important questions. First, the patients in the present study were younger (mean age, 62.1 ± 8.3 years), with less comorbidity and a relatively low SYNTAX score (23.6 ± 6.1 in the HCR arm). As CABG and PCI are associated with similar long-term outcomes in patients with low (< 22) to intermediate (22–32) SYNTAX scores,3 comparisons between HCR and multivessel PCI using the current generation of drug-eluting stents are needed. The results from the ongoing Hybrid Coronary Revascularization Trial (NCT03089398) will shed light on this clinical question. Second, whether these findings can be extended to patients with a high baseline SYNTAX score needs further study; notably, outcomes were similar between the 2 strategies in the intermediate (n = 98) and high (n = 8) SYNTAX score groups. Interestingly, there was no clear benefit of HCR in the high-surgical-risk group as measured by EuroScore. Third, in addition to the hard outcomes (death and MACCE), patients’ quality of life, measured by an established metric such as the Seattle Angina Questionnaire, needs to be assessed. Last, the completeness of revascularization in each group needs further evaluation, because incomplete revascularization is a known predictor of adverse outcomes.4,5
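
For reference, the cited SYNTAX strata can be expressed as a simple classification; the cut points below follow the low (< 22) and intermediate (22–32) ranges quoted above, with higher scores treated as high complexity.

```python
def syntax_category(score: float) -> str:
    """Classify anatomic complexity by SYNTAX score (cut points per text)."""
    if score < 22:
        return "low"
    if score <= 32:
        return "intermediate"
    return "high"

# The mean score in the HCR arm (23.6) falls in the intermediate range
print(syntax_category(23.6))  # -> intermediate
```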

Applications for Clinical Practice

In patients with multivessel coronary disease and a low SYNTAX score, the 5-year outcome of HCR was similar to that of conventional CABG. Larger studies are needed to assess the superiority of this approach.

—Taishi Hirai, MD, University of Missouri Medical Center, Columbia, MO; Hiroto Kitahara, MD, University of Chicago Medical Center, Chicago, IL; and John Blair, MD, Medstar Washington Hospital Center, Washington, DC

References

1. Lee PH, Kwon O, Ahn JM, et al. Safety and effectiveness of second-generation drug-eluting stents in patients with left main coronary artery disease. J Am Coll Cardiol. 2018;71:832-841.

2. Gasior M, Zembala MO, Tajstra M, et al. Hybrid revascularization for multivessel coronary artery disease. JACC Cardiovasc Interv. 2014;7:1277-1283.

3. Serruys PW, Onuma Y, Garg S, et al. Assessment of the SYNTAX score in the Syntax study. EuroIntervention. 2009;5:50-56.

4. Genereux P, Palmerini T, Caixeta A, et al. Quantification and impact of untreated coronary artery disease after percutaneous coronary intervention: the residual SYNTAX (Synergy Between PCI with Taxus and Cardiac Surgery) score. J Am Coll Cardiol. 2012;59:2165-2174.

5. Choi KH, Lee JM, Koo BK, et al. Prognostic implication of functional incomplete revascularization and residual functional SYNTAX score in patients with coronary artery disease. JACC Cardiovasc Interv. 2018;11:237-245.


Study Overview

Objective. To investigate the 5-year clinical outcome of patients undergoing hybrid revascularization for multivessel coronary artery disease (CAD).

Design. Multicenter, open-label, prospective randomized control trial.

Setting and participants. 200 patients with multivessel CAD referred for conventional surgical revascularization were randomly assigned to undergo hybrid coronary revascularization (HCR) or coronary artery bypass grafting (CABG).

Main outcome measures. The primary endpoint was all-cause mortality at 5 years.

Main results. After excluding 9 patients who were lost to follow-up before 5 years, 191 patients (94 in HCR group and 97 in CABG group) formed the basis of the study. All-cause mortality at 5-year follow-up was similar in the 2 groups (6.4% versus 9.2%, P = 0.69). The rates of myocardial infarction (4.3% versus 7.2%, P = 0.30), repeat revascularization (37.2% versus 45.4%, P = 0.38), stroke (2.1% versus 4.1%, P = 0.35), and major adverse and cardiac and cerebrovascular events (45.2% versus 53.4%, P = 0.39) were similar in the 2 groups. These findings were consistent across all levels of risk for surgical complications (EuroScore) and for complexity of revascularization (SYNTAX score).

Conclusion. HCR has similar 5-year all-cause mortality when compared with conventional CABG.

Commentary

HCR has been proposed as a less invasive, effective alternative revascularization strategy to conventional CABG for patients with multivessel CAD. The hybrid approach typically combines the long-term durability of grafting of the left anterior descending artery (LAD) using the left internal mammary artery and the percutaneous coronary intervention (PCI) for non-LAD stenosis; this approach has been shown to have similar or perhaps even better long-term patency compared with saphenous vein grafts.1,2 Previous studies have demonstrated the feasibility of HCR by comparing HCR to conventional CABG at 1 year.2 However, the long-term outcome of HCR compared to conventional CABG has not been previously reported.

 

 

In this context, Tajstra et al reported the 5-year follow-up from their prospective randomized pilot study. They report that among the 200 patients with multivessel coronary disease randomly assigned to either HCR or CABG, all-cause mortality at 5-year follow-up was similar in the 2 groups (6.4% versus 9.2%, P = 0.69). The rates of myocardial infarction, repeat revascularization, stroke, and major adverse and cardiac and cerebrovascular event (MACCE) were also similar in the 2 groups.

This is an important study because it is the first to compare the long-term outcome of HCR with conventional CABG; previous studies have been limited due to their short- to mid-term follow-up.2 However, because this study was not powered to assess the superiority of the HCR compared to conventional CABG, future randomized control trials with a larger number of patients are needed.

Future studies must address some important questions. First, the patients in the present study were younger (mean age, 62.1 ± 8.3 years), with less comorbidity and a relatively low SYNTAX score (23.6 ± 6.1 in the HCR arm). Because CABG and PCI are associated with similar long-term outcomes in patients with low (< 22) to intermediate (22–32) SYNTAX scores,3 comparisons between HCR and multivessel PCI using the current generation of drug-eluting stents are needed; the results of the ongoing Hybrid Coronary Revascularization Trial (NCT03089398) should shed light on this question. Second, whether these findings can be extended to patients with a high baseline SYNTAX score needs further study; although outcomes were similar between the 2 strategies in the intermediate (n = 98) and high (n = 8) SYNTAX score subgroups, the high-score subgroup was too small to support firm conclusions. Interestingly, there was no clear benefit of HCR in the high surgical risk group as measured by EuroSCORE. Third, in addition to the hard outcomes (death and MACCE), patients' quality of life, measured with an established instrument such as the Seattle Angina Questionnaire, needs to be assessed. Last, the completeness of revascularization in each group needs further evaluation, because incomplete revascularization is a known predictor of adverse outcomes.4,5
Applications for Clinical Practice

In patients with multivessel coronary disease and a low SYNTAX score, 5-year outcomes of HCR were similar to those of conventional CABG. Larger studies are needed to assess whether this approach is superior.

—Taishi Hirai, MD, University of Missouri Medical Center, Columbia, MO; Hiroto Kitahara, MD, University of Chicago Medical Center, Chicago, IL; and John Blair, MD, Medstar Washington Hospital Center, Washington, DC

References

1. Lee PH, Kwon O, Ahn JM, et al. Safety and effectiveness of second-generation drug-eluting stents in patients with left main coronary artery disease. J Am Coll Cardiol. 2018;71:832-841.

2. Gasior M, Zembala MO, Tajstra M, et al. Hybrid revascularization for multivessel coronary artery disease. JACC Cardiovasc Interv. 2014;7:1277-1283.

3. Serruys PW, Onuma Y, Garg S, et al. Assessment of the SYNTAX score in the Syntax study. EuroIntervention. 2009;5:50-56.

4. Genereux P, Palmerini T, Caixeta A, et al. Quantification and impact of untreated coronary artery disease after percutaneous coronary intervention: the residual SYNTAX (Synergy Between PCI with Taxus and Cardiac Surgery) score. J Am Coll Cardiol. 2012;59:2165-2174.

5. Choi KH, Lee JM, Koo BK, et al. Prognostic implication of functional incomplete revascularization and residual functional SYNTAX score in patients with coronary artery disease. JACC Cardiovasc Interv. 2018;11:237-245.

Mismatch Between Process and Outcome Measures for Hospital-Acquired Venous Thromboembolism in a Surgical Cohort

From Tufts Medical Center, Boston, MA.

Abstract

  • Objective: Audits at our academic medical center revealed near 100% compliance with protocols for perioperative venous thromboembolism (VTE) prophylaxis, but recent National Surgical Quality Improvement Program data demonstrated a higher than expected incidence of VTE (observed/expected = 1.32). The objective of this study was to identify potential causes of this discrepancy.
  • Design: Retrospective case-control study.
  • Setting: Urban academic medical center with high case-mix indices (Medicare approximately 2.4, non-Medicare approximately 2.0).
  • Participants: 102 surgical inpatients with VTE (September 2012 to October 2015) matched with controls for age, gender, and type of procedure.
  • Measurements: Prevalence of common VTE risk factors, length of stay, number of procedures, index operation times, and postoperative bed rest > 12 hours were assessed. Utilization of and compliance with our VTE risk assessment tool were also investigated.
  • Results: Cases underwent more procedures and had longer lengths of stay and index procedures than controls. In addition, cases were more likely to have had > 12 hours of postoperative bed rest and central venous access than controls. Cases had more infections and were more likely to have severe lung disease, thrombophilia, and a history of prior VTE than controls. No differences in body mass index, tobacco use, current or previous malignancy, or VTE risk assessment form use were observed. Overall, care complexity and risk factors were equally important in determining VTE incidence. Our analyses also revealed lack of strict adherence to our VTE risk stratification protocol and frequent use of suboptimal prophylactic regimens.
  • Conclusion: Well-accepted risk factors and overall care complexity determine VTE risk. Preventing VTE in high-risk patients requires assiduous attention to detail in VTE risk assessment and in delivery of optimal prophylaxis. Patients at especially high risk may require customized prophylactic regimens.

Keywords: hospital-acquired venous thromboembolic disease; VTE prophylaxis; surgical patients.

Deep vein thrombosis (DVT) and pulmonary embolism (PE) are well-recognized causes of morbidity and mortality in surgical patients. Between 350,000 and 600,000 cases of venous thromboembolism (VTE) occur each year in the United States, and it is responsible for approximately 10% of preventable in-hospital fatalities.1-3 Given VTE’s impact on patients and the healthcare system and the fact that it is preventable, intense effort has been focused on developing more effective prophylactic measures to decrease its incidence.2-4 In 2008, the surgeon general issued a “call to action” for increased efforts to prevent VTE.5

The American College of Chest Physicians (ACCP) guidelines subcategorize patients based on type of surgery. In addition, the ACCP guidelines support the use of a Caprini-based scoring system to aid in risk stratification and improve clinical decision-making (Table 1).4,6-9 In general, scores ≥ 5 qualify individuals as high risk. Based on their risk category, patients receive mechanical prophylaxis, chemical prophylaxis, or a combination of the 2. Lower-risk patients who are ambulatory typically receive only mechanical prophylaxis while in bed, whereas higher-risk patients receive a combination of mechanical prophylaxis and chemoprophylaxis measures.7 In general, low-molecular-weight heparin (40 mg daily) and low-dose unfractionated heparin (5000 units 3 times daily) have been the standard evidence-based options for chemoprophylaxis in surgical patients. Absolute contraindications for prophylaxis include active bleeding and known increased risk of bleeding based on patient- or procedure-specific factors.
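
To make the scoring concrete, the following sketch (in Python) sums a Caprini-type score over a small, illustrative subset of items; the weights are drawn from the 2005 model as commonly published, so this is a teaching sketch to be checked against Table 1, not the validated instrument.

```python
# Illustrative subset of Caprini-type items (weights per the 2005 model;
# a teaching sketch, not the full validated instrument).
CAPRINI_WEIGHTS = {
    "age_41_60": 1,
    "bmi_over_25": 1,
    "major_surgery_over_45_min": 2,
    "central_venous_access": 2,
    "malignancy": 2,
    "history_of_vte": 3,
    "thrombophilia": 3,
    "hip_pelvis_leg_fracture": 5,
}

def caprini_score(risk_factors: set) -> int:
    """Sum the weights of the risk factors present for one patient."""
    return sum(CAPRINI_WEIGHTS[f] for f in risk_factors)

patient = {"age_41_60", "major_surgery_over_45_min", "history_of_vte"}
score = caprini_score(patient)
# Scores >= 5 qualify as high risk, per the threshold described above.
print(score, "high risk" if score >= 5 else "lower risk")  # 6 high risk
```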

Table 1. Caprini Risk Assessment Model

Our hospital, a 350-bed academic medical center in downtown Boston, MA, serving a diverse population with a very high case-mix index (2.4 Medicare and 2.0 non-Medicare), has strict protocols for VTE prophylaxis consistent with the ACCP guidelines and based on the Surgical Care Improvement Project (SCIP) measures published in 2006.10 The SCIP mandates allow for considerable surgeon discretion in the use of chemoprophylaxis for neurosurgical cases and general and orthopedic surgery cases deemed to be at high risk for bleeding. In addition, SCIP requires only that prophylaxis be initiated within 24 hours of surgical end time. Although recent audits revealed nearly 100% compliance with SCIP-mandated protocols, National Surgical Quality Improvement Program (NSQIP) data showed that the incidence of VTE events at our institution was higher than expected (observed/expected [O/E] = 1.32).
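
The O/E ratio reported here is simply the count of observed events divided by the risk-adjusted expected count. As a minimal sketch (Python with scipy; the counts below are hypothetical and chosen only to give a ratio near 1.32, since NSQIP reports the ratio rather than raw counts), one can also attach an exact Poisson confidence interval to such a ratio:

```python
from scipy.stats import chi2

def oe_ratio(observed: int, expected: float, alpha: float = 0.05):
    """Observed/expected event ratio with an exact Poisson confidence
    interval for the observed count, scaled by the expected count."""
    lower = 0.5 * chi2.ppf(alpha / 2, 2 * observed) / expected
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / expected
    return observed / expected, (lower, upper)

# Hypothetical counts, chosen only to illustrate a ratio near 1.32.
ratio, (lo, hi) = oe_ratio(observed=66, expected=50.0)
print(f"O/E = {ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```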

To determine the reasons for this mismatch between process and outcome performance, we investigated whether characteristics of our patient population contributed to the higher than expected rates of VTE, and we scrutinized our VTE prophylaxis protocol to determine whether aspects of our process were also contributory.

Methods

Study Sample

This is a retrospective case-control study of surgical inpatients at our hospital during the period September 2012 to October 2015. Cases were identified as patients diagnosed with a VTE (DVT or PE). Controls were identified from a pool of surgical patients whose courses were not complicated by VTE during the same time frame as the cases and who were matched as closely as possible by procedure code, age, and gender.

Variables

Patient and hospital course variables analyzed included demographics, comorbidities, length of stay, number of procedures, index operation times, duration of postoperative bed rest, use of mechanical prophylaxis, and the type of chemoprophylaxis and the time frame within which it was initiated. Data were collected via chart review, using International Classification of Diseases-9 and -10 codes to identify surgical patients diagnosed with VTE within the study period. Demographic variables included age, sex, and ethnicity. Comorbidities included hypertension, diabetes, coronary artery disease, serious lung disease, previous or current malignancy, documented hypercoagulable state, and previous history of VTE. Body mass index (BMI) was also recorded. These disease-specific variables were not matched between the case and control groups, as these data were obtained retrospectively during chart review.

Analysis

Associations between cases and their matched controls were analyzed using the paired t-test for continuous variables and McNemar's test for categorical variables. P values < 0.05 were considered statistically significant. SAS Enterprise Guide 7.15 (SAS Institute, Cary, NC) was used for all statistical analyses.
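
For readers who want to reproduce this style of matched-pair analysis, the following sketch (Python with scipy and statsmodels rather than SAS; all data are invented) applies the two tests named above to one continuous and one binary variable.

```python
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Invented data for 8 matched case-control pairs.
case_los = np.array([15, 9, 22, 7, 30, 12, 18, 25])  # length of stay, cases
control_los = np.array([3, 4, 5, 2, 8, 3, 6, 4])     # matched controls

# Paired t-test for a continuous variable.
t_stat, p_value = ttest_rel(case_los, control_los)
print(f"Paired t-test: t = {t_stat:.2f}, P = {p_value:.4f}")

# McNemar's test for a binary exposure (e.g., central venous access):
# rows = case exposed (yes/no), columns = control exposed (yes/no).
pairs = np.array([[2, 5],
                  [1, 0]])
result = mcnemar(pairs, exact=True)
print(f"McNemar's test: P = {result.pvalue:.4f}")
```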

The requirement for informed consent was waived by our Institutional Review Board, as the study was initially deemed to be a quality improvement project, and all data used for this report were de-identified.

Results

Our retrospective case-control analysis included a sample of 102 surgical patients whose courses were complicated by VTE between September 2012 and October 2015. The cases were distributed among 6 different surgical categories (Figure 1): trauma (20%), cancer (10%), cardiovascular (21%), noncancer neurosurgery (28%), elective orthopedics (11%), and miscellaneous general surgery (10%).

Figure 1. Distribution of procedure type.

Comparisons between cases and controls in terms of patient demographics and risk factors are shown in Table 2. No statistically significant difference was observed in ethnicity or race between the 2 groups. Overall, cases had more hip/pelvis/leg fractures at presentation (P = 0.0008). The case group also had higher proportions of patients with postoperative bed rest greater than 12 hours (P = 0.009), central venous access (P < 0.0001), infection (P < 0.0001), and lower extremity edema documented during the hospitalization prior to development of DVT (P < 0.0001). Additionally, cases had significantly greater rates of previous VTE (P = 0.0004), inherited or acquired thrombophilia (P = 0.03), history of stroke (P = 0.0003), and severe lung disease, including pneumonia (P = 0.0008). No significant differences were noted between cases and matched controls in BMI (P = 0.43), current tobacco use (P = 0.71), current malignancy (P = 0.80), previous malignancy (P = 0.83), head trauma (P = 0.17), or acute cardiac disease (myocardial infarction or congestive heart failure; P = 0.12).

Table 2. Patient Demographics and Risk Factors

Variables felt to indicate overall complexity of hospital course for cases as compared to controls are outlined in Table 3. Cases were found to have significantly longer lengths of stay (median, 15.5 days versus 3 days, P < 0.0001). To account for the possibility that the development of VTE contributed to the increased length of stay in the cases, we also looked at the duration between admission date and the date of VTE diagnosis and determined that cases still had a longer length of stay when this was accounted for (median, 7 days versus 3 days, P < 0.0001). A much higher proportion of cases underwent more than 1 procedure compared to controls (P < 0.0001), and cases had significantly longer index operations as compared to controls (P = 0.002).

Table 3. Complexity of Care

Seventeen cases received heparin on induction during their index procedure, compared to 23 controls (P = 0.24). Additionally, 63 cases began a prophylaxis regimen within 24 hours of surgery end time, compared to 68 controls (P = 0.24). The chemoprophylactic regimens utilized in cases and in controls are summarized in Figure 2. Of note, only 26 cases and 32 controls received standard prophylactic regimens with no missed doses (heparin 5000 units 3 times daily or enoxaparin 40 mg daily). Additionally, in over half of cases and a third of controls, nonstandard regimens were ordered. Examples of nonstandard regimens included nonstandard heparin or enoxaparin doses, low-dose warfarin, or aspirin alone. In most cases, nonstandard regimens were justified on the basis of high risk for bleeding.

Frequencies of prophylactic regimens utilized.

Mechanical prophylaxis with pneumatic sequential compression devices (SCDs) was ordered in 93 (91%) cases and 87 (85%) controls; however, we were unable to accurately document uniform compliance in the use of these devices.

With regard to our process measures, we found that only 17% of cases and controls combined actually had a VTE risk assessment in the chart, and when one was present, it was often incomplete or completed inaccurately.

Discussion

The goal of this study was to identify factors (patient characteristics and/or processes of care) that may be contributing to the higher than expected incidence of VTE events at our medical center, despite internal audits suggesting near perfect compliance with SCIP-mandated protocols. We found that, in addition to the usual risk factors for VTE, an overarching theme of our case cohort was its high complexity of illness. At baseline, these patients had significantly greater rates of stroke, thrombophilia, severe lung disease, infection, and history of VTE than controls. Moreover, the hospital courses of cases were significantly more complex than those of controls: cases had more procedures, longer lengths of stay, longer index operations, higher rates of postoperative bed rest exceeding 12 hours, and more prevalent central venous access (Tables 2 and 3). Several of these risk factors have been found to contribute to VTE development despite compliance with prophylaxis protocols.

Cassidy et al reviewed a cohort of nontrauma general surgery patients who developed VTE despite receiving appropriate prophylaxis and found that both multiple operations and emergency procedures contributed to the failure of VTE prophylaxis.11 Similarly, Wang et al identified several independent risk factors for VTE despite thromboprophylaxis, including central venous access and infection, as well as intensive care unit admission, hospitalization for cranial surgery, and admission from a long-term care facility.12 Although our study did not capture some of the additional factors considered by Wang et al, the presence of risk factors not captured by traditional assessment tools suggests that additional consideration is warranted for complex patients.

In addition to these nonmodifiable patient characteristics, aspects of our VTE prophylaxis processes likely contributed to the higher than expected rate of VTE. Although the electronic medical record at our institution contains a VTE risk assessment tool based on the Caprini score, we found that it often is not used at all, or is used incorrectly or incompletely, likely because physicians are neither prompted nor required to complete the assessment before prescribing VTE prophylaxis.

There is a significant body of evidence demonstrating that mandatory computerized VTE risk assessments can effectively reduce VTE rates and that improved outcomes occur shortly after implementation. Cassidy et al demonstrated the benefits of instituting a hospital-wide, mandatory, Caprini-based computerized VTE risk assessment that provides prophylaxis/early ambulation recommendations. Two years after implementing this system, they observed an 84% reduction in DVTs (P < 0.001) and a 55% reduction in PEs (P < 0.001).13 Nimeri et al had similarly impressive success, achieving a reduction in their NSQIP O/E for PE/DVT in general surgery from 6.00 in 2010 to 0.82 (for DVTs) and 0.78 (for PEs) 5 years after implementation of mandatory VTE risk assessment (though they noted that the most dramatic reduction occurred 1 year after implementation).14 Additionally, a recent systematic review and meta-analysis by Borab et al found computerized VTE risk assessments to be associated with a significant decrease in VTE events.15

The risk assessment tool used at our institution is qualitative in nature, and the current literature suggests that employing a more quantitative tool may yield improved outcomes. Numerous studies have highlighted the importance of identifying patients at very high risk for VTE, as higher risk may necessitate more careful consideration of their prophylactic regimens. Obi et al found that patients with Caprini scores higher than 8 were at significantly greater risk of developing VTE than patients with scores of 7 or 8, and that patients with scores of 7 or 8 were, in turn, significantly more likely to have a VTE than those with scores of 5 or 6.16 In another study, Lobastov et al identified Caprini scores of 11 or higher as representing an extremely high-risk category for which standard prophylaxis regimens may not be effective.17 Thus, while mandatory risk assessment has been shown to dramatically decrease VTE incidence, the magnitude of the numerical risk score must also be considered. This is particularly important at medical centers with high case-mix indices, where patients at the highest risk might need to be managed with different prophylactic guidelines.
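
To make these cut points concrete, the sketch below (Python) bands a Caprini score using the thresholds reported by Obi et al and Lobastov et al; the actions attached to each band are our illustrative assumptions, not a validated protocol.

```python
def vte_risk_band(caprini: int) -> str:
    """Band a Caprini score using cut points from the cited studies:
    5-6 and 7-8 versus > 8 (Obi et al), with >= 11 flagged as extreme
    risk for which standard prophylaxis may be ineffective (Lobastov et al)."""
    if caprini >= 11:
        return "extreme"
    if caprini > 8:
        return "very high"
    if caprini >= 7:
        return "high"
    if caprini >= 5:
        return "moderate-high"
    return "lower"

# Illustrative (assumed) escalation for each band; not a validated protocol.
ACTIONS = {
    "lower": "mechanical prophylaxis while in bed",
    "moderate-high": "mechanical prophylaxis plus standard chemoprophylaxis",
    "high": "standard combined prophylaxis; audit closely for missed doses",
    "very high": "combined prophylaxis; reassess regimen adequacy daily",
    "extreme": "consider a customized (e.g., weight-based) regimen",
}
band = vte_risk_band(9)
print(band, "->", ACTIONS[band])  # very high -> ...
```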

Another notable aspect of the process at our hospital was the wide variation in the prophylactic regimens ordered and in adherence to what was ordered. Only 25.5% of patients were maintained on a standard prophylactic regimen with no missed doses (heparin 5000 units every 8 hours or enoxaparin 40 mg daily). Thus, the vast majority of patients who went on to develop VTE either were prescribed a nontraditional prophylaxis regimen or missed doses of standard agents. The need for secondary surgical procedures or other invasive interventions may explain many, but not all, of the missed doses.

The timing of prophylaxis initiation for our patients also deviated from accepted standards. Only 16.8% of cases received prophylaxis upon induction of anesthesia, and 38% of cases did not receive any anticoagulation within 24 hours of the index operation. While this variability in prophylaxis implementation was acceptable within the SCIP guidelines on the basis of "high risk for bleeding" or other considerations, it likely contributed to our suboptimal outcomes. These variations and interruptions in prophylactic regimens reflect barriers that have previously been reported as contributing to noncompliance with VTE prophylaxis.18

Given these known barriers and the observed underutilization and improper use of our risk assessment tool, we recently changed our surgical admission order sets so that a mandatory quantitative risk assessment must be completed for every surgical patient at the time of admission/operation before other orders can be entered. Upon completion of the assessment, the physician is presented with an appropriate standard regimen based on the individual patient's risk. Early results of our VTE quality improvement project have been satisfying: in the most recent NSQIP semiannual report, our O/E for VTE was 0.74, placing us in the first decile. Some of this early improvement may simply reflect the Hawthorne effect; however, we are encouraged that other institutions have reported similarly rapid gains after implementing mandatory risk assessment. While we are hopeful that these changes will produce sustainable improvements in outcomes, patients at extremely high risk may require novel weight-based or otherwise customized aggressive prophylactic regimens; such regimens have already been proposed for arthroplasty and other high-risk patients.
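
A minimal sketch of the order-set gate described above (Python; the names and structure are ours, not the actual EMR implementation): admission orders cannot be finalized until a quantitative risk score has been recorded, after which a default regimen is suggested for the physician to accept or override.

```python
from typing import Optional

class IncompleteRiskAssessment(Exception):
    """Raised when admission orders are signed without a completed VTE risk assessment."""

def finalize_admission_orders(orders: dict, caprini_score: Optional[int]) -> dict:
    """Gate order entry on a completed quantitative VTE risk assessment,
    then attach a suggested default regimen (illustrative logic only)."""
    if caprini_score is None:
        raise IncompleteRiskAssessment(
            "Complete the VTE risk assessment before other orders can be signed."
        )
    if caprini_score >= 5:  # high risk, per the threshold used earlier
        suggestion = "mechanical prophylaxis plus standard chemoprophylaxis"
    else:
        suggestion = "mechanical prophylaxis while in bed"
    orders["suggested_vte_prophylaxis"] = suggestion  # physician may override
    return orders

# Usage: omitting the score raises; providing it attaches a default.
orders = finalize_admission_orders({"admit_to": "surgical ward"}, caprini_score=7)
print(orders["suggested_vte_prophylaxis"])
```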

Future research may identify other risk factors not captured by traditional risk assessments. In addition, research should continue to explore the use and efficacy of standard prophylactic regimens in these populations to help determine if they are sufficient. Currently, weight-based low-molecular-weight heparin dosing and alternative regimens employing fondaparinux are under investigation for very-high-risk patients.19

There were several limitations to the present study. First, because of the study's retrospective design, we could collect only data that had been uniformly recorded in the charts throughout the study period. Second, we were unable to accurately assess compliance with mechanical prophylaxis. While our chart review showed that the vast majority of cases and controls were ordered to have mechanical prophylaxis, it is impossible to document how often these devices were used appropriately in a retrospective analysis. Anecdotal observation suggests that once patients are out of post-anesthesia or critical care units, SCD use is not standardized. The inability to measure compliance precisely may lead to an overestimation of compliance with prophylaxis. Finally, because our study included only patients who underwent surgery at our hospital, our observations may not be generalizable outside our institution.

Conclusion

Our findings reinforce the importance of attention to detail in VTE risk assessment and in ordering and administering VTE prophylactic regimens, especially in high-risk surgical patients. Although we adhered to the SCIP-mandated prophylaxis requirements, the complexity of our patients and our lack of a truly standardized approach to risk assessment and prophylactic regimens resulted in suboptimal outcomes. Stricter, more quantitative mandatory VTE risk assessment and highly standardized VTE prophylaxis regimens are required to achieve optimal outcomes.

Corresponding author: Jason C. DeGiovanni, MS, BA, Jason.DeGiovanni@tufts.edu.

Financial disclosures: None.

References

1. Spyropoulos AC, Hussein M, Lin J, et al. Rates of symptomatic venous thromboembolism in US surgical patients: a retrospective administrative database study. J Thromb Thrombolysis. 2009;28:458-464.

2. Deitelzweig SB, Johnson BH, Lin J, et al. Prevalence of clinical venous thromboembolism in the USA: current trends and future projections. Am J Hematol. 2011;86:217-220.

3. Horlander KT, Mannino DM, Leeper KV. Pulmonary embolism mortality in the United States, 1979-1998: an analysis using multiple-cause mortality data. Arch Intern Med. 2003;163:1711-1717.

4. Guyatt GH, Akl EA, Crowther M, et al. Introduction to the ninth edition: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(suppl):48S-52S.

5. Office of the Surgeon General; National Heart, Lung, and Blood Institute. The Surgeon General’s Call to Action to Prevent Deep Vein Thrombosis and Pulmonary Embolism. Rockville, MD: Office of the Surgeon General; 2008. www.ncbi.nlm.nih.gov/books/NBK44178/. Accessed May 2, 2019.

6. Pannucci CJ, Swistun L, MacDonald JK, et al. Individualized venous thromboembolism risk stratification using the 2005 Caprini score to identify the benefits and harms of chemoprophylaxis in surgical patients: a meta-analysis. Ann Surg. 2017;265:1094-1102.

7. Caprini JA, Arcelus JI, Hasty JH, et al. Clinical assessment of venous thromboembolic risk in surgical patients. Semin Thromb Hemost. 1991;17(suppl 3):304-312.

8. Caprini JA. Risk assessment as a guide for the prevention of the many faces of venous thromboembolism. Am J Surg. 2010;199:S3-S10.

9. Gould MK, Garcia DA, Wren SM, et al. Prevention of VTE in nonorthopedic surgical patients: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2 Suppl):e227S-e277S.

10. The Joint Commission. Surgical Care Improvement Project (SCIP) Measure Information Form (Version 2.1c). www.jointcommission.org/surgical_care_improvement_project_scip_measure_information_form_version_21c/. Accessed June 22, 2016.

11. Cassidy MR, Macht RD, Rosenkranz P, et al. Patterns of failure of a standardized perioperative venous thromboembolism prophylaxis protocol. J Am Coll Surg. 2016;222:1074-1081.

12. Wang TF, Wong CA, Milligan PE, et al. Risk factors for inpatient venous thromboembolism despite thromboprophylaxis. Thromb Res. 2014;133:25-29.

13. Cassidy MR, Rosenkranz P, McAneny D. Reducing postoperative venous thromboembolism complications with a standardized risk-stratified prophylaxis protocol and mobilization program. J Am Coll Surg. 2014;218:1095-1104.

14. Nimeri AA, Gamaleldin MM, McKenna KL, et al. Reduction of venous thromboembolism in surgical patients using a mandatory risk-scoring system: 5-year follow-up of an American College of Surgeons National Quality Improvement Program. Clin Appl Thromb Hemost. 2017;23:392-396.

15. Borab ZM, Lanni MA, Tecce MG, et al. Use of computerized clinical decision support systems to prevent venous thromboembolism in surgical patients: a systematic review and meta-analysis. JAMA Surg. 2017;152:638–645.

16. Obi AT, Pannucci CJ, Nackashi A, et al. Validation of the Caprini venous thromboembolism risk assessment model in critically ill surgical patients. JAMA Surg. 2015;150:941-948.

17. Lobastov K, Barinov V, Schastlivtsev I, et al. Validation of the Caprini risk assessment model for venous thromboembolism in high-risk surgical patients in the background of standard prophylaxis. J Vasc Surg Venous Lymphat Disord. 2016;4:153-160.

18. Kakkar AK, Cohen AT, Tapson VF, et al. Venous thromboembolism risk and prophylaxis in the acute care hospital setting (ENDORSE survey): findings in surgical patients. Ann Surg. 2010;251:330-338.

19. Smythe MA, Priziola J, Dobesh PP, et al. Guidance for the practical management of the heparin anticoagulants in the treatment of venous thromboembolism. J Thromb Thrombolysis. 2016;41:165-186.

Article PDF
Issue
Journal of Clinical Outcomes Management - 26(3)
Publications
Topics
Page Number
117-124
Sections
Article PDF
Article PDF

From Tufts Medical Center, Boston, MA.

Abstract

  • Objective: Audits at our academic medical center revealed near 100% compliance with protocols for perioperative venous thromboembolism (VTE) prophylaxis, but recent National Surgical Quality Improvement Program data demonstrated a higher than expected incidence of VTE (observed/expected = 1.32). The objective of this study was to identify potential causes of this discrepancy.
  • Design: Retrospective case-control study.
  • Setting: Urban academic medical center with high case-mix indices (Medicare approximately 2.4, non-Medicare approximately 2.0).
  • Participants: 102 surgical inpatients with VTE (September 2012 to October 2015) matched with controls for age, gender, and type of procedure.
  • Measurements: Prevalence of common VTE risk factors, length of stay, number of procedures, index operation times, and postoperative bed rest > 12 hours were assessed. Utilization of and compliance with our VTE risk assessment tool was also investigated.
  • Results: Cases underwent more procedures and had longer lengths of stay and index procedures than controls. In addition, cases were more likely to have had > 12 hours of postoperative bed rest and central venous access than controls. Cases had more infections and were more likely to have severe lung disease, thrombophilia, and a history of prior VTE than controls. No differences in body mass index, tobacco use, current or previous malignancy, or VTE risk assessment form use were observed. Overall, care complexity and risk factors were equally important in determining VTE incidence. Our analyses also revealed lack of strict adherence to our VTE risk stratification protocol and frequent use of suboptimal prophylactic regimens.
  • Conclusion: Well-accepted risk factors and overall care complexity determine VTE risk. Preventing VTE in high-risk patients requires assiduous attention to detail in VTE risk assessment and in delivery of optimal prophylaxis. Patients at especially high risk may require customized prophylactic regimens.

Keywords: hospital-acquired venous thromboembolic disease; VTE prophylaxis, surgical patients.

Deep vein thrombosis (DVT) and pulmonary embolism (PE) are well-recognized causes of morbidity and mortality in surgical patients. Between 350,000 and 600,000 cases of venous thromboembolism (VTE) occur each year in the United States, and it is responsible for approximately 10% of preventable in-hospital fatalities.1-3 Given VTE’s impact on patients and the healthcare system and the fact that it is preventable, intense effort has been focused on developing more effective prophylactic measures to decrease its incidence.2-4 In 2008, the surgeon general issued a “call to action” for increased efforts to prevent VTE.5

The American College of Chest Physicians (ACCP) guidelines subcategorize patients based on type of surgery. In addition, the ACCP guidelines support the use of a Caprini-based scoring system to aid in risk stratification and improve clinical decision-making (Table 1).4,6-9 In general, scores ≥ 5 qualify individuals as high risk. Based on their risk category, patients receive mechanical prophylaxis, chemical prophylaxis, or a combination of the 2. Lower-risk patients who are ambulatory typically receive only mechanical prophylaxis while in bed, whereas higher-risk patients receive a combination of mechanical prophylaxis and chemoprophylaxis measures.7 In general, low-molecular-weight heparin (40 mg daily) and low-dose unfractionated heparin (5000 units 3 times daily) have been the standard evidence-based options for chemoprophylaxis in surgical patients. Absolute contraindications for prophylaxis include active bleeding and known increased risk of bleeding based on patient- or procedure-specific factors.

Caprini Risk Assessment Model

Our hospital, a 350-bed academic medical center in downtown Boston, MA, serving a diverse population with a very high case-mix index (2.4 Medicare and 2.0 non-Medicare), has strict protocols for VTE prophylaxis consistent with the ACCP guidelines and based on the Surgical Care Improvement Project (SCIP) measures published in 2006.10 The SCIP mandates allow for considerable surgeon discretion in the use of chemoprophylaxis for neurosurgical cases and general and orthopedic surgery cases deemed to be at high risk for bleeding. In addition, SCIP requires only that prophylaxis be initiated within 24 hours of surgical end time. Although recent audits revealed nearly 100% compliance with SCIP-mandated protocols, National Surgical Quality Improvement Program (NSQIP) data showed that the incidence of VTE events at our institution was higher than expected (observed/expected [O/E] = 1.32).

In order to determine the reasons for this mismatch between process and outcome performance, we investigated whether there were characteristics of our patient population that contributed to the higher than expected rates of VTE, and we scrutinized our VTE prophylaxis protocol to determine if there were aspects of our process that were also contributory.

Methods

Study Sample

This is a retrospective case-control study of surgical inpatients at our hospital during the period September 2012 to October 2015. Cases were identified as patients diagnosed with a VTE (DVT or PE). Controls were identified from a pool of surgical patients whose courses were not complicated by VTE during the same time frame as the cases and who were matched as closely as possible by procedure code, age, and gender.

 

 

Variables

Patient and hospital course variables that were analyzed included demographics, comorbidities, length of stay, number of procedures, index operation times, duration of postoperative bed rest, use of mechanical prophylaxis, and type of chemoprophylaxis and time frame within which it was initiated. Data were collected via chart review using International Classification of Diseases-9 and -10 codes to identify surgical cases within the allotted time period who were diagnosed with VTE. Demographic variables included age, sex, and ethnicity. Comorbidities included hypertension, diabetes, coronary artery disease, serious lung disease, previous or current malignancy, documented hypercoagulable state, and previous history of VTE. Body mass index (BMI) was also recorded. The aforementioned disease-specific variables were not matched between the case and control groups, as this data was obtained retrospectively during data collection.

Analysis

Associations between case and matched control were analyzed using the paired t-test for continuous variables and McNemar’s test for categorical variables. P values < 0.05 were considered statistically significant. SAS Enterprise Guide 7.15 (Cary, NC) was used for all statistical analyses.

The requirement for informed consent was waived by our Institutional Review Board, as the study was initially deemed to be a quality improvement project, and all data used for this report were de-identified.

Results

Our retrospective case-control analysis included a sample of 102 surgical patients whose courses were complicated by VTE between September 2012 and October 2015. The cases were distributed among 6 different surgical categories (Figure 1): trauma (20%), cancer (10%), cardiovascular (21%), noncancer neurosurgery (28%), elective orthopedics (11%), and miscellaneous general surgery (10%).

Distribution of procedure type.

Comparisons between cases and controls in terms of patient demographics and risk factors are shown in Table 2. No statistically significant difference was observed in ethnicity or race between the 2 groups. Overall, cases had more hip/pelvis/leg fractures at presentation (P = 0.0008). The case group also had higher proportions of patients with postoperative bed rest greater than 12 hours (P = 0.009), central venous access (P < 0.0001), infection (P < 0.0001), and lower extremity edema documented during the hospitalization prior to development of DVT (P < 0.0001). Additionally, cases had significantly greater rates of previous VTE (P = 0.0004), inherited or acquired thrombophilia (P = 0.03), history of stroke (P = 0.0003), and severe lung disease, including pneumonia (P = 0.0008). No significant differences were noted between cases and matched controls in BMI (P = 0.43), current tobacco use (P = 0.71), current malignancy (P = 0.80), previous malignancy (P = 0.83), head trauma (P = 0.17), or acute cardiac disease (myocardial infarction or congestive heart failure; P = 0.12).

Patient Demographics and Risk Factors

Variables felt to indicate overall complexity of hospital course for cases as compared to controls are outlined in Table 3. Cases were found to have significantly longer lengths of stay (median, 15.5 days versus 3 days, P < 0.0001). To account for the possibility that the development of VTE contributed to the increased length of stay in the cases, we also looked at the duration between admission date and the date of VTE diagnosis and determined that cases still had a longer length of stay when this was accounted for (median, 7 days versus 3 days, P < 0.0001). A much higher proportion of cases underwent more than 1 procedure compared to controls (P < 0.0001), and cases had significantly longer index operations as compared to controls (P = 0.002).

Complexity of Care

 

 

Seventeen cases received heparin on induction during their index procedure, compared to 23 controls (P = 0.24). Additionally, 63 cases began a prophylaxis regimen within 24 hours of surgery end time, compared to 68 controls (P = 0.24). The chemoprophylactic regimens utilized in cases and in controls are summarized in Figure 2. Of note, only 26 cases and 32 controls received standard prophylactic regimens with no missed doses (heparin 5000 units 3 times daily or enoxaparin 40 mg daily). Additionally, in over half of cases and a third of controls, nonstandard regimens were ordered. Examples of nonstandard regimens included nonstandard heparin or enoxaparin doses, low-dose warfarin, or aspirin alone. In most cases, nonstandard regimens were justified on the basis of high risk for bleeding.

Frequencies of prophylactic regimens utilized.

Mechanical prophylaxis with pneumatic sequential compression devices (SCDs) was ordered in 93 (91%) cases and 87 (85%) controls; however, we were unable to accurately document uniform compliance in the use of these devices.

With regard to evaluation of our process measures, we found only 17% of cases and controls combined actually had a VTE risk assessment in their chart, and when it was present, it was often incomplete or was completed inaccurately.

 

Discussion

The goal of this study was to identify factors (patient characteristics and/or processes of care) that may be contributing to the higher than expected incidence of VTE events at our medical center, despite internal audits suggesting near perfect compliance with SCIP-mandated protocols. We found that in addition to usual risk factors for VTE, an overarching theme of our case cohort was their high complexity of illness. At baseline, these patients had significantly greater rates of stroke, thrombophilia, severe lung disease, infection, and history of VTE than controls. Moreover, the hospital courses of cases were significantly more complex than those of controls, as these patients had more procedures, longer lengths of stay and longer index operations, higher rates of postoperative bed rest exceeding 12 hours, and more prevalent central venous access than controls (Table 2). Several of these risk factors have been found to contribute to VTE development despite compliance with prophylaxis protocols.

Cassidy et al reviewed a cohort of nontrauma general surgery patients who developed VTE despite receiving appropriate prophylaxis and found that both multiple operations and emergency procedures contributed to the failure of VTE prophylaxis.11 Similarly, Wang et al identified several independent risk factors for VTE despite thromboprophylaxis, including central venous access and infection, as well as intensive care unit admission, hospitalization for cranial surgery, and admission from a long-term care facility.12 While our study did not capture some of these additional factors considered by Wang et al, the presence of risk factors not captured in traditional assessment tools suggests that additional consideration for complex patients is warranted.

 

 

In addition to these nonmodifiable patient characteristics, aspects of our VTE prophylaxis processes likely contributed to the higher than expected rate of VTE. While the electronic medical record at our institution does contain a VTE risk assessment tool based on the Caprini score, we found it often is not used at all or is used incorrectly/incompletely, which likely reflects the fact that physicians are neither prompted nor required to complete the assessment prior to prescribing VTE prophylaxis.

There is a significant body of evidence demonstrating that mandatory computerized VTE risk assessments can effectively reduce VTE rates and that improved outcomes occur shortly after implementation. Cassidy et al demonstrated the benefits of instituting a hospital-wide, mandatory, Caprini-based computerized VTE risk assessment that provides prophylaxis/early ambulation recommendations. Two years after implementing this system, they observed an 84% reduction in DVTs (P < 0.001) and a 55% reduction in PEs (P < 0.001).13 Nimeri et al had similarly impressive success, achieving a reduction in their NSQIP O/E for PE/DVT in general surgery from 6.00 in 2010 to 0.82 (for DVTs) and 0.78 (for PEs) 5 years after implementation of mandatory VTE risk assessment (though they noted that the most dramatic reduction occurred 1 year after implementation).14 Additionally, a recent systematic review and meta-analysis by Borab et al found computerized VTE risk assessments to be associated with a significant decrease in VTE events.15

The risk assessment tool used at our institution is qualitative in nature, and current literature suggests that employing a more quantitative tool may yield improved outcomes. Numerous studies have highlighted the importance of identifying patients at very high risk for VTE, as higher risk may necessitate more careful consideration of their prophylactic regimens. Obi et al found patients with Caprini scores higher than 8 to be at significantly greater risk of developing VTE compared to patients with scores of 7 or 8. Also, patients with scores of 7 or 8 were significantly more likely to have a VTE compared to those with scores of 5 or 6.16 In another study, Lobastov et al identified Caprini scores of 11 or higher as representing an extremely high-risk category for which standard prophylaxis regimens may not be effective.17 Thus, while having mandatory risk assessment has been shown to dramatically decrease VTE incidence, it is important to consider the magnitude of the numerical risk score. This is of particular importance at medical centers with high case-mix indices where patients at the highest risk might need to be managed with different prophylactic guidelines.

Another notable aspect of the process at our hospital was the great variation in the types of prophylactic regimens ordered, and the adherence to what was ordered. Only 25.5% of patients were maintained on a standard prophylactic regimen with no missed doses (heparin 5000 every 8 hours or enoxaparin 40 mg daily). Thus, the vast majority of the patients who went on to develop VTE either were prescribed a nontraditional prophylaxis regimen or missed doses of standard agents. The need for secondary surgical procedures or other invasive interventions may explain many, but not all, of the missed doses.

The timing of prophylaxis initiation for our patients was also found to deviate from accepted standards. Only 16.8% of cases received prophylaxis upon induction of anesthesia, and furthermore, 38% of cases did not receive any anticoagulation within 24 hours of their index operation. While this variability in prophylaxis implementation was acceptable within the SCIP guidelines based on “high risk for bleeding” or other considerations, it likely contributed to our suboptimal outcomes. The variations and interruptions in prophylactic regimens speak to barriers that have previously been reported as contributing factors to noncompliance with VTE prophylaxis.18

 

 

Given these known barriers and the observed underutilization and improper use of our risk assessment tool, we have recently changed our surgical admission order sets such that a mandatory quantitative risk assessment must be done for every surgical patient at the time of admission/operation before other orders can be completed. Following completion of the assessment, the physician will be presented with an appropriate standard regimen based on the individual patient’s risk assessment. Early results of our VTE quality improvement project have been satisfying: in the most recent NSQIP semi-annual report, our O/E for VTE was 0.74, placing us in the first decile. Some of these early reports may simply be the product of the Hawthorne effect; however, we are encouraged by the early improvements seen in other research. While we are hopeful that these changes will result in sustainable improvements in outcomes, patients at extremely high risk may require novel weight-based or otherwise customized aggressive prophylactic regimens. Such regimens have already been proposed for arthroplasty and other high-risk patients.

Future research may identify other risk factors not captured by traditional risk assessments. In addition, research should continue to explore the use and efficacy of standard prophylactic regimens in these populations to help determine if they are sufficient. Currently, weight-based low-molecular-weight heparin dosing and alternative regimens employing fondaparinux are under investigation for very-high-risk patients.19

There were several limitations to the present study. First, due to the retrospective design of our study, we could collect only data that had been uniformly recorded in the charts throughout the study period. Second, we were unable to accurately assess compliance with mechanical prophylaxis. While our chart review showed that the vast majority of cases and controls were ordered to have mechanical prophylaxis, it is impossible to document how often these devices were used appropriately in a retrospective analysis. Anecdotal observation suggests that once patients are out of post-anesthesia or critical care units, SCD use is not standardized. The inability to measure compliance precisely may be leading to an overestimation of our compliance with prophylaxis. Finally, because our study included only patients who underwent surgery at our hospital, our observations may not be generalizable outside our institution.

 

Conclusion

Our study findings reinforce the importance of attention to detail in VTE risk assessment and in ordering and administering VTE prophylactic regimens, especially in high-risk surgical patients. While we adhered to the SCIP-mandated prophylaxis requirements, the complexity of our patients and our lack of a truly standardized approach to risk assessment and prophylactic regimens resulted in suboptimal outcomes. Stricter and more quantitative mandatory VTE risk assessment, along with highly standardized VTE prophylaxis regimens, are required to achieve optimal outcomes.

Corresponding author: Jason C. DeGiovanni, MS, BA, Jason.DeGiovanni@tufts.edu.

Financial disclosures: None.

From Tufts Medical Center, Boston, MA.

Abstract

  • Objective: Audits at our academic medical center revealed near 100% compliance with protocols for perioperative venous thromboembolism (VTE) prophylaxis, but recent National Surgical Quality Improvement Program data demonstrated a higher than expected incidence of VTE (observed/expected = 1.32). The objective of this study was to identify potential causes of this discrepancy.
  • Design: Retrospective case-control study.
  • Setting: Urban academic medical center with high case-mix indices (Medicare approximately 2.4, non-Medicare approximately 2.0).
  • Participants: 102 surgical inpatients with VTE (September 2012 to October 2015) matched with controls for age, gender, and type of procedure.
  • Measurements: Prevalence of common VTE risk factors, length of stay, number of procedures, index operation times, and postoperative bed rest > 12 hours were assessed. Utilization of and compliance with our VTE risk assessment tool was also investigated.
  • Results: Cases underwent more procedures and had longer lengths of stay and index procedures than controls. In addition, cases were more likely to have had > 12 hours of postoperative bed rest and central venous access than controls. Cases had more infections and were more likely to have severe lung disease, thrombophilia, and a history of prior VTE than controls. No differences in body mass index, tobacco use, current or previous malignancy, or VTE risk assessment form use were observed. Overall, care complexity and risk factors were equally important in determining VTE incidence. Our analyses also revealed lack of strict adherence to our VTE risk stratification protocol and frequent use of suboptimal prophylactic regimens.
  • Conclusion: Well-accepted risk factors and overall care complexity determine VTE risk. Preventing VTE in high-risk patients requires assiduous attention to detail in VTE risk assessment and in delivery of optimal prophylaxis. Patients at especially high risk may require customized prophylactic regimens.

Keywords: hospital-acquired venous thromboembolic disease; VTE prophylaxis, surgical patients.

Deep vein thrombosis (DVT) and pulmonary embolism (PE) are well-recognized causes of morbidity and mortality in surgical patients. Between 350,000 and 600,000 cases of venous thromboembolism (VTE) occur each year in the United States, and it is responsible for approximately 10% of preventable in-hospital fatalities.1-3 Given VTE’s impact on patients and the healthcare system and the fact that it is preventable, intense effort has been focused on developing more effective prophylactic measures to decrease its incidence.2-4 In 2008, the surgeon general issued a “call to action” for increased efforts to prevent VTE.5

The American College of Chest Physicians (ACCP) guidelines subcategorize patients based on type of surgery. In addition, the ACCP guidelines support the use of a Caprini-based scoring system to aid in risk stratification and improve clinical decision-making (Table 1).4,6-9 In general, scores ≥ 5 qualify individuals as high risk. Based on their risk category, patients receive mechanical prophylaxis, chemical prophylaxis, or a combination of the 2. Lower-risk patients who are ambulatory typically receive only mechanical prophylaxis while in bed, whereas higher-risk patients receive a combination of mechanical prophylaxis and chemoprophylaxis measures.7 In general, low-molecular-weight heparin (40 mg daily) and low-dose unfractionated heparin (5000 units 3 times daily) have been the standard evidence-based options for chemoprophylaxis in surgical patients. Absolute contraindications for prophylaxis include active bleeding and known increased risk of bleeding based on patient- or procedure-specific factors.

Caprini Risk Assessment Model

Our hospital, a 350-bed academic medical center in downtown Boston, MA, serving a diverse population with a very high case-mix index (2.4 Medicare and 2.0 non-Medicare), has strict protocols for VTE prophylaxis consistent with the ACCP guidelines and based on the Surgical Care Improvement Project (SCIP) measures published in 2006.10 The SCIP mandates allow for considerable surgeon discretion in the use of chemoprophylaxis for neurosurgical cases and general and orthopedic surgery cases deemed to be at high risk for bleeding. In addition, SCIP requires only that prophylaxis be initiated within 24 hours of surgical end time. Although recent audits revealed nearly 100% compliance with SCIP-mandated protocols, National Surgical Quality Improvement Program (NSQIP) data showed that the incidence of VTE events at our institution was higher than expected (observed/expected [O/E] = 1.32).

In order to determine the reasons for this mismatch between process and outcome performance, we investigated whether there were characteristics of our patient population that contributed to the higher than expected rates of VTE, and we scrutinized our VTE prophylaxis protocol to determine if there were aspects of our process that were also contributory.

Methods

Study Sample

This is a retrospective case-control study of surgical inpatients at our hospital during the period September 2012 to October 2015. Cases were identified as patients diagnosed with a VTE (DVT or PE). Controls were identified from a pool of surgical patients whose courses were not complicated by VTE during the same time frame as the cases and who were matched as closely as possible by procedure code, age, and gender.

Variables

Patient and hospital course variables that were analyzed included demographics, comorbidities, length of stay, number of procedures, index operation times, duration of postoperative bed rest, use of mechanical prophylaxis, and type of chemoprophylaxis and the time frame within which it was initiated. Data were collected via chart review using International Classification of Diseases-9 and -10 codes to identify surgical patients diagnosed with VTE within the study period. Demographic variables included age, sex, and ethnicity. Comorbidities included hypertension, diabetes, coronary artery disease, serious lung disease, previous or current malignancy, documented hypercoagulable state, and previous history of VTE. Body mass index (BMI) was also recorded. The aforementioned disease-specific variables were not matched between the case and control groups, as these data were obtained retrospectively during data collection.

Analysis

Associations between cases and matched controls were analyzed using the paired t-test for continuous variables and McNemar’s test for categorical variables. P values < 0.05 were considered statistically significant. SAS Enterprise Guide 7.15 (SAS Institute, Cary, NC) was used for all statistical analyses.
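
As a sketch of these paired analyses (the toy data below are invented for illustration and are not the study data), the same two tests could be run in Python as follows:

```python
# Sketch of the paired analyses described above, on hypothetical toy data.
# Continuous variables: paired t-test; categorical variables: McNemar's test.
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired continuous measurements (e.g., BMI) for 8 case-control pairs.
cases_bmi = [31.0, 27.5, 29.2, 35.1, 24.8, 30.3, 28.7, 33.0]
controls_bmi = [29.4, 26.9, 30.1, 33.8, 25.2, 28.9, 27.5, 31.6]
t_stat, p_value = ttest_rel(cases_bmi, controls_bmi)
print(f"paired t-test: t = {t_stat:.2f}, P = {p_value:.3f}")

# Hypothetical 2x2 table of concordant/discordant pairs for a binary factor
# (rows: case exposed yes/no; columns: matched control exposed yes/no).
table = [[12, 25],
         [ 6, 59]]
result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
print(f"McNemar: P = {result.pvalue:.4f}")
```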

The requirement for informed consent was waived by our Institutional Review Board, as the study was initially deemed to be a quality improvement project, and all data used for this report were de-identified.

Results

Our retrospective case-control analysis included a sample of 102 surgical patients whose courses were complicated by VTE between September 2012 and October 2015. The cases were distributed among 6 different surgical categories (Figure 1): trauma (20%), cancer (10%), cardiovascular (21%), noncancer neurosurgery (28%), elective orthopedics (11%), and miscellaneous general surgery (10%).

Distribution of procedure type.

Comparisons between cases and controls in terms of patient demographics and risk factors are shown in Table 2. No statistically significant difference was observed in ethnicity or race between the 2 groups. Overall, cases had more hip/pelvis/leg fractures at presentation (P = 0.0008). The case group also had higher proportions of patients with postoperative bed rest greater than 12 hours (P = 0.009), central venous access (P < 0.0001), infection (P < 0.0001), and lower extremity edema documented during the hospitalization prior to development of DVT (P < 0.0001). Additionally, cases had significantly greater rates of previous VTE (P = 0.0004), inherited or acquired thrombophilia (P = 0.03), history of stroke (P = 0.0003), and severe lung disease, including pneumonia (P = 0.0008). No significant differences were noted between cases and matched controls in BMI (P = 0.43), current tobacco use (P = 0.71), current malignancy (P = 0.80), previous malignancy (P = 0.83), head trauma (P = 0.17), or acute cardiac disease (myocardial infarction or congestive heart failure; P = 0.12).

Patient Demographics and Risk Factors

Variables felt to indicate overall complexity of hospital course for cases as compared to controls are outlined in Table 3. Cases were found to have significantly longer lengths of stay (median, 15.5 days versus 3 days, P < 0.0001). To account for the possibility that the development of VTE contributed to the increased length of stay in the cases, we also looked at the duration between admission date and the date of VTE diagnosis and determined that cases still had a longer length of stay when this was accounted for (median, 7 days versus 3 days, P < 0.0001). A much higher proportion of cases underwent more than 1 procedure compared to controls (P < 0.0001), and cases had significantly longer index operations as compared to controls (P = 0.002).

Complexity of Care

Seventeen cases received heparin on induction during their index procedure, compared to 23 controls (P = 0.24). Additionally, 63 cases began a prophylaxis regimen within 24 hours of surgery end time, compared to 68 controls (P = 0.24). The chemoprophylactic regimens utilized in cases and in controls are summarized in Figure 2. Of note, only 26 cases and 32 controls received standard prophylactic regimens with no missed doses (heparin 5000 units 3 times daily or enoxaparin 40 mg daily). Additionally, in over half of cases and a third of controls, nonstandard regimens were ordered. Examples of nonstandard regimens included nonstandard heparin or enoxaparin doses, low-dose warfarin, or aspirin alone. In most cases, nonstandard regimens were justified on the basis of high risk for bleeding.

Frequencies of prophylactic regimens utilized.

Mechanical prophylaxis with pneumatic sequential compression devices (SCDs) was ordered in 93 (91%) cases and 87 (85%) controls; however, we were unable to accurately document uniform compliance in the use of these devices.

With regard to evaluation of our process measures, we found that only 17% of cases and controls combined actually had a VTE risk assessment in their chart, and when one was present, it was often incomplete or inaccurate.

Discussion

The goal of this study was to identify factors (patient characteristics and/or processes of care) that may be contributing to the higher than expected incidence of VTE events at our medical center, despite internal audits suggesting near perfect compliance with SCIP-mandated protocols. We found that, in addition to the usual risk factors for VTE, an overarching theme of our case cohort was high complexity of illness. At baseline, these patients had significantly greater rates of stroke, thrombophilia, severe lung disease, infection, and history of VTE than controls. Moreover, the hospital courses of cases were significantly more complex than those of controls, as these patients had more procedures, longer lengths of stay and longer index operations, higher rates of postoperative bed rest exceeding 12 hours, and more prevalent central venous access than controls (Tables 2 and 3). Several of these risk factors have been found to contribute to VTE development despite compliance with prophylaxis protocols.

Cassidy et al reviewed a cohort of nontrauma general surgery patients who developed VTE despite receiving appropriate prophylaxis and found that both multiple operations and emergency procedures contributed to the failure of VTE prophylaxis.11 Similarly, Wang et al identified several independent risk factors for VTE despite thromboprophylaxis, including central venous access and infection, as well as intensive care unit admission, hospitalization for cranial surgery, and admission from a long-term care facility.12 While our study did not capture some of these additional factors considered by Wang et al, the presence of risk factors not captured in traditional assessment tools suggests that additional consideration for complex patients is warranted.

In addition to these nonmodifiable patient characteristics, aspects of our VTE prophylaxis processes likely contributed to the higher than expected rate of VTE. While the electronic medical record at our institution does contain a VTE risk assessment tool based on the Caprini score, we found that it often is not used at all, or is used incorrectly or incompletely, likely because physicians are neither prompted nor required to complete the assessment before prescribing VTE prophylaxis.

There is a significant body of evidence demonstrating that mandatory computerized VTE risk assessments can effectively reduce VTE rates and that improved outcomes occur shortly after implementation. Cassidy et al demonstrated the benefits of instituting a hospital-wide, mandatory, Caprini-based computerized VTE risk assessment that provides prophylaxis/early ambulation recommendations. Two years after implementing this system, they observed an 84% reduction in DVTs (P < 0.001) and a 55% reduction in PEs (P < 0.001).13 Nimeri et al had similarly impressive success, achieving a reduction in their NSQIP O/E for PE/DVT in general surgery from 6.00 in 2010 to 0.82 (for DVTs) and 0.78 (for PEs) 5 years after implementation of mandatory VTE risk assessment (though they noted that the most dramatic reduction occurred 1 year after implementation).14 Additionally, a recent systematic review and meta-analysis by Borab et al found computerized VTE risk assessments to be associated with a significant decrease in VTE events.15

The risk assessment tool used at our institution is qualitative in nature, and current literature suggests that employing a more quantitative tool may yield improved outcomes. Numerous studies have highlighted the importance of identifying patients at very high risk for VTE, as higher risk may necessitate more careful consideration of their prophylactic regimens. Obi et al found patients with Caprini scores higher than 8 to be at significantly greater risk of developing VTE compared to patients with scores of 7 or 8. Also, patients with scores of 7 or 8 were significantly more likely to have a VTE compared to those with scores of 5 or 6.16 In another study, Lobastov et al identified Caprini scores of 11 or higher as representing an extremely high-risk category for which standard prophylaxis regimens may not be effective.17 Thus, while having mandatory risk assessment has been shown to dramatically decrease VTE incidence, it is important to consider the magnitude of the numerical risk score. This is of particular importance at medical centers with high case-mix indices where patients at the highest risk might need to be managed with different prophylactic guidelines.

Another notable aspect of the process at our hospital was the great variation in the types of prophylactic regimens ordered and in adherence to what was ordered. Only 25.5% of patients were maintained on a standard prophylactic regimen with no missed doses (heparin 5000 units every 8 hours or enoxaparin 40 mg daily). Thus, the vast majority of the patients who went on to develop VTE either were prescribed a nontraditional prophylaxis regimen or missed doses of standard agents. The need for secondary surgical procedures or other invasive interventions may explain many, but not all, of the missed doses.

The timing of prophylaxis initiation for our patients was also found to deviate from accepted standards. Only 16.8% of cases received prophylaxis on induction of anesthesia, and 38% of cases did not receive any anticoagulation within 24 hours of their index operation. While this variability in prophylaxis implementation was acceptable within the SCIP guidelines based on “high risk for bleeding” or other considerations, it likely contributed to our suboptimal outcomes. The variations and interruptions in prophylactic regimens speak to barriers that have previously been reported as contributing factors to noncompliance with VTE prophylaxis.18

Given these known barriers and the observed underutilization and improper use of our risk assessment tool, we recently changed our surgical admission order sets so that a mandatory quantitative risk assessment must be completed for every surgical patient at the time of admission/operation before other orders can be entered. Following completion of the assessment, the physician is presented with an appropriate standard regimen based on the individual patient’s risk assessment. Early results of our VTE quality improvement project have been satisfying: in the most recent NSQIP semi-annual report, our O/E for VTE was 0.74, placing us in the first decile. Some of these early results may simply reflect the Hawthorne effect; however, we are encouraged by the early and sustained improvements reported in other research. While we are hopeful that these changes will result in sustainable improvements in outcomes, patients at extremely high risk may require novel weight-based or otherwise customized aggressive prophylactic regimens. Such regimens have already been proposed for arthroplasty and other high-risk patients.

Future research may identify other risk factors not captured by traditional risk assessments. In addition, research should continue to explore the use and efficacy of standard prophylactic regimens in these populations to help determine if they are sufficient. Currently, weight-based low-molecular-weight heparin dosing and alternative regimens employing fondaparinux are under investigation for very-high-risk patients.19

There were several limitations to the present study. First, due to the retrospective design of our study, we could collect only data that had been uniformly recorded in the charts throughout the study period. Second, we were unable to accurately assess compliance with mechanical prophylaxis. While our chart review showed that the vast majority of cases and controls were ordered to have mechanical prophylaxis, it is impossible to document how often these devices were used appropriately in a retrospective analysis. Anecdotal observation suggests that once patients are out of post-anesthesia or critical care units, SCD use is not standardized. The inability to measure compliance precisely may be leading to an overestimation of our compliance with prophylaxis. Finally, because our study included only patients who underwent surgery at our hospital, our observations may not be generalizable outside our institution.

Conclusion

Our study findings reinforce the importance of attention to detail in VTE risk assessment and in ordering and administering VTE prophylactic regimens, especially in high-risk surgical patients. While we adhered to the SCIP-mandated prophylaxis requirements, the complexity of our patients and our lack of a truly standardized approach to risk assessment and prophylactic regimens resulted in suboptimal outcomes. Stricter and more quantitative mandatory VTE risk assessment, along with highly standardized VTE prophylaxis regimens, is required to achieve optimal outcomes.

Corresponding author: Jason C. DeGiovanni, MS, BA, Jason.DeGiovanni@tufts.edu.

Financial disclosures: None.

References

1. Spyropoulos AC, Hussein M, Lin J, et al. Rates of symptomatic venous thromboembolism in US surgical patients: a retrospective administrative database study. J Thromb Thrombolysis. 2009;28:458-464.

2. Deitelzweig SB, Johnson BH, Lin J, et al. Prevalence of clinical venous thromboembolism in the USA: current trends and future projections. Am J Hematol. 2011;86:217-220.

3. Horlander KT, Mannino DM, Leeper KV. Pulmonary embolism mortality in the United States, 1979-1998: an analysis using multiple-cause mortality data. Arch Intern Med. 2003;163:1711-1717.

4. Guyatt GH, Akl EA, Crowther M, et al. Introduction to the ninth edition: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(suppl):48S-52S.

5. Office of the Surgeon General; National Heart, Lung, and Blood Institute. The Surgeon General’s Call to Action to Prevent Deep Vein Thrombosis and Pulmonary Embolism. Rockville, MD: Office of the Surgeon General; 2008. www.ncbi.nlm.nih.gov/books/NBK44178/. Accessed May 2, 2019.

6. Pannucci CJ, Swistun L, MacDonald JK, et al. Individualized venous thromboembolism risk stratification using the 2005 Caprini score to identify the benefits and harms of chemoprophylaxis in surgical patients: a meta-analysis. Ann Surg. 2017;265:1094-1102.

7. Caprini JA, Arcelus JI, Hasty JH, et al. Clinical assessment of venous thromboembolic risk in surgical patients. Semin Thromb Hemost. 1991;17(suppl 3):304-312.

8. Caprini JA. Risk assessment as a guide for the prevention of the many faces of venous thromboembolism. Am J Surg. 2010;199:S3-S10.

9. Gould MK, Garcia DA, Wren SM, et al. Prevention of VTE in nonorthopedic surgical patients: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2 Suppl):e227S-e277S.

10. The Joint Commission. Surgical Care Improvement Project (SCIP) Measure Information Form (Version 2.1c). www.jointcommission.org/surgical_care_improvement_project_scip_measure_information_form_version_21c/. Accessed June 22, 2016.

11. Cassidy MR, Macht RD, Rosenkranz P, et al. Patterns of failure of a standardized perioperative venous thromboembolism prophylaxis protocol. J Am Coll Surg. 2016;222:1074-1081.

12. Wang TF, Wong CA, Milligan PE, et al. Risk factors for inpatient venous thromboembolism despite thromboprophylaxis. Thromb Res. 2014;133:25-29.

13. Cassidy MR, Rosenkranz P, McAneny D. Reducing postoperative venous thromboembolism complications with a standardized risk-stratified prophylaxis protocol and mobilization program. J Am Coll Surg. 2014;218:1095-1104.

14. Nimeri AA, Gamaleldin MM, McKenna KL, et al. Reduction of venous thromboembolism in surgical patients using a mandatory risk-scoring system: 5-year follow-up of an American College of Surgeons National Quality Improvement Program. Clin Appl Thromb Hemost. 2017;23:392-396.

15. Borab ZM, Lanni MA, Tecce MG, et al. Use of computerized clinical decision support systems to prevent venous thromboembolism in surgical patients: a systematic review and meta-analysis. JAMA Surg. 2017;152:638–645.

16. Obi AT, Pannucci CJ, Nackashi A, et al. Validation of the Caprini venous thromboembolism risk assessment model in critically ill surgical patients. JAMA Surg. 2015;150:941-948.

17. Lobastov K, Barinov V, Schastlivtsev I, et al. Validation of the Caprini risk assessment model for venous thromboembolism in high-risk surgical patients in the background of standard prophylaxis. J Vasc Surg Venous Lymphat Disord. 2016;4:153-160.

18. Kakkar AK, Cohen AT, Tapson VF, et al. Venous thromboembolism risk and prophylaxis in the acute care hospital setting (ENDORSE survey): findings in surgical patients. Ann Surg. 2010;251:330-338.

19. Smythe MA, Priziola J, Dobesh PP, et al. Guidance for the practical management of the heparin anticoagulants in the treatment of venous thromboembolism. J Thromb Thrombolysis. 2016;41:165-186.

Structured Approach to Venous Access Associated with Zero Risk of Pneumothorax During Cardiac Device Implant Procedures

Iatrogenic pneumothorax, an acute serious complication, is reported to occur in 0.1% to 2% of permanent transvenous cardiac device implant procedures.1,2 A National Cardiovascular Data Registry analysis of data between January 2006 and December 2008 found that pneumothorax incidence after a new defibrillator implant was 0.5%.1 Among 4355 Danish patients undergoing a new device implant, 0.9% experienced pneumothorax requiring drainage and 0.7% had pneumothorax treated conservatively.2 Studies have shown a higher risk of complications when procedures were performed at low-volume centers compared with the highest-volume quartile (odds ratio, 1.26; 95% confidence interval, 1.05-1.52) or when procedures were performed by low-volume operators.1

Methods. At 2 community hospitals, a project to reduce pneumothorax risk related to new device implants was implemented. This project consisted of obtaining a pre-procedure venogram (right anterior oblique [RAO] view, 12–18 degrees, 42 cm magnification), creating a subcutaneous pocket first and then obtaining axillary venous access with a 4Fr micro-puncture needle, and obtaining a post-procedure chest radiograph. During venous access, the needle was never advanced beyond the inner border of the first rib. This new process was fully implemented by January 2015. A chart review of all patients who underwent a new device implant between January 2015 and July 2017 at the 2 community medical centers was performed.

Results. Seventy patients received new implants during the review period (31 female, 39 male). The median age was 78 years (range, 34–94 years), median body mass index was 29.05 (range, 17.3–67.9), median procedural time was 70 minutes (range, 26–146 minutes), and median fluoroscopic time was 6.4 minutes (range, 0.5–35.7 minutes). A total of 131 independent venous accesses were obtained to implant 42 pacemakers and 28 defibrillators (10 single, 54 dual, and 6 biventricular devices). Of these accesses, 127 were axillary and the remainder were cephalic. There was no incidence of pneumothorax reported during these venous accesses.
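
As a side note not in the original report, a zero-event series still carries statistical uncertainty; the "rule of three" gives a quick upper bound on the true event rate, sketched below using the 131 accesses reported above.

```python
# Rule-of-three sketch: with 0 events in n independent trials, an
# approximate upper 95% confidence bound on the event rate is 3/n.
# This calculation is illustrative and not part of the original report.
n_accesses = 131
upper_bound = 3 / n_accesses
print(f"Upper 95% bound on pneumothorax rate ~ {upper_bound:.1%}")  # ~2.3%
```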

Discussion. A structured approach to venous access during device implants was associated with zero incidence of pneumothorax in a low-volume center where implants were performed by a low-volume trained operator. The venogram eliminates “blind attempts,” and the RAO view reduces the likelihood of going too posterior. Using caudal fluoroscopy and targeting the axillary vein, other groups have reported a 0% to 0.2% risk for acute pneumothorax in larger patient groups.3,4 Creating a subcutaneous pocket first allows the needle to be aligned more longitudinally along the course of the vein. The 4Fr needle increases the ratio of vein-to-needle surface area, reducing risk for pneumothorax.

Standardization of venous access can potentially reduce iatrogenic pneumothorax risk to a never event, similar to the approach used to prevent central line–associated bloodstream infections.5

Benjamin Carmel
Lake Erie College of Osteopathic Medicine
Bradenton, FL

Indiresha R. Iyer, MD
Case Western Reserve University
Cleveland, OH

Corresponding author: Indiresha R. Iyer, MD, Indiresha.iyer@uhhospitals.org.

Financial disclosures: None.

References

1. Freeman JV, Wang Y, Curtis JP, et al. The relation between hospital procedure volume and complications of cardioverter-defibrillator implantation from the implantable cardioverter-defibrillator registry. J Am Coll Cardiol. 2010;56:1133-1139.

2. Kirkfeldt RE, Johansen JB, Nohr EA, et al. Complications after cardiac implantable electronic device implantations: an analysis of a complete, nationwide cohort in Denmark. Eur Heart J. 2014;35:1186-1194.

3. Yang F, Kulbak GA. New trick to a routine procedure: taking the fear out of the axillary vein stick using the 35° caudal view. Europace. 2015;17:1157-1160.

4. Hettiarachchi EMS, Arsene C, Fares S, et al. Fluoroscopy-guided axillary vein puncture, a reliable method to prevent acute complications associated with pacemaker, defibrillator, and cardiac resynchronization therapy leads insertion. J Cardiovasc Dis Diagn. 2014;2:136.

5. Chu H, Cosgrove S, Sexton B, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725-2732.

Delayed Cardioversion Noninferior to Early Cardioversion in Recent-Onset Atrial Fibrillation

Study Overview

Objective. To assess whether immediate restoration of sinus rhythm is necessary in hemodynamically stable, recent onset (< 36 hr), symptomatic atrial fibrillation in the emergency department.

Design. Multicenter, randomized, open-label, noninferiority trial, RACE 7 ACWAS (Rate Control versus Electrical Cardioversion Trial 7–Acute Cardioversion versus Wait and See).

Setting and participants. 15 hospitals in the Netherlands, including 3 academic hospitals, 8 nonacademic teaching hospitals, and 4 nonteaching hospitals. Patients 18 years of age or older with recent-onset (< 36 hr), symptomatic atrial fibrillation without signs of myocardial ischemia or a history of persistent atrial fibrillation who presented to the emergency department were randomized in a 1:1 ratio to either a wait-and-see approach or early cardioversion. The wait-and-see approach consisted of the administration of rate-control medication, including intravenous or oral beta-adrenergic-receptor blocking agents, nondihydropyridine calcium-channel blockers, or digoxin, to achieve a heart rate of 110 beats per minute or less and symptomatic relief. Patients were then discharged with an outpatient visit scheduled for the next day and a referral for cardioversion as close as possible to 48 hours after the onset of symptoms. The early cardioversion group received pharmacologic cardioversion with flecainide, with electrical cardioversion performed when pharmacologic conversion was contraindicated or unsuccessful.

Main outcome measures. Primary outcome was the presence of sinus rhythm on electrocardiogram (ECG) recorded at the 4-week trial visit. Secondary endpoints included the duration of the index visit at the emergency department, emergency department visits related to atrial fibrillation, cardiovascular complications, and time until recurrence of atrial fibrillation.

Main results. From October 2014 through September 2018, 437 patients underwent randomization, with 218 patients assigned to the delayed cardioversion group and 219 to the early cardioversion group. Mean age was 65 years, and a majority of the patients (60%) were men (n = 261). The primary end point, the presence of sinus rhythm on the ECG recorded at the 4-week visit, was met in 193 of 212 patients (91%) in the delayed cardioversion group and in 202 of 215 patients (94%) in the early cardioversion group. The between-group difference of –2.9 percentage points (95% CI, –8.2 to 2.2; P = 0.005) met the criteria for noninferiority of the wait-and-see approach.
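
To illustrate how such a noninferiority claim is checked, the sketch below recomputes the risk difference from the counts reported above. The Wald confidence interval is a simplification of whatever method the trial actually used, and the –10 percentage-point margin is an assumed value for illustration.

```python
# Sketch of a noninferiority check on the primary endpoint, using the
# counts reported above. The Wald CI is a simplification, and the
# -10 percentage-point margin is an assumed value for illustration.
from math import sqrt

x1, n1 = 193, 212   # delayed cardioversion: sinus rhythm at 4 weeks
x2, n2 = 202, 215   # early cardioversion: sinus rhythm at 4 weeks
p1, p2 = x1 / n1, x2 / n2
diff = p1 - p2                      # about -2.9 percentage points
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lower, upper = diff - 1.96 * se, diff + 1.96 * se
margin = -0.10                      # assumed noninferiority margin
print(f"diff = {diff:.1%}, 95% CI ({lower:.1%}, {upper:.1%})")
print("noninferior" if lower > margin else "inconclusive")
```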

For secondary outcomes, the median duration of the index visit was 120 minutes (range, 60 to 253) in the delayed cardioversion group and 158 minutes (range, 110 to 228) in the early cardioversion group. The median difference between the 2 groups was 30 minutes (95% CI, 6 to 51 minutes). There was no significant difference in cardiovascular complications between the 2 groups. Fourteen of 212 patients (7%) in the delayed cardioversion group and 14 of 215 patients (7%) in the early cardioversion group had subsequent visits to the emergency department because of a recurrence of atrial fibrillation. Telemetric ECG recordings were available for 335 of the 437 patients. Recurrence of atrial fibrillation occurred in 49 of the 164 (30%) patients in the delayed cardioversion group and 50 of the 171 (29%) patients in the early cardioversion group.

In terms of treatment, conversion to sinus rhythm within 48 hours occurred spontaneously in 150 of 218 patients (69%) in the delayed cardioversion group after receiving rate-control medications only. Of the 218 patients, 61 (28%) had delayed cardioversion (9 by pharmacologic and 52 by electrical cardioversion) as per protocol and achieved sinus rhythm within 48 hours. In the early cardioversion group, conversion to sinus rhythm occurred spontaneously in 36 of 219 patients (16%) before the initiation of the cardioversion and in 171 of 219 (78%) after cardioversion (83 by pharmacologic and 88 by electrical).

Conclusion. For patients with recent-onset, symptomatic atrial fibrillation, allowing a short time for spontaneous conversion to sinus rhythm is reasonable as demonstrated by this noninferiority study.

Commentary

Atrial fibrillation accounts for nearly 0.5% of all emergency department visits, and this number is increasing.1,2 Patients commonly undergo immediate restoration of sinus rhythm by means of pharmacologic or electrical cardioversion. However, it is questionable whether immediate restoration of sinus rhythm is necessary, as spontaneous conversion to sinus rhythm occurs frequently. In addition, the safety of cardioversion between 12 and 48 hours after the onset of atrial fibrillation is questionable.3,4

In this pragmatic trial, the findings suggest that rate-control therapy alone can achieve prompt symptom relief in almost all eligible patients, carries a low risk of complications, and reduced the median length of stay in the emergency department to 2 hours. Independent of cardioversion strategy, the authors stressed the importance of managing stroke risk when patients present with atrial fibrillation to the emergency department. In this trial, 2 patients had cerebral embolism even though both were started on anticoagulation at the index visit. One patient, from the delayed cardioversion group, was on dabigatran after spontaneous conversion to sinus rhythm and had an event 5 days after the index visit. The other patient, from the early cardioversion group, was on rivaroxaban and had an event 10 days after electrical cardioversion. For the results of this trial to be broadly applicable, exclusion of intraatrial thrombus on transesophageal echocardiography may be necessary when the time of onset of atrial fibrillation is less certain.

There are several limitations of this study. First, the trial enrolled only 171 of the 3706 patients (4.6%) screened systematically at the 2 academic centers, while including 266 patients from 13 centers without systematic screening. The large proportion of screened patients who were excluded limits the generalizability of the results. Second, the reported incidence of recurrent atrial fibrillation within 4 weeks after randomization likely underestimates the true recurrence rate, since the trial used intermittent monitoring. Although the incidence of about 30% was similar between the 2 groups, the authors suggested that the probability of recurrence of atrial fibrillation was not affected by the management approach during the acute event. Finally, for these results to be applicable in the general population, defined treatment algorithms and access to prompt follow-up are needed, and these may not be practical in other clinical settings.2,5

Applications for Clinical Practice

The current study demonstrated that immediate cardioversion is not necessary for patients with recent-onset, symptomatic atrial fibrillation in the emergency department. Allowing a short time for spontaneous conversion to sinus rhythm is reasonable as long as the total time in atrial fibrillation is less than 48 hours. Special consideration of anticoagulation is critical because stroke has been associated with atrial fibrillation duration between 24 and 48 hours.

—Ka Ming Gordon Ngai, MD, MPH

References

1. Rozen G, Hosseini SM, Kaadan MI, et al. Emergency department visits for atrial fibrillation in the United States: trends in admission rates and economic burden from 2007 to 2014. J Am Heart Assoc. 2018;7(15):e009024.

2. Healey JS, McIntyre WF. The RACE to treat atrial fibrillation in the emergency department. N Engl J Med. 2019 Mar 18.

3. Andrade JM, Verma A, Mitchell LB, et al. 2018 Focused update of the Canadian Cardiovascular Society guidelines for the management of atrial fibrillation. Can J Cardiol. 2018;34:1371-1392.

4. Nuotio I, Hartikainen JE, Grönberg T, et al. Time to cardioversion for acute atrial fibrillation and thromboembolic complications. JAMA. 2014;312:647-649.

5. Baugh CW, Clark CL, Wilson JW, et al. Creation and implementation of an outpatient pathway for atrial fibrillation in the emergency department setting: results of an expert panel. Acad Emerg Med. 2018;25:1065-1075.

Article PDF
Issue
Journal of Clinical Outcomes Management - 26(3)
Publications
Topics
Page Number
113-114
Sections
Article PDF
Article PDF

Study Overview

Objective. To assess whether immediate restoration of sinus rhythm is necessary in hemodynamically stable, recent onset (< 36 hr), symptomatic atrial fibrillation in the emergency department.

Design. Multicenter, randomized, open-label, noninferiority trial, RACE 7 ACWAS (Rate Control versus Electrical Cardioversion Trial 7--Acute Cardioversion versus Wait and See).

Setting and participants. 15 hospitals in the Netherlands, including 3 academic hospitals, 8 nonacademic teaching hospitals, and 4 nonteaching hospitals. Patients 18 years of age or older with recent-onset (< 36 hr), symptomatic atrial fibrillation without signs of myocardial ischemia or a history of persistent atrial fibrillation who presented to the emergency department were randomized in a 1:1 ratio to either a wait-and-see approach or early cardioversion. The wait-and-see approach consisted of the administration of rate-control medication, including intravenous or oral beta-adrenergic-receptor blocking agents, nondihydropyridine calcium-channel blockers, or digoxin to achieve a heart rate of 110 beats per minute or less and symptomatic relief. Patients were then discharged with an outpatient visit scheduled for the next day and a referral for cardioversion as close as possible to 48 hours after the onset of symptoms. The cardioconversion group received pharmacologic cardioversion with flecainide unless contraindicated, then electrical cardioversion was performed.

Main outcome measures. Primary outcome was the presence of sinus rhythm on electrocardiogram (ECG) recorded at the 4-week trial visit. Secondary endpoints included the duration of the index visit at the emergency department, emergency department visits related to atrial fibrillation, cardiovascular complications, and time until recurrence of atrial fibrillation.

Main results. From October 2014 through September 2018, 437 patients underwent randomization, with 218 patients assigned to the delayed cardioversion group and 219 to the early cardioversion group. Mean age was 65 years, and a majority of the patients (60%) were men (n = 261). The primary end point of the presence of sinus rhythm on the ECG recorded at the 4-week visit was present in 193 of 212 patients (91%) in the delayed cardioversion group and in 202 of 215 patients (94%) in the early cardioversion group. The –2.9 percentage points with confidence interval [CI] –8.2 to 2.2 (P = 0.005) met the criteria for the noninferiority of the wait-and-see approach.

For secondary outcomes, the median duration of the index visit was 120 minutes (range, 60 to 253) in the delayed cardioversion group and 158 minutes (range, 110 to 228) in the early cardioversion group. The median difference between the 2 groups was 30 minutes (95% CI, 6 to 51 minutes). There was no significant difference in cardiovascular complications between the 2 groups. Fourteen of 212 patients (7%) in the delayed cardioversion group and 14 of 215 patients (7%) in the early cardioversion group had subsequent visits to the emergency department because of a recurrence of atrial fibrillation. Telemetric ECG recordings were available for 335 of the 437 patients. Recurrence of atrial fibrillation occurred in 49 of the 164 (30%) patients in the delayed cardioversion group and 50 of the 171 (29%) patients in the early cardioversion group.

In terms of treatment, conversion to sinus rhythm within 48 hours occurred spontaneously in 150 of 218 patients (69%) in the delayed cardioversion group after receiving rate-control medications only. Of the 218 patients, 61 (28%) had delayed cardioversion (9 by pharmacologic and 52 by electrical cardioversion) as per protocol and achieved sinus rhythm within 48 hours. In the early cardioversion group, conversion to sinus rhythm occurred spontaneously in 36 of 219 patients (16%) before the initiation of the cardioversion and in 171 of 219 (78%) after cardioversion (83 by pharmacologic and 88 by electrical).

 

 

Conclusion. For patients with recent-onset, symptomatic atrial fibrillation, allowing a short time for spontaneous conversion to sinus rhythm is reasonable as demonstrated by this noninferiority study.

Commentary

Atrial fibrillation accounts for nearly 0.5% of all emergency department visits, and this number is increasing.1,2 Patients commonly undergo immediate restoration of sinus rhythm by means of pharmacologic or electrical cardioversion. However, it is questionable whether immediate restoration of sinus rhythm is necessary, as spontaneous conversion to sinus rhythm occurs frequently. In addition, the safety of cardioversion between 12 and 48 hours after the onset of atrial fibrillation is questionable.3,4

In this pragmatic trial, the findings suggest that rate-control therapy alone can achieve prompt symptom relief in almost all eligible patients, had a low risk of complications, and reduced the median length of stay in the emergency department to 2 hours. Independent of cardioversion strategy, the authors stressed the importance of management of stroke risk when patients present with atrial fibrillation to the emergency department. In this trial, 2 patients had cerebral embolism even though both were started on anticoagulation in the index visit. One patient from the delayed cardioversion group was on dabigatran after spontaneous conversion to sinus rhythm and had an event 5 days after the index visit. The other patient, from the early cardioversion group, was on rivaroxaban and had an event 10 days after electrical cardiology. In order for the results of this trial to be broadly applicable, exclusion of intraatrial thrombus on transesophageal echocardiography may be necessary when the onset of atrial fibrillation is not as clear.

There are several limitations of this study. First, this study included only 171 of the 3706 patients (4.6%) screened systematically at the 2 academic centers, but included 266 from 13 centers without systematic screening. The large amount of patients excluded from the controlled environment made the results less generalizable in the broader scope. Second, the reported incidence of recurrent atrial fibrillation within 4 weeks after randomization was an underestimation of the true recurrence rate since the trial used intermittent monitoring. Although the incidence of about 30% was similar between the 2 groups, the authors suggested that the probability of recurrence of atrial fibrillation was not affected by management approach during the acute event. Finally, for these results to be applicable in the general population, defined treatment algorithms and access to prompt follow-up are needed, and these may not be practical in other clinical settings.2,5

Applications for Clinical Practice

The current study demonstrated immediate cardioversion is not necessary for patients with recent-onset, symptomatic atrial fibrillation in the emergency department. Allowing a short time for spontaneous conversion to sinus rhythm is reasonable as long as the total time in atrial fibrillation is less than 48 hours. Special consideration for anticoagulation is critical because stroke has been associated with atrial fibrillation duration between 24 and 48 hours.

—Ka Ming Gordon Ngai, MD, MPH

Study Overview

Objective. To assess whether immediate restoration of sinus rhythm is necessary in hemodynamically stable, recent onset (< 36 hr), symptomatic atrial fibrillation in the emergency department.

Design. Multicenter, randomized, open-label, noninferiority trial, RACE 7 ACWAS (Rate Control versus Electrical Cardioversion Trial 7--Acute Cardioversion versus Wait and See).

Setting and participants. 15 hospitals in the Netherlands, including 3 academic hospitals, 8 nonacademic teaching hospitals, and 4 nonteaching hospitals. Patients 18 years of age or older with recent-onset (< 36 hr), symptomatic atrial fibrillation without signs of myocardial ischemia or a history of persistent atrial fibrillation who presented to the emergency department were randomized in a 1:1 ratio to either a wait-and-see approach or early cardioversion. The wait-and-see approach consisted of the administration of rate-control medication, including intravenous or oral beta-adrenergic-receptor blocking agents, nondihydropyridine calcium-channel blockers, or digoxin to achieve a heart rate of 110 beats per minute or less and symptomatic relief. Patients were then discharged with an outpatient visit scheduled for the next day and a referral for cardioversion as close as possible to 48 hours after the onset of symptoms. The cardioconversion group received pharmacologic cardioversion with flecainide unless contraindicated, then electrical cardioversion was performed.

Main outcome measures. Primary outcome was the presence of sinus rhythm on electrocardiogram (ECG) recorded at the 4-week trial visit. Secondary endpoints included the duration of the index visit at the emergency department, emergency department visits related to atrial fibrillation, cardiovascular complications, and time until recurrence of atrial fibrillation.

Main results. From October 2014 through September 2018, 437 patients underwent randomization, with 218 patients assigned to the delayed cardioversion group and 219 to the early cardioversion group. Mean age was 65 years, and a majority of the patients (60%) were men (n = 261). The primary end point of the presence of sinus rhythm on the ECG recorded at the 4-week visit was present in 193 of 212 patients (91%) in the delayed cardioversion group and in 202 of 215 patients (94%) in the early cardioversion group. The –2.9 percentage points with confidence interval [CI] –8.2 to 2.2 (P = 0.005) met the criteria for the noninferiority of the wait-and-see approach.

For secondary outcomes, the median duration of the index visit was 120 minutes (range, 60 to 253) in the delayed cardioversion group and 158 minutes (range, 110 to 228) in the early cardioversion group. The median difference between the 2 groups was 30 minutes (95% CI, 6 to 51 minutes). There was no significant difference in cardiovascular complications between the 2 groups. Fourteen of 212 patients (7%) in the delayed cardioversion group and 14 of 215 patients (7%) in the early cardioversion group had subsequent visits to the emergency department because of a recurrence of atrial fibrillation. Telemetric ECG recordings were available for 335 of the 437 patients. Recurrence of atrial fibrillation occurred in 49 of the 164 (30%) patients in the delayed cardioversion group and 50 of the 171 (29%) patients in the early cardioversion group.

In terms of treatment, conversion to sinus rhythm within 48 hours occurred spontaneously in 150 of 218 patients (69%) in the delayed cardioversion group after receiving rate-control medications only. Of the 218 patients, 61 (28%) had delayed cardioversion (9 by pharmacologic and 52 by electrical cardioversion) as per protocol and achieved sinus rhythm within 48 hours. In the early cardioversion group, conversion to sinus rhythm occurred spontaneously in 36 of 219 patients (16%) before the initiation of the cardioversion and in 171 of 219 (78%) after cardioversion (83 by pharmacologic and 88 by electrical).

 

 

Conclusion. For patients with recent-onset, symptomatic atrial fibrillation, allowing a short time for spontaneous conversion to sinus rhythm is reasonable as demonstrated by this noninferiority study.

Commentary

Atrial fibrillation accounts for nearly 0.5% of all emergency department visits, and this proportion is increasing.1,2 Patients commonly undergo immediate restoration of sinus rhythm by means of pharmacologic or electrical cardioversion. However, it is unclear whether immediate restoration of sinus rhythm is necessary, as spontaneous conversion to sinus rhythm occurs frequently. In addition, the safety of cardioversion between 12 and 48 hours after the onset of atrial fibrillation has been called into question.3,4

In this pragmatic trial, the findings suggest that rate-control therapy alone achieved prompt symptom relief in almost all eligible patients, carried a low risk of complications, and reduced the median length of stay in the emergency department to 2 hours. Independent of cardioversion strategy, the authors stressed the importance of managing stroke risk when patients present to the emergency department with atrial fibrillation. In this trial, 2 patients had cerebral embolism even though both were started on anticoagulation at the index visit. One patient, from the delayed cardioversion group, was on dabigatran after spontaneous conversion to sinus rhythm and had an event 5 days after the index visit. The other patient, from the early cardioversion group, was on rivaroxaban and had an event 10 days after electrical cardioversion. For the results of this trial to be broadly applicable, exclusion of intraatrial thrombus on transesophageal echocardiography may be necessary when the onset of atrial fibrillation is less clear.

There are several limitations of this study. First, the study enrolled only 171 of the 3706 patients (4.6%) screened systematically at the 2 academic centers, while 266 patients came from 13 centers without systematic screening. The large proportion of screened patients who were excluded limits the generalizability of the results. Second, the reported incidence of recurrent atrial fibrillation within 4 weeks after randomization likely underestimates the true recurrence rate, since the trial used intermittent monitoring. Because the incidence of about 30% was similar between the 2 groups, the authors suggested that the probability of recurrence of atrial fibrillation was not affected by the management approach during the acute event. Finally, for these results to be applicable in the general population, defined treatment algorithms and access to prompt follow-up are needed, and these may not be practical in other clinical settings.2,5

Applications for Clinical Practice

The current study demonstrated that immediate cardioversion is not necessary for patients with recent-onset, symptomatic atrial fibrillation in the emergency department. Allowing a short time for spontaneous conversion to sinus rhythm is reasonable as long as the total time in atrial fibrillation remains less than 48 hours. Careful attention to anticoagulation is critical, because stroke has been associated with atrial fibrillation durations between 24 and 48 hours.

—Ka Ming Gordon Ngai, MD, MPH

References

1. Rozen G, Hosseini SM, Kaadan MI, et al. Emergency department visits for atrial fibrillation in the United States: trends in admission rates and economic burden from 2007 to 2014. J Am Heart Assoc. 2018;7(15):e009024.

2. Healey JS, McIntyre WF. The RACE to treat atrial fibrillation in the emergency department. N Engl J Med. 2019 Mar 18.

3. Andrade JM, Verma A, Mitchell LB, et al. 2018 Focused update of the Canadian Cardiovascular Society guidelines for the management of atrial fibrillation. Can J Cardiol. 2018;34:1371-1392.

4. Nuotio I, Hartikainen JE, Grönberg T, et al. Time to cardioversion for acute atrial fibrillation and thromboembolic complications. JAMA. 2014;312:647-649.

5. Baugh CW, Clark CL, Wilson JW, et al. Creation and implementation of an outpatient pathway for atrial fibrillation in the emergency department setting: results of an expert panel. Acad Emerg Med. 2018;25:1065-1075.


Does Vitamin D Supplementation Improve Lower Extremity Power and Function in Community-Dwelling Older Adults?


Study Overview

Objective. To test the effect of 12 months of vitamin D supplementation on lower-extremity power and function in older community-dwelling adults screened for low serum 25-hydroxyvitamin D (25(OH)D).

Design. A single-center, double-blind, randomized placebo-controlled study in which participants were assigned to 800 IU of vitamin D3 supplementation or placebo daily and were followed over a total period of 12 months.

Setting and participants. A total of 100 community-dwelling men and women aged ≥ 60 years with serum 25(OH)D ≤ 20 ng/mL at screening participated. Participants were prescreened by phone, and were excluded if they met any of the following exclusion criteria: vitamin D supplement use > 600 IU/day (for age 60-70 years) or > 800 IU/day (for age ≥ 71 years); vitamin D injection within the previous 3 months; > 2 falls or 1 fall with injury in past year; use of cane, walker, or other indoor walking aid; history of kidney stones within past 3 years; hypercalcemia (serum calcium > 10.8 mg/dL); renal dysfunction (glomerular filtration rate, < 30 mL/min); history of liver disease, sarcoidosis, lymphoma, dysphagia, or other gastrointestinal disorder; neuromuscular disorder affecting lower-extremity function; hip replacement within the past year; cancer treatment in the past 3 years; treatment with thiazide diuretics > 37.5 mg, teriparatide, denosumab, or bisphosphonates within the past 2 years; oral steroids (for > 3 weeks in the past 6 months); and use of fat malabsorption products or anticonvulsive therapy.

Main outcome measures. The primary outcome was leg extensor power assessed using a computer-interfaced bilateral Keiser pneumatic leg press. Secondary outcomes to measure physical function included: (1) the backward tandem walk test (an indicator of balance and postural control during movement1); (2) Short Physical Performance Battery (SPPB) testing, which includes a balance assessment (ability to stand with feet positioned side-by-side, semi-tandem, and tandem for 10 seconds), a timed 4-m walk, and a chair stand test (time to complete 5 repeated chair stands); (3) stair climbing (ie, time to climb 10 steps, as a measure of knee extensor strength and functional capacity); and (4) handgrip strength (using a dynamometer). Lean tissue mass was assessed by dual-energy X-ray absorptiometry (DEXA). Finally, other measures included serum total 25(OH)D levels measured at baseline and at 4, 8, and 12 months, as well as 24-hour urine collection for urea-nitrogen and creatinine measurements.

Main results. Of the 2289 individuals screened for the study, 100 met eligibility criteria and underwent randomization to receive either 800 IU vitamin D supplementation daily (n = 49) or placebo (n = 51). Three patients (2 in the vitamin D group and 1 in the placebo group) were lost to follow-up. The mean age of all participants was 69.6 ± 6.9 years. The male:female ratio was 66%:34% in the vitamin D group versus 63%:37% in the control group, and the proportion of Caucasian participants was 75% versus 82%, respectively. Mean body mass index was 28.2 ± 7.0 kg/m2, and mean serum 25(OH)D was 20.2 ± 6.7 ng/mL. At the end of the study (12 months), 70% of participants given vitamin D supplementation had 25(OH)D levels ≥ 30 ng/mL and all had levels ≥ 20 ng/mL. In the placebo group, the serum 25(OH)D level was ≥ 20 ng/mL in 54% and ≥ 30 ng/mL in 6%. The mean serum 25(OH)D level increased to 32.5 ± 5.1 ng/mL in the vitamin D–supplemented group, with no significant change in the placebo group (treatment × time, P < 0.001). Overall, serum 1,25(OH)2D3 levels did not differ between the 2 groups over the intervention period (time, P = 0.49; treatment × time, P = 0.27). Dietary intake of vitamin D, calcium, nitrogen, and protein did not differ or change over time between the 2 groups. The change in leg press power, function, and strength did not differ between the groups over 12 months (all treatment × time P values ≥ 0.60). A total of 27 falls were reported (14 in the vitamin D group versus 9 in the control group), of which 9 were associated with injuries. There was no significant change in lean body mass at the end of the study period in either group (treatment × time, P = 0.98).
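
The repeated "treatment × time" P values above refer to tests of the treatment-by-time interaction in a longitudinal model of the repeated measurements. As a minimal sketch of how such a test is commonly run, the snippet below fits a linear mixed model with a random intercept per participant; the simulated data, column names, and modeling choices are illustrative assumptions, not the authors' actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for subject in range(40):                      # hypothetical participants
    arm = "vitD" if subject < 20 else "placebo"
    base = rng.normal(180, 20)                 # made-up baseline leg power
    for month in (0, 4, 8, 12):                # the study's visit schedule
        # No true interaction is built in: both arms drift identically.
        rows.append((subject, arm, month,
                     base + 0.2 * month + rng.normal(0, 5)))
df = pd.DataFrame(rows, columns=["subject", "treatment", "month", "power"])

# Random intercept per subject; the treatment:month coefficient is the
# treatment-by-time interaction whose P value the summary reports.
fit = smf.mixedlm("power ~ treatment * month", df,
                  groups=df["subject"]).fit()
print(fit.pvalues.filter(like=":"))
```

With no built-in interaction, the treatment:month P value should be large, mirroring the null leg-power result reported above.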

Conclusion. In community-dwelling older adults with vitamin D deficiency (≤ 20 ng/mL), 12-month daily supplementation with 800 IU of vitamin D3 resulted in sufficient increases in serum 25(OH)D levels, but did not improve lower-extremity power, strength, or lean mass.

Commentary

Vitamin D deficiency is common in older adults (prevalence of about 41% in US adults ≥ 65 years old, according to Forrest et al2) and is likely due to dietary deficiency, reduced sun exposure (lifestyle), and decreased intestinal calcium absorption. As such, vitamin D deficiency has long been a topic of debate and of interest in geriatric medicine, as it relates to muscle weakness, which in turn leads to increased susceptibility to falls.3 Notably, vitamin D receptors are expressed in human skeletal muscle,4 and in one study, 3 months of vitamin D supplementation led to an increase in type II skeletal muscle fibers in older women.5 Moreover, a meta-analysis of 5 randomized controlled trials (RCTs)6 showed that vitamin D supplementation may reduce fall risk in older adults by 22% (corrected odds ratio, 0.78; 95% confidence interval, 0.64-0.92). In keeping with this general theme of vitamin D supplementation yielding beneficial clinical effects, clinicians have long accepted and practiced routine vitamin D supplementation in caring for older adults.

In more recent years, the role of vitamin D supplementation in primary care has become controversial,7 as reflected in a recent paradigm shift away from routine supplementation for fall and fracture prevention in clinical practice.8 In a recent meta-analysis of 33 RCTs in older community-dwelling adults, supplementation with vitamin D, with or without calcium, did not reduce hip fractures or total fractures.9 Moreover, the United States Preventive Services Task Force (USPSTF) recently published updated recommendations on vitamin D supplementation for primary prevention of fractures10 and prevention of falls11 in community-dwelling adults. The USPSTF indicated that insufficient evidence exists to recommend vitamin D supplementation to prevent fractures in men and premenopausal women, and it recommends against vitamin D supplementation for prevention of falls. Finally, the USPSTF recommends against low-dose vitamin D (400 IU or less) supplementation for primary prevention of fractures in community-dwelling, postmenopausal women.10 These statements do not apply, however, to individuals with a prior history of osteoporotic fractures, increased risk of falls, or a diagnosis of vitamin D deficiency or osteoporosis. Therefore, vitamin D supplementation for prevention of falls and fractures should be practiced with caution.

Vitamin D supplementation is no longer routinely recommended for fall and fracture prevention. However, if we believe that poor lower extremity muscle strength is a risk factor for falls,12 then the question of whether vitamin D has a beneficial role in improving lower extremity strength in older adults needs to be addressed. Results regarding the effect of vitamin D supplementation on muscle function have so far been mixed. For example, in a randomized, double-blinded, placebo-controlled trial of 160 postmenopausal women with low vitamin D level (< 20 ng/mL), vitamin D3 supplementation at 1000 IU/day for 9 months showed a significant increase in lower extremity muscle strength.13 However, in another randomized double-blinded, placebo-controlled trial of 130 men aged 65 to 90 years with low vitamin D level (< 30 ng/mL) and an SPPB score of ≤ 9 (mild-moderate limitation in mobility), daily supplementation with 4000 IU of vitamin D3 for 9 months did not result in improved SPPB score or gait speed.14 In the study reported by Shea et al, the authors showed that 800 IU of daily vitamin D supplementation (consistent with the Institute of Medicine [IOM] recommendations for older adults15) in community-dwelling older adults with vitamin D deficiency (< 20 ng/mL) did not improve lower extremity muscle strength. This finding is significant in that it adds further evidence to support the rationale against using vitamin D supplementation for the sole purpose of improving lower extremity muscle function in older adults with vitamin D deficiency.

Valuable strengths of this study include its randomized, double-blinded, placebo-controlled design testing the IOM-recommended dose of daily vitamin D supplementation for older adults. In addition, compared with some of the prior studies mentioned above, the study population included both men and women, although the final sample was imbalanced by sex (male predominance). Moreover, participants were followed for a sufficient amount of time (1 year), with excellent retention (only 3 were lost to follow-up) and with a corresponding improvement in vitamin D levels. Finally, the use of the SPPB to assess physical function should also be commended, as it is a well-validated measure of lower extremity function whose scaled scores predict poor outcomes.16 However, limitations include the aforementioned predominance of male and Caucasian participants in both the intervention and control groups, as well as discrepancies between the measurement methods for serum vitamin D levels (ie, finger-stick cards versus clinical laboratory measurement) that may have underestimated the actual serum 25(OH)D levels.

Applications for Clinical Practice

While the null findings from the Shea and colleagues study are applicable to healthier community-dwelling older adults, they may not be generalizable to the care of more frail older patients due to their increased risks for falls and high vulnerability to adverse outcomes. Thus, further studies that account for baseline sarcopenia, frailty, and other fall-risk factors (eg, polypharmacy) are needed to better evaluate the value of vitamin D supplementation in this most vulnerable population.

Caroline Park, MD, PhD, and Fred Ko, MD
Icahn School of Medicine at Mount Sinai, New York, NY

References

1. Husu P, Suni J, Pasanen M, Miilunpalo S. Health-related fitness tests as predictors of difficulties in long-distance walking among high-functioning older adults. Aging Clin Exp Res. 2007;19:444-450.

2. Forrest KYZ, Stuhldreher WL. Prevalence and correlates of vitamin D deficiency in US adults. Nutr Res. 2011;31:48-54.

3. Bischoff-Ferrari HA, Giovannucci E, Willett WC, et al. Estimation of optimal serum concentrations of 25-hydroxyvitamin D for multiple health outcomes. Am J Clin Nutr. 2006;84:1253.

4. Simpson RU, Thomas GA, Arnold AJ. Identification of 1,25-dihydroxyvitamin-D3 receptors and activities in muscle. J Biol Chem. 1985;260:8882-8891.

5. Sorensen OH, Lund BI, Saltin B, et al. Myopathy in bone loss of aging - improvement by treatment with 1alpha-hydroxycholecalciferol and calcium. Clin Sci (Lond). 1979;56:157-161.

6. Bischoff-Ferrari HA, Dawson-Hughes B, Willett WC, et al. Effect of vitamin D on falls - A meta-analysis. JAMA. 2004;291:1999-2006.

7. Lewis JR, Sim M, Daly RM. The vitamin D and calcium controversy: an update. Curr Opin Rheumatol. 2019;31:91-97.

8. Schwenk T. No value for routine vitamin D supplementation. NEJM Journal Watch. December 26, 2018.

9. Zhao JG, Zeng XT, Wang J, Liu L. Association between calcium or vitamin D supplementation and fracture incidence in community-dwelling older adults: a systematic review and meta-analysis. JAMA. 2017;318:2466-2482.

10. Grossman DC, Curry SJ, Owens DK, et al. Vitamin D, calcium, or combined supplementation for the primary prevention of fractures in community-dwelling adults US Preventive Services Task Force Recommendation Statement. JAMA. 2018;319:1592-1599.

11. Grossman DC, Curry SJ, Owens DK, et al. Interventions to prevent falls in community-dwelling older adults US Preventive Services Task Force Recommendation Statement. JAMA. 2018;319:1696-1704.

12. Tinetti ME, Speechley M, Ginter SF. Risk-factors for falls among elderly persons living in the community. N Engl J Med. 1988;319:1701-1707.

13. Cangussu LM, Nahas-Neto J, Orsatti CL, et al. Effect of vitamin D supplementation alone on muscle function in postmenopausal women: a randomized, double-blind, placebo-controlled clinical trial. Osteoporos Int. 2015;26:2413-2421.

14. Levis S, Gomez-Marin O. Vitamin D and physical function in sedentary older men. J Am Geriatr Soc. 2017;65:323-331.

15. Ross AC, Taylor CL, Yaktine AL, Del Valle HB, eds; Institute of Medicine (US) Committee to Review Dietary Reference Intakes for Vitamin D and Calcium. Dietary Reference Intakes for Calcium and Vitamin D. Washington, DC: National Academies Press; 2011.

16. Guralnik JM, Ferrucci L, Simonsick EM, et al. Lower-extremity function in persons over the age of 70 years as a predictor of subsequent disability. N Engl J Med. 1995;332:556-561.


Once-Daily 2-Drug versus 3-Drug Antiretroviral Therapy for HIV Infection in Treatment-naive Adults: Less Is Best?


Study Overview

Objective. To evaluate the efficacy and safety of a once-daily 2-drug antiretroviral (ARV) regimen, dolutegravir plus lamivudine, for the treatment of HIV-1 infection in adults naive to antiretroviral therapy (ART).

Design. GEMINI-1 and GEMINI-2 were 2 identically designed multicenter, double-blind, randomized, noninferiority, phase 3 clinical trials that enrolled participants between July 18, 2016 and March 31, 2017. Participants were randomized to receive 1 of 2 once-daily HIV regimens: the study regimen, consisting of dolutegravir 50 mg plus lamivudine 300 mg, or the standard-of-care regimen, consisting of dolutegravir 50 mg plus tenofovir disoproxil fumarate (TDF) 300 mg plus emtricitabine 200 mg. While this article presents results at week 48, both trials are scheduled to follow participants through week 148 to assess long-term efficacy and safety.

Setting and participants. Eligible participants had to be aged 18 years or older with treatment-naive HIV-1 infection. Women were eligible if they were not pregnant or lactating and either were not of reproductive potential (eg, because of tubal ligation, hysterectomy, or postmenopausal status) or were using highly effective contraception. Initially, eligibility screening restricted participation to those with viral loads between 1000 and 100,000 copies/mL. However, the upper limit was later increased to 500,000 copies/mL based on an independent review of results from other clinical trials1,2 evaluating dual therapy with dolutegravir and lamivudine, which indicated efficacy in patients with viral loads up to 500,000 copies/mL.3-5

Notable exclusion criteria included: (1) major mutations to nucleoside reverse transcriptase inhibitors, non-nucleoside reverse transcriptase inhibitors, and protease inhibitors; (2) evidence of hepatitis B infection; (3) hepatitis C infection with anticipation of initiating treatment within 48 weeks of study enrollment; and (4) stage 3 HIV disease, per Centers for Disease Control and Prevention criteria, with the exception of cutaneous Kaposi sarcoma and CD4 cell counts < 200 cells/mL.

Main outcome measures. The primary endpoint was demonstration of noninferiority of the 2-drug ARV regimen through assessment of the proportion of participants who achieved virologic suppression at week 48 in the intent-to-treat-exposed population. For the purposes of this study, virologic suppression was defined as having fewer than 50 copies of HIV-1 RNA per mL at week 48. For evaluation of safety and toxicity concerns, renal and bone biomarkers were assessed at study entry and at weeks 24 and 48. In addition, participants who met virological withdrawal criteria were evaluated for integrase strand transfer inhibitor mutations. Virological withdrawal was defined as the presence of 1 of the following: (1) HIV RNA > 200 copies/mL at week 24, (2) HIV RNA > 200 copies/mL after previous HIV RNA < 200 copies/mL (confirmed rebound), and (3) a < 1 log10 copies/mL decrease from baseline (unless already < 200 copies/mL).
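
The three virological-withdrawal criteria combine into a simple rule. A sketch encoding them follows; the function and parameter names are mine, for illustration only.

```python
from math import log10

def meets_virological_withdrawal(week: int,
                                 vl_now: float,
                                 vl_baseline: float,
                                 prior_vl_below_200: bool) -> bool:
    """Encode the trial's stated withdrawal criteria:
    (1) HIV RNA > 200 copies/mL at week 24;
    (2) confirmed rebound to > 200 copies/mL after a value < 200;
    (3) a < 1 log10 decrease from baseline, unless already < 200 copies/mL.
    """
    if week == 24 and vl_now > 200:
        return True
    if prior_vl_below_200 and vl_now > 200:
        return True
    insufficient_drop = (log10(vl_baseline) - log10(vl_now)) < 1
    return insufficient_drop and vl_now >= 200

# Example: baseline 50,000 copies/mL, only 20,000 at week 12 (< 1 log10
# decline and still >= 200 copies/mL) -> meets withdrawal criteria.
print(meets_virological_withdrawal(12, 20_000, 50_000, False))  # True
```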

Main results. GEMINI-1 and GEMINI-2 randomized a combined total of 1441 participants to receive either the once-daily 2-drug ARV regimen (dolutegravir and lamivudine, n = 719) or the once-daily 3-drug ARV regimen (dolutegravir, TDF, and emtricitabine, n = 722). Of the 533 participants who did not meet inclusion criteria, the predominant reasons for exclusion were either having preexisting major viral resistance mutations (n = 246) or viral loads outside the range of 1000 to 500,000 copies/mL (n = 133).

Baseline demographic and clinical characteristics were similar between both groups. The median age was 33 years (10% were over 50 years of age), and participants were mostly male (85%) and white (68%). Baseline HIV RNA counts of > 100,000 copies/mL were found in 293 participants (20%), and 188 (8%) participants had CD4 counts of ≤ 200 cells/mL.

Noninferiority of the once-daily 2-drug versus the once-daily 3-drug ARV regimen was demonstrated in both the GEMINI-1 and GEMINI-2 trials for the intent-to-treat-exposed population. In GEMINI-1, 90% (n = 320) in the 2-drug ARV group achieved virologic suppression at week 48 compared to 93% (n = 332) in the 3-drug ARV group (no statistically significant difference). In GEMINI-2, 93% (n = 335) in the 2-drug ARV group achieved virologic suppression at week 48 compared to 94% (n = 337) in the 3-drug ARV group (no statistically significant difference).

A subgroup analysis found no significant impact of baseline HIV RNA (> 100,000 compared to ≤ 100,000 copies/mL) on achieving virologic suppression at week 48. However, participants with CD4 counts < 200 cells/mL had a lower rate of virologic response at week 48 with the once-daily 2-drug than with the 3-drug ARV regimen (79% versus 93%, respectively).

Overall, 10 participants met virological withdrawal criteria during the study period, 4 of whom were on the 2-drug ARV regimen. For these 10 participants, genotypic testing did not find emergent resistance to either nucleoside reverse transcriptase inhibitors or integrase strand transfer inhibitors.

Regarding renal biomarkers, increases in both serum creatinine and the urinary protein-to-creatinine ratio were significantly greater in the 3-drug ARV group. Also, biomarkers of bone turnover were elevated in both groups, but the degree of elevation was significantly lower with the 2-drug ARV regimen. It is unclear whether these findings reflect an increased or decreased risk of developing osteopenia or osteoporosis in the 2 study groups.

Conclusion. The once-daily 2-drug ARV regimen dolutegravir and lamivudine is noninferior to the guideline-recommended once-daily 3-drug ARV regimen dolutegravir, TDF, and emtricitabine at achieving viral suppression in ART-naive HIV-1 infected individuals with HIV RNA counts < 500,000 copies/mL. However, the efficacy of this ARV regimen may be compromised in individuals with CD4 counts < 200 cells/mL.

Commentary

Currently, the mainstay of HIV pharmacotherapy is a 3-drug regimen consisting of 2 nucleoside reverse transcriptase inhibitors in combination with 1 drug from another class, with an integrase strand transfer inhibitor being the preferred third drug.6 Despite the improved tolerability of contemporary ARVs, there remains concern among HIV practitioners regarding potential toxicities associated with cumulative drug exposure, specifically related to nucleoside reverse transcriptase inhibitors. As a result, there has been much interest in evaluating 2-drug ARV regimens for HIV treatment in order to reduce overall drug exposure.7-10

The 48-week results of the GEMINI-1 and GEMINI-2 trials, published in early 2019, further expand our understanding regarding the efficacy and safety of 2-drug regimens in HIV treatment. These identically designed studies evaluated once-daily dolutegravir and lamivudine for HIV in a treatment-naive population. This goes a step further than the SWORD-1 and SWORD-2 trials, which evaluated once-daily dolutegravir and rilpivirine as a step-down therapy for virologically suppressed individuals and led to the U.S. Food and Drug Administration (FDA) approval of the single-tablet combination regimen dolutegravir/rilpivirine (Juluca).10 Therefore, whereas the SWORD trials evaluated a 2-drug regimen for maintenance of virologic suppression, the GEMINI trials assessed whether a 2-drug regimen can both achieve and maintain virologic suppression.

The results of the GEMINI trials are promising for a future direction in HIV care. The rates of virologic suppression achieved in these trials are comparable to those seen in the SWORD trials.10 Furthermore, the virologic response seen in the GEMINI trials is comparable to that seen in similar trials that evaluated a 3-drug ARV regimen consisting of an integrase strand transfer inhibitor–based backbone in ART-naive individuals.11,12

A notable limitation of the trial design was the inclusion of TDF as a component of the comparator arm, an agent already demonstrated to have detrimental effects on both renal and bone health.13,14 Additionally, the bone biomarker results were inconclusive; the agents' effects on bone would have been better demonstrated through bone mineral density testing, as in prior trials.

Applications for Clinical Practice

Given the recent FDA approval of the single-tablet combination regimen dolutegravir and lamivudine (Dovato), this once-daily 2-drug ARV regimen will begin making its way into clinical practice for certain patients. Prior to starting this regimen, hepatitis B infection must first be ruled out because of the poor efficacy of lamivudine monotherapy for management of chronic hepatitis B infection.15 Additionally, baseline genotype testing should be performed before starting this ART, given that approximately 10% of newly diagnosed HIV patients have baseline resistance mutations.16 Rapid genotype testing may be difficult to obtain in low-resource settings. Finally, this approach may not be applicable to those presenting with acute HIV infection, in whom viral loads are often in the millions of copies per mL. It is likely that dolutegravir/lamivudine could assume a role similar to that of dolutegravir/rilpivirine, with patients who present with acute HIV infection stepping down to a 2-drug regimen once their viral loads have dropped below 500,000 copies/mL or have been suppressed.
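
The practice points above amount to a short pre-prescribing checklist; the sketch below gathers them into one function. The field names, the thresholds drawn from the discussion (HBV status, baseline genotype, the GEMINI viral-load ceiling), and the function itself are illustrative, not a validated clinical tool.

```python
def dtg_3tc_checklist(hbv_ruled_out: bool,
                      genotype_done: bool,
                      resistance_detected: bool,
                      viral_load: float) -> list[str]:
    """Illustrative checks before starting dolutegravir/lamivudine,
    mirroring the cautions discussed above (not clinical guidance)."""
    blockers = []
    if not hbv_ruled_out:
        blockers.append("rule out hepatitis B: lamivudine monotherapy "
                        "is inadequate for chronic HBV")
    if not genotype_done:
        blockers.append("obtain baseline genotype: ~10% of new "
                        "diagnoses carry resistance mutations")
    elif resistance_detected:
        blockers.append("baseline resistance detected: choose another regimen")
    if viral_load >= 500_000:
        blockers.append("viral load >= 500,000 copies/mL exceeds the "
                        "range studied in the GEMINI trials")
    return blockers

print(dtg_3tc_checklist(True, True, False, 80_000))  # [] -> no blockers
```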

—Evan K. Mallory, PharmD, Banner-University Medical Center Tucson, and Norman L. Beatty, MD, University of Arizona College of Medicine, Tucson, AZ

References

1. Cahn P, Rolón MJ, Figueroa MI, et al. Dolutegravir-lamivudine as initial therapy in HIV-1 infected, ARV-naive patients, 48-week results of the PADDLE (Pilot Antiretroviral Design with Dolutegravir LamivudinE) study. J Int AIDS Soc. 2017;20:21678.

2. Taiwo BO, Zheng L, Stefanescu A, et al. ACTG A5353: a pilot study of dolutegravir plus lamivudine for initial treatment of human immunodeficiency virus-1 (HIV-1)-infected participants with HIV-1 RNA < 500,000 copies/mL. Clin Infect Dis. 2018;66:1689-1697.

3. Min S, Sloan L, DeJesus E, et al. Antiviral activity, safety, and pharmacokinetics/pharmacodynamics of dolutegravir as 10-day monotherapy in HIV-1-infected adults. AIDS. 2011;25:1737-1745.

4. Eron JJ, Benoit SL, Jemsek J, et al. Treatment with lamivudine, zidovudine, or both in HIV-positive patients with 200 to 500 CD4+ cells per cubic millimeter. North American HIV Working Party. N Engl J Med. 1995;333:1662-1669.

5. Kuritzkes DR, Quinn JB, Benoit SL, et al. Drug resistance and virologic response in NUCA 3001, a randomized trial of lamivudine (3TC) versus zidovudine (ZDV) versus ZDV plus 3TC in previously untreated patients. AIDS. 1996;10:975-981.

6. Department of Health and Human Services. Panel on Antiretroviral Guidelines for Adults and Adolescents. Guidelines for the use of antiretroviral agents in adults and adolescents living with HIV. http://aidsinfo.nih.gov/contentfiles/lvguidelines/AdultandAdolescentGL.pdf. Accessed April 1, 2019.

7. Riddler SA, Haubrich R, DiRienzo AG, et al. Class-sparing regimens for initial treatment of HIV-1 infection. N Engl J Med. 2008;358:2095-2106.

8. Reynes J, Lawal A, Pulido F, et al. Examination of noninferiority, safety, and tolerability of lopinavir/ritonavir and raltegravir compared with lopinavir/ritonavir and tenofovir/emtricitabine in antiretroviral-naïve subjects: the PROGRESS study, 48-week results. HIV Clin Trials. 2011;12:255-267.

9. Cahn P, Andrade-Villanueva J, Arribas JR, et al. Dual therapy with lopinavir and ritonavir plus lamivudine versus triple therapy with lopinavir and ritonavir plus two nucleoside reverse transcriptase inhibitors in antiretroviral-therapy-naive adults with HIV-1 infection: 48 week results of the randomised, open label, non-inferiority GARDEL trial. Lancet Infect Dis. 2014;14:572-580.

10. Llibre JM, Hung CC, Brinson C, et al. Efficacy, safety, and tolerability of dolutegravir-rilpivirine for the maintenance of virological suppression in adults with HIV-1: phase 3, randomised, non-inferiority SWORD-1 and SWORD-2 studies. Lancet. 2018;391:839-849.

11. Walmsley SL, Antela A, Clumeck N, et al. Dolutegravir plus abacavir-lamivudine for the treatment of HIV-1 infection. N Engl J Med. 2013;369:1807-1818.

12. Sax PE, Wohl D, Yin MT, et al. Tenofovir alafenamide versus tenofovir disoproxil fumarate, coformulated with elvitegravir, cobicistat, and emtricitabine, for initial treatment of HIV-1 infection: two randomised, double-blind, phase 3, non-inferiority trials. Lancet. 2015;385:2606-2615.

13. Mulligan K, Glidden DV, Anderson PL, et al. Effects of emtricitabine/tenofovir on bone mineral density in HIV-negative persons in a randomized, double-blind, placebo-controlled trial. Clin Infect Dis. 2015;61:572-580.

14. Cooper RD, Wiebe N, Smith N, et al. Systematic review and meta-analysis: renal safety of tenofovir disoproxil fumarate in HIV-infected patients. Clin Infect Dis. 2010;51:496-505.

15. Terrault NA, Lok ASF, McMahon BJ, et al. Update on prevention, diagnosis, and treatment of chronic hepatitis B: AASLD 2018 hepatitis B guidance. Hepatology. 2018;67:1560-1599.

16. Kim D, Wheeler W, Ziebell R, et al. Prevalence of antiretroviral drug resistance among newly diagnosed HIV-1 infected persons, United States, 2007. 17th Conference on Retroviruses & Opportunistic Infections; San Francisco, CA: 2010 Feb 16-19. Abstract 580.



FDA launches call center project to streamline Expanded Access request process

Article Type
Changed
Mon, 06/03/2019 - 18:56

CHICAGO – The Food and Drug Administration launched a new call center project to assist physicians seeking to help cancer patients access unapproved therapies.

Dr. Richard Pazdur (MDedge/Neil Osterweil)

Dubbed “Project Facilitate,” the program aims to create a single point of contact with FDA oncology staff who can guide physicians through the process of submitting Expanded Access (EA) requests on behalf of individual patients.

“This is a pilot program to provide continuous support to health care professionals throughout the entire Expanded Access process,” Richard Pazdur, MD, director of the FDA’s Oncology Center of Excellence and acting director of the Office of Hematology and Oncology Products, said in unveiling the project at a press briefing at the annual meeting of the American Society of Clinical Oncology.

Physicians utilizing Project Facilitate can expect a “concierge service” experience, including advice on the information needed to complete requests, assistance completing forms, pharma/biotech contact information, institutional review board resource options, and follow-up on patient outcomes.

The project will work in synergy with the Reagan-Udall EA Navigator website, an “online road map” for physicians and patients that was launched 2 years ago “to facilitate and coordinate and collaborate with the FDA to advance the science mission of FDA,” and which has been expanded in conjunction with Project Facilitate, Ellen V. Sigal, PhD, chair of the board of the Reagan-Udall Foundation for the FDA, said at the press briefing.

“EA Navigator delivers transparent, concise, and searchable information provided by companies about their Expanded Access policies,” Dr. Sigal said. “Today I’m pleased to announce that the Navigator now features Expanded Access opportunities listed in ClinicalTrials.gov for companies in the directory.

“For the first time, those who need quick access to drug availability and Expanded Access options will find it in one place without having to visit site by site by site, or sift through thousands of studies that don’t meet their needs,” she added, noting that EA Navigator will often be the first step for physicians before they engage with Project Facilitate.

Project Facilitate can be reached Monday-Friday, 9 a.m.-5 p.m. ET at 240-402-0004, or by email at OncProjectFacilitate@fda.hhs.gov.
