Ursodiol Plus Methotrexate Effective Long Term in Biliary Cirrhosis

Combination therapy consisting of ursodiol and either methotrexate or colchicine for primary biliary cirrhosis showed long-term effectiveness, lasting up to 20 years, wrote Dr. John Leung and his colleagues in the September issue of Clinical Gastroenterology and Hepatology (doi:10.1016/j.cgh.2011.05.010).

Dr. Leung, of the department of gastroenterology at Tufts Medical Center, Boston, and his colleagues studied 29 patients with primary biliary cirrhosis, a chronic progressive disease thought to have an autoimmune etiology. The patients were originally part of an 85-patient, double-blind, prospective, randomized controlled trial comparing colchicine and methotrexate from 1988 to 2000, with ursodiol (ursodeoxycholic acid) added to that regimen 3 years after study initiation.

The patients examined in the current study had completed all 10 years of follow-up in the original trial. At completion, "the randomization code was broken and these 29 patients were treated according to their clinical response, personal preference, and tolerance to therapy."

They were then followed for an additional 9-13 years, either at the authors’ institution (21 patients) or via telephone calls and e-mail correspondence with referring physicians.

All patients except one were female, and the median age at the end of the initial 10-year randomized controlled trial (RCT) was 59 years.

According to the authors, of the 29 patients followed for 20 years, "Twenty-one patients are alive and well. Of these, 19 have normal tests of liver function and no signs of portal hypertension."

The outcomes were then analyzed by specific treatment regimen. Of the 11 patients on methotrexate plus ursodiol, "2 died of causes unrelated to liver disease at the age of 79 and 70, and 9 are alive and well," reported the authors. All nine of these patients have normal serum levels of alanine transaminase (ALT), aspartate transaminase (AST), alkaline phosphatase, and bilirubin.

Additionally, albumin levels have remained normal in eight patients; the ninth patient entered the initial trial 20 years ago with stage III biliary cirrhosis and now has a slightly decreased albumin level of 3.3 g/dL, along with portal hypertension and nonbleeding, grade 2 varices. This patient remains asymptomatic, however. Another patient receiving this regimen also developed grade 2 esophageal varices, also without bleeding.

There were 18 patients in the colchicine plus ursodiol group, and 12 were alive and well at the end of follow-up, reported the investigators. Of the remaining six, "three died of liver-unrelated causes at the age of 73, 76 and 76 respectively," wrote the authors. "They had normal biochemical tests and two had small esophageal varices at the end of the RCT."

Two patients with elevated liver enzymes at the end of the RCT underwent liver transplant; a third developed varices but was not a candidate for transplant, and died of pneumonia.

Overall, the investigators reported no treatment-related adverse events at the conclusion of follow-up.

"The results of this current study provide further evidence that combination therapy with [ursodiol], colchicine and [methotrexate] is durable and improves the natural history of primary biliary cirrhosis in a subset of primary biliary cirrhosis patients, including some who had histologically advanced liver disease at diagnosis," concluded the authors.

And while they conceded that it is impossible to determine whether the benefits were attributable to ursodiol, methotrexate, or colchicine alone, or to the combination, "the observations that liver function remained normal for 10 additional years and that very few patients developed portal hypertension suggests that combination therapy may have been effective."

Dr. Leung and his colleagues declared no outside funding for this study and no personal conflicts of interest.

Article Source

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Vitals

Major Finding: Of 29 patients with primary biliary cirrhosis receiving ursodiol plus either methotrexate or colchicine, 21 were alive and well after 20 years of follow-up, with 19 having normal liver function tests and no signs of portal hypertension.

Data Source: A retrospective, observational study conducted at a single tertiary center.

Disclosures: The authors declared no outside funding for this study and no personal conflicts of interest.

Novel Therapy for Iron Deficiency Effective in IBD

A novel, fixed-dose intravenous ferric carboxymaltose regimen was more effective than was Ganzoni-calculated iron sucrose at treating iron-deficiency anemia, according to a report by Dr. Rayko Evstatiev and colleagues in the September issue of Gastroenterology.

The regimen, which included a maximum of three infusions, was well tolerated, "and the lower number of infusions increases the convenience and cost-effectiveness of intravenous iron repletion in patients with iron deficiency," wrote the authors (Gastroenterology 2011 September [doi:10.1053/j.gastro.2011.06.005]).

Dr. Evstatiev, of the Medical University of Vienna, and colleagues studied 485 patients with iron-deficiency anemia, defined as hemoglobin of 7-12 g/dL for females and 7-13 g/dL for males, along with ferritin values of less than 100 mcg/L.

All patients also had mild to moderate inflammatory bowel disease – either Crohn’s disease with a Crohn’s Disease Activity Index of less than 220 or ulcerative colitis with a Colitis Activity Index less than or equal to 7.

Exclusion criteria were administration of intravenous or oral iron treatment or blood transfusion within 4 weeks of screening, or a history of erythropoietin treatment, chronic alcohol abuse, or chronic liver disease.

Patients were randomized in a 1:1 ratio to receive either ferric carboxymaltose or the standard, individually calculated iron sucrose.

The intravenous ferric carboxymaltose, marketed as Ferinject by Vifor Pharma, the study’s sponsor, was administered in once-weekly infusions over at least 15 minutes.

In the ferric carboxymaltose group, patients with an initial hemoglobin of 10 g/dL or higher received a cumulative total dose of 1,000 mg if they weighed less than 70 kg, or 1,500 mg if they weighed 70 kg or more.

Likewise, patients with an initial hemoglobin of 7-10 g/dL received a total dose of 1,500 mg if they weighed less than 70 kg, raised to 2,000 mg for those weighing 70 kg or more.

Patients weighing less than 67 kg received a maximum of 500 mg per infusion, such that low-hemoglobin, low-weight patients could receive up to three total infusions, on days 1, 8, and 15. All other patients received 1,000 mg per infusion.
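
For readers who want the dosing rule at a glance, the scheme reduces to a lookup on baseline hemoglobin and body weight plus a per-infusion cap. The following is a minimal sketch of that logic as described in the article; the function names and the way the cumulative dose is split across the weekly visits are illustrative assumptions, not the trial protocol.

```python
def fcm_total_dose_mg(hemoglobin_g_dl: float, weight_kg: float) -> int:
    """Cumulative ferric carboxymaltose dose as described in the article.

    Hb >= 10 g/dL: 1,000 mg (<70 kg) or 1,500 mg (>=70 kg)
    Hb 7-10 g/dL:  1,500 mg (<70 kg) or 2,000 mg (>=70 kg)
    """
    if hemoglobin_g_dl >= 10:
        return 1000 if weight_kg < 70 else 1500
    return 1500 if weight_kg < 70 else 2000


def fcm_infusion_plan(total_dose_mg: int, weight_kg: float) -> list[int]:
    """Split the cumulative dose into once-weekly infusions (days 1, 8, 15).

    Per the article, patients under 67 kg received at most 500 mg per
    infusion; all others received 1,000 mg per infusion. The splitting of a
    remainder into a smaller final infusion is an illustrative assumption.
    """
    per_infusion = 500 if weight_kg < 67 else 1000
    plan, remaining = [], total_dose_mg
    while remaining > 0:
        dose = min(per_infusion, remaining)
        plan.append(dose)
        remaining -= dose
    return plan


# Example: a 62-kg patient with a baseline hemoglobin of 8.5 g/dL
total = fcm_total_dose_mg(8.5, 62)          # 1,500 mg
print(total, fcm_infusion_plan(total, 62))  # 1500 [500, 500, 500] -> 3 visits
```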

The iron sucrose regimen (Venofer, also from Vifor Pharma) was calculated for each patient individually by using the standard Ganzoni formula with a target hemoglobin of 15 g/dL, and included up to 11 infusions of 200 mg over at least 30 minutes, given up to twice weekly.
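
By contrast, the comparator dose was individualized with the Ganzoni formula. In its commonly cited form, the total iron deficit equals body weight (kg) × (target Hb − actual Hb) (g/dL) × 2.4, plus an allowance for iron stores (typically 500 mg). The sketch below applies that form with the study's 15 g/dL target and divides the result into 200-mg infusions; the 500-mg depot term and the rounding are assumptions, since the article does not spell them out.

```python
import math


def ganzoni_iron_deficit_mg(weight_kg: float, actual_hb_g_dl: float,
                            target_hb_g_dl: float = 15.0,
                            iron_depot_mg: float = 500.0) -> float:
    """Commonly cited Ganzoni formula for total body iron deficit (mg).

    deficit = weight (kg) x (target Hb - actual Hb) (g/dL) x 2.4 + depot
    The 500-mg depot default is the usual convention, not a value from
    the article.
    """
    return weight_kg * (target_hb_g_dl - actual_hb_g_dl) * 2.4 + iron_depot_mg


def iron_sucrose_infusions(deficit_mg: float, per_infusion_mg: int = 200) -> int:
    """Number of 200-mg iron sucrose infusions needed to cover the deficit."""
    return math.ceil(deficit_mg / per_infusion_mg)


# Example: a 70-kg patient with a hemoglobin of 9 g/dL
deficit = ganzoni_iron_deficit_mg(70, 9)   # 70 x 6 x 2.4 + 500 = 1,508 mg
print(deficit, iron_sucrose_infusions(deficit))  # 1508.0, 8 infusions
```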

By week 12, an increase of hemoglobin concentration of greater than or equal to 2 g/dL was achieved in 66% of patients in the ferric carboxymaltose group, versus 54% receiving calculated iron sucrose (P = .004).

Moreover, normal hemoglobin values, defined as greater than or equal to 12 g/dL in women and 13 g/dL in men, were seen in 73% of ferric carboxymaltose patients, versus 62% of iron sucrose recipients (P = .015).
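
The two response definitions reported here reduce to simple checks on the week-12 hemoglobin value. A minimal illustrative sketch (the function and parameter names are assumptions):

```python
def hemoglobin_response(baseline_g_dl: float, week12_g_dl: float, sex: str) -> dict:
    """Classify week-12 response per the endpoints described in the article.

    'increase_ge_2' : hemoglobin rose by at least 2 g/dL from baseline
    'normalized'    : hemoglobin reached >= 12 g/dL (women) or >= 13 g/dL (men)
    """
    normal_cutoff = 12.0 if sex.upper() == "F" else 13.0
    return {
        "increase_ge_2": week12_g_dl - baseline_g_dl >= 2.0,
        "normalized": week12_g_dl >= normal_cutoff,
    }


# Example: a woman whose hemoglobin rose from 9.5 to 12.1 g/dL by week 12
print(hemoglobin_response(9.5, 12.1, "F"))  # both criteria met
```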

Finally, the researchers also noted that normal transferrin saturation (20%-50%), normal ferritin concentration (greater than or equal to 100 mcg/L), and normal hemoglobin combined with normal ferritin were achieved significantly more often by patients in the ferric carboxymaltose group.

In a cost-effectiveness analysis, Dr. Evstatiev and colleagues found that although the ferric carboxymaltose infusions cost nearly twice as much per infusion as iron sucrose, fewer infusions were needed, so "total treatment costs for [ferric carboxymaltose] over the whole study period were lower than those for [iron sucrose] (US$ 653 vs. US$ 891, respectively)."

Both treatment regimens were well tolerated, with the most common adverse events being nasopharyngitis and worsening of ulcerative colitis, reported the authors. No true hypersensitivity reactions were noted.

One of the authors was employed by Vifor Pharma, and others received speaker and consultancy fees from the company.

Article Source

FROM GASTROENTEROLOGY

Vitals

Major Finding: By week 12, a hemoglobin increase greater than or equal to 2 g/dL was achieved in 66% of patients in the ferric carboxymaltose group, versus 54% receiving calculated iron sucrose (P = .004).

Data Source: A randomized, controlled, open-label, multicenter study of patients with iron-deficiency anemia and mild to moderate inflammatory bowel disease.

Disclosures: Vifor Pharma, maker of the study drug and the comparison treatment, sponsored the study. One of the authors was employed by Vifor Pharma, and others received speaker and consultancy fees from the company.

Cancer Stage Not a Significant Factor in Radioiodine Use

Despite an overall increase in the use of radioactive iodine following total thyroidectomy for primary thyroid cancer, there are still significant variations in use among institutions and across demographically different populations.

Disease severity, however, does not appear to explain much of that variation, according to a study published online Aug. 16 in JAMA.

"The recent increase in the incidence of small, low-risk thyroid cancer mandates an understanding of patterns of care in thyroid cancer," wrote lead author Dr. Megan R. Haymart and her colleagues (JAMA 2011;306:721-8).

Moreover, "the significant between-hospital variation in radioactive iodine use suggests clinical uncertainty about the role of radioactive iodine in thyroid cancer management."

Dr. Haymart, of the University of Michigan, Ann Arbor, and her coauthors analyzed the cases of 189,219 patients with primary thyroid cancer who underwent total thyroidectomy between 1990 and 2008. Data were culled from the National Cancer Database, which captures close to 85% of all thyroid cancers in the United States, according to the investigators.

They found that in 1990, 1,373 of 3,397 patients with the diagnosis received radioactive iodine (40%).

By 2008, that number had jumped to 11,539 of 20,620 cases (56%) – a significant increase (P less than .001).

The authors then conducted a subgroup analysis involving 85,948 patients diagnosed between 2004 and 2008, in order to "define the most contemporary practice patterns." They found that "younger age and absence of comorbidity were associated with a small but significantly greater likelihood of receiving radioactive iodine after total thyroidectomy."

Younger patients (aged 44 years or younger) had an odds ratio of 2.15 for receiving the treatment compared with patients aged 60 years and older (95% confidence interval, 2.04-2.26).

Similarly, patients with a Charlson-Deyo comorbidity index score of 0 registered an OR of 1.19 for receiving radioactive iodine following thyroidectomy, compared with patients with scores of 2 or greater (95% CI, 1.07-1.35).

Factors significantly associated with a lower rate of radioactive iodine use were female sex (OR, 0.87; 95% CI, 0.84-0.91), African American race (OR, 0.83; 95% CI, 0.77-0.89), and the absence of private/government insurance (OR, 0.84; 95% CI, 0.81-0.88).

By comparison, disease severity appeared to play less of a role in treatment patterns. There was a significant difference in radioactive iodine use between American Joint Committee on Cancer (AJCC) stage I and stage IV disease (OR for stage I vs. stage IV, 0.34; 95% CI, 0.31-0.37). However, no difference in use existed between stage II and stage IV (OR for stage II vs. stage IV, 0.97; 95% CI, 0.88-1.07), nor was there a significant difference between stage III and stage IV (OR, 1.06; 95% CI, 0.95-1.17).

The number of cases of post-thyroidectomy thyroid cancers seen at a particular institution per year also affected the use of radioactive treatment. Compared with high-volume institutions, defined as treating 35 or more cases per year, there was significantly less use of radioactive iodine at low-volume centers, treating 6 or fewer cases per year (OR, 0.44; 95% CI, 0.33-0.58) and medium-volume centers, treating 7-11 cases per year (OR, 0.62; 95% CI, 0.48-0.80).
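
For context, the odds ratios above come from the study's statistical modeling, but the basic quantity can be illustrated with a crude (unadjusted) calculation from a 2×2 table, as in the sketch below. The counts are hypothetical and are not taken from the study.

```python
import math


def odds_ratio_with_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Crude odds ratio and Wald-type 95% CI from a 2x2 table.

    a = group 1, received RAI       b = group 1, did not receive RAI
    c = group 2, received RAI       d = group 2, did not receive RAI
    """
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    low = math.exp(math.log(odds_ratio) - z * se_log_or)
    high = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, (low, high)


# Hypothetical counts: 600 of 1,000 younger patients vs. 400 of 1,000 older
# patients received RAI, giving a crude OR of (600*600)/(400*400) = 2.25.
print(odds_ratio_with_ci(600, 400, 400, 600))
```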

According to Dr. Haymart and her colleagues, the conflicting use patterns are not easily explained, although some uncertainty may be due to a lack of clinical trials, as well as previous conflicting, single-institution studies. "Because of limited clinical evidence, clinical guidelines have left radioactive iodine use to physician discretion in many cases," they wrote.

"In the interest of curbing the increasing health care costs and preventing both overtreatment and undertreatment of disease, indications for radioactive iodine should be clearly defined and disease severity should become the primary driver of radioactive iodine use," they said.

The authors reported no potential conflicts of interest. The study was funded by a grant to Dr. Haymart from the National Institutes of Health.

Varying Use Patterns May Be Justified

While there does appear to be wide variation in use of radioactive iodine, the conclusion that this variation is inappropriate may not be accurate, wrote Dr. Edward H. Livingston and Dr. Robert A. McNutt in an accompanying editorial.

"There is incomplete knowledge about how and why care was delivered in hospitals showing variation," they wrote.

"If RAI [radioactive iodine] was not given to high-risk patients, the reasons it was not administered (such as patient preferences) are not captured in the database. If RAI was given to low-risk patients, subtle information regarding a clinician’s decision to administer RAI is not captured in these databases."

For example, "during total thyroidectomy, some surgeons leave a rim of thyroid tissue adjacent to nerves to minimize the risk of nerve injury and rely on RAI to ablate the residual thyroid tissue," they pointed out.

"This procedure is coded as a total thyroidectomy in an administrative database and appears in an analysis of that database to be associated with inappropriate administration of RAI."

Indeed, "without knowing if patients receiving RAI derived benefit or harm, it is difficult to conclude that RAI administration was appropriate or not," they added.

And while the study is telling, its usefulness in setting clinical guidelines is limited.

"For individual patients cared for by individual physicians, variation in care is sometimes desirable. One patient’s chronic illness is not another’s, and treating all patients the same would be clinical nonsense.

"Because of uncertainty in the integrity of most administrative databases and registries and the inherent limitation in the amount of information they contain about patient care, policy should only rarely be made based on findings from these sources," they added.

Dr. Livingston, of the University of Texas Southwestern Medical Center, Dallas, and Dr. McNutt, of the Rush University School of Medicine, Chicago, are both contributing editors to JAMA. Both stated that they had no conflicts of interest related to the editorial (JAMA 2011;306:762-3).

Article Source

FROM JAMA

Vitals

Major Finding: Patient demographics and a hospital’s annual volume of thyroidectomy procedures – and not disease severity – are some of the biggest predictors of radioactive iodine use following total thyroidectomy. Factors significantly associated with a lower rate of radioactive iodine use were female sex (OR, 0.87; 95% CI, 0.84-0.91), African American race (OR, 0.83; 95% CI, 0.77-0.89), and the absence of private/government insurance (OR, 0.84; 95% CI, 0.81-0.88).

Data Source: A study of 189,219 patients registered in the National Cancer Database with primary thyroid cancer who underwent total thyroidectomy in the United States between 1990 and 2008.

Disclosures: The authors reported no potential conflicts of interest. The study was funded by a grant to Dr. Haymart from the National Institutes of Health.

Rivaroxaban Contests Warfarin for Stroke Prevention in AF

A once-daily, oral, fixed dose of the Factor Xa inhibitor rivaroxaban was noninferior to warfarin for the prevention of stroke in patients with atrial fibrillation in a randomized, double-blind trial.

Moreover, although rates of bleeding were similar between the two groups of patients, "bleeding that proved fatal or involved a critical anatomical site occurred less frequently in the rivaroxaban group," Dr. Manesh R. Patel of Duke University, Durham, N.C., and his coauthors wrote online Aug. 10 in the New England Journal of Medicine.

"In contrast, bleeding from gastrointestinal sites, including upper, lower, and rectal sites, occurred more frequently in the rivaroxaban group."

In July, the U.S. Food and Drug Administration approved rivaroxaban (Xarelto) for the prevention of deep vein thrombosis.

The investigators enrolled 14,264 patients with nonvalvular atrial fibrillation (AF) and an elevated stroke risk from 1,178 sites in 45 countries. An "elevated risk" was indicated by a history of previous stroke, transient ischemic attack, or systemic embolism. Johnson & Johnson and Bayer Healthcare, the U.S. and European marketers of the drug, respectively, sponsored the study, called ROCKET AF (Rivaroxaban Once Daily Oral Direct Factor Xa Inhibition Compared with Vitamin K Antagonism for Prevention of Stroke and Embolism Trial in Atrial Fibrillation).

Patients also were included if they had at least two of several risk factors, including heart failure, an ejection fraction of less than 35%, hypertension, age older than 75 years, or diabetes mellitus.

The median age of participants was 73 years, and nearly 40% were women. Both groups were statistically similar or identical in terms of body mass index, median systolic and diastolic blood pressure, type of AF (persistent, paroxysmal, or new-onset), and other related factors.

Patients were randomly assigned to receive either a once-daily, 20-mg dose of rivaroxaban or a standard, adjusted dose of warfarin, with a target international normalized ratio (INR) of 2-3. Patients in the rivaroxaban group who had a creatinine clearance of 30-49 mL/min received a 15-mg dose. All patients also received a placebo tablet (N. Engl. J. Med. 2011 Aug. 10 [doi:10.1056/NEJMoa1009638]).
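
The renal adjustment described above amounts to a single threshold on creatinine clearance. Below is a minimal sketch of that rule as reported; how values below 30 mL/min should be handled is not described in this article, so the error raised there is an illustrative assumption.

```python
def rivaroxaban_daily_dose_mg(creatinine_clearance_ml_min: float) -> int:
    """Once-daily rivaroxaban dose as described for the trial.

    20 mg for CrCl >= 50 mL/min; 15 mg for CrCl 30-49 mL/min. Lower values
    fall outside the rule reported in the article, so the sketch raises an
    error there (an illustrative choice, not something stated in the study).
    """
    if creatinine_clearance_ml_min >= 50:
        return 20
    if creatinine_clearance_ml_min >= 30:
        return 15
    raise ValueError("CrCl < 30 mL/min is outside the reported dosing rule")


print(rivaroxaban_daily_dose_mg(72))  # 20
print(rivaroxaban_daily_dose_mg(40))  # 15
```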

The patients underwent treatment for a median of 590 days and had a median follow-up period of 707 days. Among the patients who received at least one dose of a study drug and had no major protocol violation, the rate of stroke or embolism was 1.7% per year for rivaroxaban, compared with 2.2% per year for warfarin. Overall, stroke or embolism occurred in 188 rivaroxaban patients and 241 warfarin patients, for a hazard ratio of 0.79 among rivaroxaban recipients (95% confidence interval, 0.66-0.96, P < .001 for noninferiority).
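
The "percent per year" figures are event rates per 100 patient-years of follow-up. The sketch below shows that arithmetic only; the person-time denominator is hypothetical, since the article does not report it.

```python
def events_per_100_patient_years(events: int, total_patient_years: float) -> float:
    """Event rate expressed per 100 patient-years of follow-up."""
    return 100 * events / total_patient_years


# Hypothetical denominator: 188 events over roughly 11,000 patient-years
# of follow-up corresponds to about 1.7 events per 100 patient-years.
print(round(events_per_100_patient_years(188, 11000), 2))
```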

The rate of "major" bleeding was similar between the rivaroxaban and warfarin groups (3.6% vs. 3.4%, respectively). Major bleeding was defined as "clinically overt" bleeding associated with fatal outcome; involvement with a "critical" site including intracranial, spinal, ocular, pericardial, articular, retroperitoneal, or intramuscular areas; a drop in hemoglobin of 2 g/dL or greater; transfusion of multiple units of blood; or permanent disability.

Rates of "clinically relevant non-major bleeding" also were similar, occurring in 1,475 rivaroxaban patients and 1,449 warfarin patients (14.9% vs. 14.5%, respectively; rivaroxaban hazard ratio of 1.03; P = .44). Non-major events included overt bleeds not meeting major criteria, but requiring physician intervention.

However, rivaroxaban users were more likely to experience drops in hemoglobin of 2 g/dL or more, compared with warfarin users (2.8% vs. 2.3%, respectively; P = .02), as well as gastrointestinal bleeding (3.2% vs. 2.2%, respectively; P less than .001).

On the other hand, warfarin-treated patients more frequently experienced fatal bleeding (0.2% for rivaroxaban vs. 0.5% for warfarin; P = .003) and intracranial bleeding (0.5% vs. 0.7%; P = .02). The target INR was maintained only 55% of the time in patients who received warfarin, the authors noted.

In addition to the study funding, Dr. Patel and several other authors of the study disclosed numerous financial relationships to pharmaceutical companies, including the makers of rivaroxaban; several authors were also employees of Johnson & Johnson as well as Bayer Healthcare.

'Alternatives Have Arrived'

"For the management of atrial fibrillation, oral alternatives to warfarin have arrived," wrote Dr. Gregory J. del Zoppo and Misha Eliasziw, Ph.D., in an editorial accompanying the study.

In addition to the current study of rivaroxaban, they also cited the Randomized Evaluation of Long-term Anticoagulant Therapy (RE-LY) study, which showed that "dabigatran, a direct thrombin inhibitor, was not inferior to warfarin."

"Both studies provide some points to ponder for a condition in which placebo controlled trials are no longer possible," they wrote (N. Engl. J. Med. 2011 Aug. 10 [doi:10.1056/NEJMe11o7516]).

Not the least of these is the "interesting" finding of a reduced rate of intracranial hemorrhage with both new agents, compared with warfarin.

"The reasons for the potential reduction ... are not clear, but one possibility is the effect on a single target in the hemostatic system by the new antithrombolytic agents versus the multiple targets by warfarin," they wrote.

"More intriguing is the possibility that cerebral vascular beds have protective features that are more apparent at the doses of either of the new agents tested."

"Fundamental studies of cerebral vascular responses to these agents ... would be instructive."

Although the new agents’ "simplicity of use" remains attractive, they wrote, it is important to note that these clinical trials do not address the "absence of antidotes to rapidly reverse the anticoagulation effects of either rivaroxaban or dabigatran in the case of life-threatening hemorrhage or surgery."

Dr. del Zoppo is with the division of hematology at the University of Washington, Seattle, and disclosed no conflicts of interest. Dr. Eliasziw is with the department of community health sciences at the University of Calgary (Alta.), and disclosed previous grant support from the National Institutes of Health.


The rate of "major" bleeding was similar between the rivaroxaban and warfarin groups (3.6% vs. 3.4%, respectively). Major bleeding was defined as "clinically overt" bleeding associated with fatal outcome; involvement with a "critical" site including intracranial, spinal, ocular, pericardial, articular, retroperitoneal, or intramuscular areas; a drop in hemoglobin of 2 g/dL or greater; transfusion of multiple units of blood; or permanent disability.

Rates of "clinically relevant non-major bleeding" also were similar, occurring in 1,475 rivaroxaban patients and 1,449 warfarin patients (14.9% vs. 14.5%, respectively; rivaroxaban hazard ratio of 1.03; P = .44). Non-major events included overt bleeds not meeting major criteria, but requiring physician intervention.

However, rivaroxaban users were more likely to experience drops in hemoglobin of 2 g/dL or more, compared with warfarin users (2.8% vs. 2.3%, respectively; P = .02), as well as gastrointestinal bleeding (3.2% vs. 2.2%, respectively; P less than .001).

On the other hand, warfarin-treated patients more frequently experienced fatal bleeding (0.2% for rivaroxaban versus 0.5% for warfarin, P = .003) and intracranial bleeding (0.5% for rivaroxaban, versus 0.7% for warfarin, P =0.02). The target INR was maintained only 55% of the time in patients who received warfarin, the authors noted.

In addition to the study funding, Dr. Patel and several other authors of the study disclosed numerous financial relationships to pharmaceutical companies, including the makers of rivaroxaban; several authors were also employees of Johnson & Johnson as well as Bayer Healthcare.

A once-daily, oral, fixed dose of the Factor Xa inhibitor rivaroxaban was noninferior to warfarin for the prevention of stroke in patients with atrial fibrillation in a randomized, double-blind trial.

Moreover, although rates of bleeding were similar between the two groups of patients, "bleeding that proved fatal or involved a critical anatomical site occurred less frequently in the rivaroxaban group," Dr. Manesh R. Patel of Duke University, Durham, N.C., and his coauthors wrote online Aug. 10 in the New England Journal of Medicine.

"In contrast, bleeding from gastrointestinal sites, including upper, lower, and rectal sites, occurred more frequently in the rivaroxaban group."

In July, the U.S. Food and Drug Administration approved rivaroxaban (Xarelto) for the prevention of deep vein thrombosis.

The investigators enrolled 14,264 patients with elevated stroke risk from 1,178 sites in 45 countries with nonvalvular atrial fibrillation (AF). An "elevated risk" was indicated by a previous stroke history, transient ischemic attack, or systemic embolism. Johnson & Johnson and Bayer Healthcare, the U.S. and European marketers of the drug, respectively, sponsored the study, called ROCKET AF (Rivaroxaban One Daily Oral Direct Factor Xa Inhibition Compared with Vitamin K Antagonism for Prevention of Stroke and Embolism Trial in Atrial Fibrillation).

Patients also were included if they had at least two of several risk factors, including heart failure, an ejection fraction of less than 35%, hypertension, age older than 75 years, or diabetes mellitus.

The median age of participants was 73 years, of whom nearly 40% were women. Both groups were statistically similar or identical in terms of body mass index, median systolic and diastolic blood pressure, type of AF (persistent, paroxysmal, or new-onset), and other related factors.

Patients were randomly assigned to receive either a once-daily, 20-mg dose of rivaroxaban or a standard, adjusted dose of warfarin, with a target international normalized ratio (INR) of 2-3. Patients in the rivaroxaban group who had a creatinine clearance of 30-49 mL/min received a 15-mg dose. All patients also received a placebo tablet (N. Engl. J. Med. 2011 Aug. 10 [doi:10.1056/NEJMoa1009638]).

The patients underwent treatment for a median of 590 days and had a median follow-up period of 707 days. Among the patients who received at least one dose of a study drug and had no major protocol violation, the rate of stroke or embolism was 1.7% per year for rivaroxaban, compared with 2.2% per year for warfarin. Overall, stroke or embolism occurred in 188 rivaroxaban patients and 241 warfarin patients, for a hazard ratio of 0.79 among rivaroxaban recipients (95% confidence interval, 0.66-0.96, P < .001 for noninferiority).

The rate of "major" bleeding was similar between the rivaroxaban and warfarin groups (3.6% vs. 3.4%, respectively). Major bleeding was defined as "clinically overt" bleeding associated with fatal outcome; involvement with a "critical" site including intracranial, spinal, ocular, pericardial, articular, retroperitoneal, or intramuscular areas; a drop in hemoglobin of 2 g/dL or greater; transfusion of multiple units of blood; or permanent disability.

Rates of "clinically relevant non-major bleeding" also were similar, occurring in 1,475 rivaroxaban patients and 1,449 warfarin patients (14.9% vs. 14.5%, respectively; rivaroxaban hazard ratio of 1.03; P = .44). Non-major events included overt bleeds not meeting major criteria, but requiring physician intervention.

However, rivaroxaban users were more likely to experience drops in hemoglobin of 2 g/dL or more, compared with warfarin users (2.8% vs. 2.3%, respectively; P = .02), as well as gastrointestinal bleeding (3.2% vs. 2.2%, respectively; P less than .001).

On the other hand, warfarin-treated patients more frequently experienced fatal bleeding (0.2% for rivaroxaban versus 0.5% for warfarin, P = .003) and intracranial bleeding (0.5% for rivaroxaban, versus 0.7% for warfarin, P =0.02). The target INR was maintained only 55% of the time in patients who received warfarin, the authors noted.

In addition to the study funding, Dr. Patel and several other authors of the study disclosed numerous financial relationships to pharmaceutical companies, including the makers of rivaroxaban; several authors were also employees of Johnson & Johnson as well as Bayer Healthcare.

Publications
Publications
Topics
Article Type
Display Headline
Rivaroxaban Contests Warfarin for Stroke Prevention in AF
Display Headline
Rivaroxaban Contests Warfarin for Stroke Prevention in AF
Legacy Keywords
Factor Xa inhibitor, rivaroxaban, warfarin, stroke prevention, atrial fibrillation, AF, Dr. Manesh R. Patel, the New England Journal of Medicine, bleeding, gastrointestinal sites, Xarelto, ROCKET AF,

Legacy Keywords
Factor Xa inhibitor, rivaroxaban, warfarin, stroke prevention, atrial fibrillation, AF, Dr. Manesh R. Patel, the New England Journal of Medicine, bleeding, gastrointestinal sites, Xarelto, ROCKET AF,

Article Source

FROM NEW ENGLAND JOURNAL OF MEDICINE

PURLs Copyright

Inside the Article

Vitals

Major Finding: The rate of stroke or embolism was 1.7% per year for rivaroxaban, compared with 2.2% per year for warfarin, for a hazard ratio of 0.79 among rivaroxaban recipients (95% confidence interval, 0.66-0.96, P < .001 for noninferiority).

Data Source: ROCKET AF, a randomized, double-blind trial of 14,264 patients with nonvalvular AF who were at high risk for stroke.

Disclosures: The study was sponsored by Johnson & Johnson and Bayer Healthcare, the U.S. and European marketers of rivaroxaban, respectively. Dr. Patel and several other authors on this study disclosed numerous financial relationships to pharmaceutical companies, including the makers of rivaroxaban; several authors were also employees of Johnson & Johnson and Bayer Healthcare.

Rivaroxaban Contests Warfarin for Stroke Prevention in AF

Article Type
Changed
Fri, 12/07/2018 - 14:13
Display Headline
Rivaroxaban Contests Warfarin for Stroke Prevention in AF

A once-daily, oral, fixed dose of the Factor Xa inhibitor rivaroxaban was noninferior to warfarin for the prevention of stroke in patients with atrial fibrillation in a randomized, double-blind trial.

Moreover, although rates of bleeding were similar between the two groups of patients, "bleeding that proved fatal or involved a critical anatomical site occurred less frequently in the rivaroxaban group," Dr. Manesh R. Patel of Duke University, Durham, N.C., and his coauthors wrote online Aug. 10 in the New England Journal of Medicine.

"In contrast, bleeding from gastrointestinal sites, including upper, lower, and rectal sites, occurred more frequently in the rivaroxaban group."

In July, the U.S. Food and Drug Administration approved rivaroxaban (Xarelto) for the prevention of deep vein thrombosis.

The investigators enrolled 14,264 patients with nonvalvular atrial fibrillation (AF) and an elevated stroke risk from 1,178 sites in 45 countries. An "elevated risk" was defined as a previous stroke, transient ischemic attack, or systemic embolism. Johnson & Johnson and Bayer Healthcare, the U.S. and European marketers of the drug, respectively, sponsored the study, called ROCKET AF (Rivaroxaban Once Daily Oral Direct Factor Xa Inhibition Compared with Vitamin K Antagonism for Prevention of Stroke and Embolism Trial in Atrial Fibrillation).

Patients also were included if they had at least two of several risk factors, including heart failure, an ejection fraction of less than 35%, hypertension, age older than 75 years, or diabetes mellitus.

The median age of participants was 73 years, and nearly 40% were women. Both groups were statistically similar or identical in terms of body mass index, median systolic and diastolic blood pressure, type of AF (persistent, paroxysmal, or new-onset), and other related factors.

Patients were randomly assigned to receive either a once-daily, 20-mg dose of rivaroxaban or a standard, adjusted dose of warfarin, with a target international normalized ratio (INR) of 2-3. Patients in the rivaroxaban group who had a creatinine clearance of 30-49 mL/min received a 15-mg dose. All patients also received a placebo tablet (N. Engl. J. Med. 2011 Aug. 10 [doi:10.1056/NEJMoa1009638]).
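
The report does not say how creatinine clearance was estimated; in practice it is commonly calculated with the Cockcroft-Gault equation. As a purely hypothetical illustration (the formula and the example patient, a 75-year-old, 70-kg man with a serum creatinine of 1.4 mg/dL, are context supplied here, not details from the study):

\[ \mathrm{CrCl} = \frac{(140 - \text{age}) \times \text{weight (kg)}}{72 \times \mathrm{SCr}\ (\text{mg/dL})} = \frac{(140 - 75) \times 70}{72 \times 1.4} \approx 45\ \text{mL/min} \]

A value in the 30-49 mL/min range, as in this example, would have called for the reduced 15-mg dose.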

The patients underwent treatment for a median of 590 days and had a median follow-up period of 707 days. Among the patients who received at least one dose of a study drug and had no major protocol violation, the rate of stroke or embolism was 1.7% per year for rivaroxaban, compared with 2.2% per year for warfarin. Overall, stroke or embolism occurred in 188 rivaroxaban patients and 241 warfarin patients, for a hazard ratio of 0.79 among rivaroxaban recipients (95% confidence interval, 0.66-0.96, P < .001 for noninferiority).
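
As a rough, back-of-the-envelope check (an approximation only; the published hazard ratio comes from the trial's time-to-event analysis, not from simple division of the annualized rates):

\[ \frac{1.7\%\ \text{per year}}{2.2\%\ \text{per year}} \approx 0.77, \quad \text{close to the reported hazard ratio of } 0.79 \]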

The rate of "major" bleeding was similar between the rivaroxaban and warfarin groups (3.6% vs. 3.4%, respectively). Major bleeding was defined as "clinically overt" bleeding associated with at least one of the following: a fatal outcome; involvement of a "critical" site, including intracranial, spinal, ocular, pericardial, articular, retroperitoneal, or intramuscular areas; a drop in hemoglobin of 2 g/dL or greater; transfusion of multiple units of blood; or permanent disability.

Rates of "clinically relevant non-major bleeding" also were similar, occurring in 1,475 rivaroxaban patients and 1,449 warfarin patients (14.9% vs. 14.5%, respectively; rivaroxaban hazard ratio of 1.03; P = .44). Non-major events included overt bleeds not meeting major criteria, but requiring physician intervention.

However, rivaroxaban users were more likely to experience drops in hemoglobin of 2 g/dL or more, compared with warfarin users (2.8% vs. 2.3%, respectively; P = .02), as well as gastrointestinal bleeding (3.2% vs. 2.2%, respectively; P less than .001).

On the other hand, warfarin-treated patients more frequently experienced fatal bleeding (0.2% for rivaroxaban vs. 0.5% for warfarin; P = .003) and intracranial bleeding (0.5% vs. 0.7%; P = .02). The target INR was maintained only 55% of the time in patients who received warfarin, the authors noted.

In addition to the study funding, Dr. Patel and several other authors of the study disclosed numerous financial relationships to pharmaceutical companies, including the makers of rivaroxaban; several authors were also employees of Johnson & Johnson as well as Bayer Healthcare.

'Alternatives Have Arrived'

"For the management of atrial fibrillation, oral alternatives to warfarin have arrived," wrote Dr. Gregory J. del Zoppo and Misha Eliasziw, Ph.D., in an editorial accompanying the study.

In addition to the current study of rivaroxaban, they also cited the Randomized Evaluation of Long-term Anticoagulant Therapy (RE-LY) study, which showed that "dabigatran, a direct thrombin inhibitor, was not inferior to warfarin."

"Both studies provide some points to ponder for a condition in which placebo controlled trials are no longer possible," they wrote (N. Engl. J. Med. 2011 Aug. 10 [doi:10.1056/NEJMe1107516]).

Not the least of these is the "interesting" finding of a reduced rate of intracranial hemorrhage with both new agents, compared with warfarin.

"The reasons for the potential reduction ... are not clear, but one possibility is the effect on a single target in the hemostatic system by the new antithrombotic agents versus the multiple targets by warfarin," they wrote.

"More intriguing is the possibility that cerebral vascular beds have protective features that are more apparent at the doses of either of the new agents tested."

"Fundamental studies of cerebral vascular responses to these agents ... would be instructive."

Although the new agents’ "simplicity of use" remains attractive, they cautioned that these clinical trials do not address the "absence of antidotes to rapidly reverse the anticoagulation effects of either rivaroxaban or dabigatran in the case of life-threatening hemorrhage or surgery."

Dr. del Zoppo is with the division of hematology at the University of Washington, Seattle, and disclosed no conflicts of interest. Dr. Eliasziw is with the department of community health sciences at the University of Calgary (Alta.), and disclosed previous grant support from the National Institutes of Health.

Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE

Vitals

Major Finding: The rate of stroke or embolism was 1.7% per year for rivaroxaban, compared with 2.2% per year for warfarin, for a hazard ratio of 0.79 among rivaroxaban recipients (95% confidence interval, 0.66-0.96, P < .001 for noninferiority).

Data Source: ROCKET AF, a randomized, double-blind trial of 14,264 patients with nonvalvular AF who were at high risk for stroke.

Disclosures: The study was sponsored by Johnson & Johnson and Bayer Healthcare, the U.S. and European marketers of rivaroxaban, respectively. Dr. Patel and several other authors on this study disclosed numerous financial relationships to pharmaceutical companies, including the makers of rivaroxaban; several authors were also employees of Johnson & Johnson and Bayer Healthcare.

Platinum-Based Chemotherapy Benefits Elderly Lung Cancer Patients

Article Type
Changed
Fri, 01/04/2019 - 11:44
Display Headline
Platinum-Based Chemotherapy Benefits Elderly Lung Cancer Patients

Chemotherapy with a platinum-based doublet was associated with a highly significant 36% reduction in mortality, compared with monotherapy, among elderly patients with non–small cell lung cancer in a study published online Aug. 9 by the Lancet.

The combination of carboplatin and paclitaxel was associated with more toxicity than single-agent vinorelbine or gemcitabine in the phase III trial, but the investigators contended that the survival benefit outweighed this drawback.

Median overall survival for patients receiving carboplatin plus paclitaxel was 10.3 months, compared with 6.2 months in those randomized to monotherapy (hazard ratio, 0.64; P less than .0001).
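
The 36% reduction in mortality cited above corresponds to this hazard ratio; as a simple illustration, taking the relative risk reduction to be 1 minus the hazard ratio:

\[ \text{relative risk reduction} = 1 - \mathrm{HR} = 1 - 0.64 = 0.36 = 36\% \]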

Although several guidelines currently recommend monotherapy for elderly patients, the current finding is "of such magnitude that we believe the treatment paradigm for elderly patients with advanced NSCLC should be reconsidered," wrote Dr. Elisabeth Quoix of the Hôpitaux Universitaires de Strasbourg (France) and colleagues (Lancet 2011 Aug. 9 [doi:10.1016/S0140-6736(11)60780-0]).

The investigators from the IFCT (Intergroupe Francophone de Cancérologie Thoracique) looked at 451 patients aged 70-89 years (median age, 77 years) with unresectable stage IV NSCLC or stage III disease that was "unsuitable" for radical radiation therapy. Patients were followed for a median of 30.3 months. To be included in the study, patients had to have a score of 2 or lower on the World Health Organization performance status scale (on which 0 indicates fully active and higher scores indicate greater disability) and a life expectancy of at least 12 weeks.

The 225 patients who were randomized to the doublet chemotherapy group received intravenous carboplatin (AUC [area under the curve] = 6) on day 1, plus 90 mg/m² of paclitaxel on days 1, 8, and 15 of 28-day cycles.

The 226-patient monotherapy cohort received 25 mg/m² vinorelbine (62 patients) on days 1 and 8 – or 1,150 mg/m² gemcitabine (164 patients) on days 1 and 8 – of 21-day cycles, with the choice of either vinorelbine or gemcitabine being made by the institution conducting the therapy.
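
For context, mg/m² doses are scaled to body surface area, and carboplatin dosed to a target AUC is conventionally calculated with the Calvert formula; neither calculation is spelled out in the report, and the body surface area of 1.7 m² and glomerular filtration rate of 60 mL/min below are hypothetical values chosen only for illustration:

\[ \text{paclitaxel dose} = 90\ \text{mg/m}^2 \times 1.7\ \text{m}^2 \approx 153\ \text{mg per weekly infusion} \]
\[ \text{carboplatin dose} = \mathrm{AUC} \times (\mathrm{GFR} + 25) = 6 \times (60 + 25) = 510\ \text{mg} \]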

Doublet chemotherapy cycles thus lasted 4 weeks (3 weeks of treatment plus 1 week off), with a maximum of four cycles; monotherapy cycles lasted 3 weeks (2 weeks of treatment plus 1 week off), with a maximum of five cycles.

Survival Rates for Chemotherapy Study Patients

By 1 year, the survival rate was 25.4% in the monotherapy group vs. 44.5% in the doublet therapy group (HR, 0.64; P less than .0001). The trend persisted at 2 years, with the probability of survival being 11.7% in monotherapy recipients and 22.4% in those receiving doublet therapy, wrote the authors.

Median progression-free survival also was significantly longer with the doublet (6 months vs. 2.8 months; P less than .0001).

In 2009, an independent data-monitoring committee recommended stopping recruitment based on the second interim analysis.

"Grade 3-4 neutropenia, febrile neutropenia, thrombopenia, and anemia were significantly more frequent among patients in the doublet chemotherapy group than among those in the monotherapy group, as was grade 3-4 sensory neuropathy," wrote the authors. The protocol did not allow growth factor support in the first cycle, but it was authorized as secondary prophylaxis in patients who developed grade 3 or 4 neutropenia.

In all, 10 deaths in the doublet therapy group (4.4%) and 3 in the monotherapy group (1.3%) were related to treatment: Culprit diagnoses included sepsis, respiratory distress, and diarrhea related to renal insufficiency. But the percentage of deaths in the first 3 months after the start of therapy was "markedly lower" in patients who received carboplatin plus paclitaxel.

Although quality of life scores at week 18 were similar between groups, the authors pointed out that "role functioning and fatigue were worse in the doublet chemotherapy group than in the monotherapy group" (P = .026 and .039, respectively). Full quality of life data will be published separately, they said.

The 2009 American Society of Clinical Oncology guidelines (J. Clin. Oncol. 2009;27:6251-66) recommend that age "not be used as a criterion in the decision-making process about whether to treat a patient" and call for further research devoted to elderly patients, according to Dr. Quoix and colleagues. "Conversely," they noted, "the European Organisation for Research and Treatment of Cancer Elderly Task Force and Lung Cancer Group and International Society of Geriatric Oncology highlighted in 2010 that monotherapy should be given to elderly patients with advanced NSCLC."

"We believe that monthly carboplatin and weekly paclitaxel is a feasible option for first-line therapy of advanced NSCLC in patients older than 70 years with performance status scores of 0-2," the authors wrote.

The study was funded by the IFCT and the French National Cancer Institute, with additional grant support from Bristol-Myers Squibb, Roche, and Pierre Fabre. Several authors, including Dr. Quoix, disclosed financial relationships with the makers of chemotherapy drugs, including carboplatin (Bristol-Myers Squibb, Roche, and Lilly). Dr. Reckamp disclosed consulting for Amgen, Genentech, and Tragara Pharmaceuticals, as well as serving on speakers bureaus for Lilly Oncology and Genentech.

More Studies Needed in Elderly

In an editorial accompanying the study, Dr. Karen L. Reckamp wrote that although older patients dominate the lung cancer population, they continue to be underrepresented in clinical trials. Patients aged 70 years or older account for 47% of U.S. adults with cancer but constitute just 13% of patients enrolled in clinical trials, according to the Southern Italy Cooperative Oncology Group (SICOG).

Nevertheless, there are some studies that provide guidance, she wrote.

A study by the SICOG compared vinorelbine alone with vinorelbine plus gemcitabine in patients aged 70 years and older (J. Clin. Oncol. 2000;18:2529-36). "Combination chemotherapy resulted in a significantly lower risk of death," she wrote. "Adverse events were greater in the combination group, but patients had a delay" in quality of life deterioration.

A second investigation, MILES (Multicenter Italian Lung Cancer in the Elderly Study), compared vinorelbine or gemcitabine alone or in combination. "Combination therapy did not improve overall survival and was more toxic than was either single-agent regimen," wrote Dr. Reckamp.

A third study from Japan was stopped because of futility when the interim analysis showed inferior survival with increased toxic effects in patients who were treated with weekly cisplatin and docetaxel vs. docetaxel every 3 weeks (J. Clin. Oncol. 2011;29:abstract 7509).

"The appropriate assessment to predict efficacy and toxic effects of therapy has not yet been identified," said Dr. Reckamp. Because previous trials assessed multiple regimens with fractionated doses and non–platinum-based doublets, "the optimum chemotherapy regimen remains unknown," she added, concluding that "additional studies are needed that enroll adequate numbers of older adults, and include a comprehensive geriatric assessment to provide the knowledge required to properly assess the risk-benefit ratio in treatment decisions, so that a personalized approach can be taken."

Dr. Reckamp is at the City of Hope Comprehensive Cancer Center in Duarte, Calif. She disclosed consulting for Amgen, Genentech, and Tragara Pharmaceuticals, as well as serving on speakers bureaus for Lilly Oncology and Genentech. These remarks were adapted from an editorial that accompanied the study (Lancet 2011 Aug. 9 [doi:10.1016/S0140-6736(11)61259-2]).

Article Source

FROM THE LANCET

Vitals

Major Finding: The 1-year survival rate was 25.4% with monotherapy, vs. 44.5% with a carboplatin and paclitaxel doublet.

Data Source: A multicenter, open-label, phase III, randomized trial in NSCLC patients aged 70-89 years.

Disclosures: The study was funded by the IFCT and the French National Cancer Institute, with support by grants from Bristol-Myers Squibb, Roche, and Pierre Fabre. Several authors, including Dr. Quoix, disclosed financial relationships with the makers of chemotherapy drugs, including carboplatin (Bristol-Myers Squibb, Roche, and Lilly).

Pancreatectomy Raised Quality of Life in Pediatric Chronic Pancreatitis

Article Type
Changed
Fri, 01/18/2019 - 11:15
Display Headline
Pancreatectomy Raised Quality of Life in Pediatric Chronic Pancreatitis

Total pancreatectomy with islet autotransplant in pediatric chronic pancreatitis significantly improves quality of life and largely obviates the need for narcotics post procedure, according to a report by Dr. Melena D. Bellin and colleagues in the September issue of Clinical Gastroenterology and Hepatology.

"This procedure should be considered in children with [chronic pancreatitis] when medical and endoscopic modalities have failed," and may be a better alternative to the current surgical standard of care – partial resection and drainage, wrote the authors.

Dr. Bellin, of the endocrinology division in the department of pediatrics at the University of Minnesota, Minneapolis, studied 19 consecutive children aged 5-18 years who underwent total pancreatectomy with islet autotransplant into the portal vein during 2006-2009 at her institution (Clin. Gastroenterol. Hepatol. 2011 September [doi:10.1016/j.cgh.2011.04.024]).

According to the authors, only three centers around the world have completed more than 50 of these procedures, with the bulk of the experience occurring in the adult population.

All patients had a diagnosis of chronic pancreatitis (CP) and had previously failed medical treatment, endoscopic treatment, or both.

With their parents’ help, patients completed the Medical Outcomes Study 36-item short form (SF-36) questionnaire at 1 week before surgery, at 3, 6, and 12 months after surgery, and then annually. Scores range from 0 to 100 across eight subscales, which in turn make up a Physical Component Summary (PCS) score and a Mental Component Summary (MCS) score; higher scores signify better health.

At baseline, all patients required narcotics, either on a daily basis (n = 13) or intermittently. All patients had also had multiple hospitalizations for pain management. Two were dependent on jejunal tube feedings and two on total parenteral nutrition. Seven patients also had undergone prior pancreatic surgery at outside institutions.

"Prior to surgery, all patients had below average HRQOL [health related quality of life] based on the SF-36, with a mean PCS score of 30 and a mean MCS score of 34," wrote Dr. Bellin. These scores were equivalent to 2 and 1.5 standard deviations, respectively, below the norm for the U.S. population.

By 1 year, wrote the authors, the PCS improved significantly, to a mean of 50 (P less than .001). Similarly, the MCS improved to a mean of 46, although the increase just missed statistical significance (P = .06). Both postsurgery scores were equivalent to normal HRQOL values in this population.

Looking at postprocedure narcotics use, the authors found that by 1 year, 14 patients had stopped using narcotics for pain management entirely.

"Of the remaining 5 patients, 2 reported rare narcotic use (a few times a year), 1 used tramadol, and 2 used daily narcotics at a reduced dose," they added.

After surgery, all of the patients received insulin initially, with a goal of weaning them off insulin if possible. At a mean of 18 months following the islet graft, seven patients were insulin independent, and four more were reporting minimal insulin use; all of them had hemoglobin A1c levels less than or equal to 6.5%.

However, the study showed that patients who had undergone prior drainage procedures were more likely to be insulin-dependent (P = .04) and to have variable HbA1c levels, perhaps necessitating "a paradigm shift in the current management of CP, with avoidance of partial resections without islet autotransplantation and of surgical drainage procedures," recommended Dr. Bellin.

"Although optimal timing of surgery needs to be elucidated, for those who will go on to [total pancreatectomy with islet autotransplantation], earlier surgery may avoid progressive damage to the endocrine pancreas and the hyperalgesia associated with chronic narcotic use," concluded Dr. Bellin.

The authors reported no individual conflicts of interest related to this study, which was supported in part by the National Pancreas Foundation.

Article Source

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Vitals

Major Finding: In children who underwent total pancreatectomy, health-related quality of life scores normalized, and 14 of the 19 patients stopped using narcotics entirely by 1 year.

Data Source: A prospective study of outcomes after total pancreatectomy with islet autotransplant in 19 consecutive pediatric patients.

Disclosures: The authors reported no individual disclosures. The study was supported in part by the National Pancreas Foundation.

Response to Locoregional Embolization Predicts Survival in Liver Cancer

Article Type
Changed
Wed, 05/26/2021 - 14:05
Display Headline
Response to Locoregional Embolization Predicts Survival in Liver Cancer

Radiographic response to locoregional embolization of hepatocellular carcinoma predicted survival, Dr. Khairuddin Memon and colleagues reported in the August issue of Gastroenterology.

The finding is the first to substantiate the prognostic value of tumor response to chemoembolization and radioembolization in hepatocellular carcinoma (HCC), they wrote (Gastroenterology 2011 August [doi: 10.1053/j.gastro.2011.04.054]).

"Based on these findings, consideration should be made to develop treatments for HCC that not only prolong [time to progression], but also elicit radiographic tumor response," added Dr. Memon of the department of radiology at Northwestern Memorial Hospital in Chicago.

According to Dr. Memon, between 2000 and 2008, 463 patients with HCC were treated at the authors’ institution using transarterial locoregional therapies – either transarterial chemoembolization (TACE) or yttrium-90 radioembolization (Y). All of these patients had unresectable HCC and bilirubin levels less than 3.0 mg/dL.

For the present analysis, the authors subsequently excluded all patients who underwent transplant, exhibited portal venous thrombosis or extrahepatic metastases at baseline, or had a Child-Pugh score greater than B7.

Survival outcomes were analyzed for the remaining 159 patients with respect to their response to TACE or Y therapy. Most of the patients (74%) were male, and 40% were younger than 65 years of age.

Response status was assessed using both World Health Organization (WHO) criteria and European Association for Study of the Liver (EASL) guidelines. Patients underwent computed tomography or magnetic resonance imaging at 1 month after treatment and then at scheduled 2- to 3-month intervals.

The authors found that patients who were responders at the 6-month posttreatment landmark according to WHO criteria had an overall median survival of 31.6 months, compared with 13.7 months for 6-month WHO nonresponders (P = .069).

Judging by EASL guidelines, however, the difference reached significance: 6-month landmark EASL responders had a median overall survival of 24.6 months, versus 13.2 months for nonresponders (P = .002).

Highly significant differences were found when the participants were divided into two groups based on response or nonresponse at 12 months. By WHO criteria, median overall survival for responders at 12 months was 36.4 months, versus 11.1 months for nonresponders (P = .004). And by EASL standards, median survival was also 36.4 months for responders, versus 9.7 months for nonresponders (P less than .0001).

The authors also analyzed risk of death in the 6 months following each landmark. They found that WHO responders at the 6-month landmark had a 6.6% rate of death in this window, versus 15.5% among nonresponders – a nonsignificant difference (P = .707).

However, according to EASL standards, the rate of death in the 6 months following the 6-month landmark was 4.7% among responders versus 20.6% among nonresponders (P = .046).

Moreover, the death rate in the 6 months following the 12-month landmark was 0% for WHO responders, versus 31.5% for nonresponders (P = .010), and it was 4% and 32.7%, respectively, by EASL guidelines (P = .013).

The authors added that baseline tumor size was not a significant factor affecting survival in a univariate analysis.

"Our data show that responders by WHO/EASL criteria live longer than nonresponders from the landmark onwards, with the exception of WHO 6-month landmark (near significance); the trend is clear," wrote the authors.

Randomized controlled trials "will be required to validate this concept and establish radiographic response as a surrogate of the true end point (survival)."

Dr. Memon and his fellow researchers declared no conflicts of interest associated with this study.

Article Source

FROM GASTROENTEROLOGY

Vitals

Major Finding: Patients with hepatocellular carcinoma who showed radiographic response to locoregional tumor embolization at 6 months post treatment had a median overall survival of 24.6 months, versus 13.2 months for nonresponders, according to European Association for Study of the Liver guidelines (P = .002).

Data Source: A single-center, 9-year survival study of hepatocellular carcinoma patients who underwent locoregional embolization.

Disclosures: Dr. Memon and his colleagues declared no conflicts of interest associated with this study.

Comorbidities Often Preclude Transplants in Nonalcoholic Steatohepatitis

Article Type
Changed
Fri, 12/07/2018 - 14:11
Display Headline
Comorbidities Often Preclude Transplants in Nonalcoholic Steatohepatitis

Patients with nonalcoholic steatohepatitis who were deemed candidates for liver transplant were less likely to actually undergo the procedure than were candidates with hepatitis C, and more often died or were delisted for being too ill before transplant, wrote Dr. Jacqueline G. O’Leary and colleagues in the August issue of Clinical Gastroenterology and Hepatology.

Moreover, the study showed that the presence of comorbid conditions such as hypertension and obesity was the predominant reason for denial of liver transplant in nonalcoholic steatohepatitis (NASH).

The findings mean that "The primary focus of treatment in NASH/CC [cryptogenic cirrhosis] patients with a low MELD [model for end-stage liver disease] score needs to be aggressive treatment of their obesity, diabetes, lipid disorders, and hypertension so that they do not develop comorbid conditions that cause death or make them ineligible for transplant."

Dr. O’Leary, of the Annette C. and Harold C. Simmons Transplant Institute at the Baylor University Medical Center at Dallas, studied data from 415 patients with NASH and/or CC and 1,232 patients with hepatitis C virus (HCV)–associated cirrhosis who were evaluated for transplant at the Baylor facility (Clin. Gastroenterol. Hepatol. 2011 [doi:10.1016/j.cgh.2011.04.007]).

All NASH/CC patients denied for transplant were compared to all HCV patients denied for transplant. "When patients are denied for OLT [orthotopic liver transplantation], they are recorded as excluded for medical comorbidities, psychosocial reasons, adequate hepatic reserve, exceeding tumor criteria, or death," wrote the investigators. "However, the specific comorbidities and psychosocial reasons are not recorded in our database."

Overall, Dr. O’Leary found that 197 NASH/CC patients were denied for listing (47%), as were 586 HCV patients (48%).

In general, the NASH/CC patients with denials were older (median age 60 years vs. 51 years; P less than .001), more likely to be female (57% vs. 35%; P less than .001), heavier (body mass index greater than 30 kg/m2: 59% vs. 40%; P less than .001), and had a lower glomerular filtration rate (74 mL/min vs. 88 mL/min; P = .004), compared with the HCV patients who were denied.

According to the analysis, comorbid conditions were the most common reason for denial among the NASH/CC patients (72% vs. 27% in the HCV group), whereas among the HCV patients, ongoing psychosocial issues – including recidivism and lack of social support – precluded the greatest percentage of candidates from transplant (39% vs. 8% in the NASH/CC cohort).

Among all denied patients, liver disease severity was similar for the two groups, with MELD scores of 12 among NASH/CC patients versus 11 for HCV patients.

The authors also compared the 217 NASH/CC patients who were listed for transplant with the 645 HCV patients who were listed.

NASH/CC patients were heavier (BMI greater than 30: 54% vs. 42%; P = .004), and were more likely to have diabetes (55% vs. 22%; P less than .001) and hypertension (46% vs. 28%; P less than .001) than the HCV patients. Liver disease severity, as measured by median MELD score (14 for both) and Child-Turcotte-Pugh point system (7 for both), was the same.

However, the authors found that among those listed, patients with NASH/CC were significantly less likely to ultimately be transplanted than patients with HCV (48% vs. 62%; P less than .001).

"While listed, 22% of NASH/CC patients and 16% of HCV patients either died on the list or were delisted for being too ill," the authors explained

"Although some have suggested that NASH cirrhosis would overtake HCV as the main indication for [transplant] by 2020, this does not seem likely, since coincident comorbid conditions often found in NASH/CC patients may often preclude [transplant]," wrote the authors.

"In fact, our data confirm this suspicion; NASH/CC patients were almost twice as likely as HCV cirrhosis patients to be denied for listing because of comorbid conditions."

The authors reported no grant support and no relevant conflicts of interest.

Article Source

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Vitals

Major Finding: Patients with nonalcoholic steatohepatitis and/or cryptogenic cirrhosis who were deemed candidates for liver transplant were less likely to undergo the procedure than patients with hepatitis C who were on the transplant list (48% vs. 62%; P less than .001).

Data Source: A retrospective study of all patients older than 18 years who were referred for consideration of primary liver transplant at the Baylor Regional Transplant Institute from March 2002 to May 2008.

Disclosures: The authors reported no grant support and no relevant conflicts of interest.

Radiofrequency Ablation Works Well for Dysplastic Barrett's Esophagus

Article Type
Changed
Wed, 05/26/2021 - 14:05
Display Headline
Radiofrequency Ablation Works Well for Dysplastic Barrett's Esophagus

At 3 years, radiofrequency ablation for dysplastic Barrett’s esophagus resulted in complete eradication of dysplasia in 98% of patients, Dr. Nicholas J. Shaheen and colleagues reported in the August issue of Gastroenterology.

The finding represents the latest results from the Ablation of Intestinal Metaplasia Containing Dysplasia Study (AIM Dysplasia Study). The study’s 12-month results were published by Dr. Shaheen and colleagues in 2009 (N. Engl. J. Med. 2009;360:2277-88).

According to Dr. Shaheen, director of the center for esophageal diseases and swallowing at the University of North Carolina, Chapel Hill, the original study recruited 127 patients aged 18-80 years with nonnodular Barrett’s esophagus less than or equal to 8 cm in length.

Patients were randomized in a 2:1 ratio to either radiofrequency ablation (n = 84) or endoscopic surveillance – simple endoscopy with no treatment, the sham arm (n = 43). In the current follow-up, Dr. Shaheen and his colleagues reported on the durability of the treatment at 2 and 3 years post ablation (Gastroenterology 2011 August [doi:10.1053/j.gastro.2011.04.061]).

"After completion of the 12-month assessment, subjects in the sham arm were offered crossover to active (RFA) treatment," wrote the investigators. All 35 of the original 43 sham subjects who were eligible for crossover elected to receive treatment (4 had developed esophageal adenocarcinoma while in the sham arm prior to the 1 year end point, and 4 were lost to follow-up). All treated subjects were then followed for 2 years after receiving RFA therapy.

Overall, 119 subjects underwent ablation (84 at study outset plus 35 crossovers), with 106 completing at least the 2-year follow-up. At 2 years, 101 of 106 subjects (95%) had complete eradication of dysplasia, and 99 of 106 (93%) had complete eradication of intestinal metaplasia, the authors reported. However, over the mean follow-up period of 3.05 years, 25 subjects required an interim focal treatment of recurrent Barrett’s. Moreover, "in the most stringent analysis, if we include any subject who ever received any RFA and left the trial before the 2 year end point as a failure (n = 13), the response rates were 101/119 (85%)" for complete eradication of dysplasia and 99 of 119 (83%) for complete eradication of intestinal metaplasia, the authors added.

Among the 56 subjects for whom 3-year follow-up data were available, 55 (98%) had complete eradication of dysplasia and 51 (91%) had complete eradication of intestinal metaplasia. The annual rate of any neoplastic progression was 1.37% per patient-year, or 1/73 patient-years.

Four serious adverse events have occurred to date in the 119 subjects (3.4%): one upper gastrointestinal bleed in a patient on dual aspirin and clopidogrel therapy for heart disease; one episode of chest pain occurring 8 days following ablation; and two overnight hospitalizations for nausea and chest pain immediately following the procedure.

"The main adverse side effect was stricture occurrence, which occurred in 7.6% of subjects and was correctable with a mean of 2.8 dilation sessions," the investigators wrote. There were no perforations or procedure-related deaths.

Radiofrequency ablation in this study was accomplished with a device made by BÂRRX Medical, which supported both the 2009 study and the current follow-up. Several investigators also reported financial relationships with BÂRRX Medical, among other pharmaceutical makers. AstraZeneca supplied esomeprazole for the patients through the company’s investigator-sponsored study program.

Article Source

FROM GASTROENTEROLOGY

Vitals

Major Finding: At 2 years following radiofrequency ablation, 101 of 106 subjects (95%) with dysplastic Barrett’s esophagus had complete eradication of dysplasia, and 99 of 106 (93%) had complete eradication of intestinal metaplasia.

Data Source: A multicenter, randomized, sham-controlled trial of 127 patients with dysplastic Barrett’s esophagus.

Disclosures: Radiofrequency ablation in this study was accomplished with a device made by BÂRRX Medical, sponsor of both the 2009 study and the current follow-up. Several investigators reported financial relationships with BÂRRX Medical, among other pharmaceutical makers. AstraZeneca supplied daily esomeprazole doses for all patients in this study.