
Skip that repeat DXA scan in these postmenopausal women

ILLUSTRATIVE CASE

A 70-year-old White woman with a history of type 2 diabetes and a normal body mass index (BMI) presents to your office for a preventive care exam. She is otherwise doing well, without concerns. Her first dual-energy x-ray absorptiometry (DXA) scan, completed at age 67, demonstrated normal bone density. Should you recommend a repeat DXA scan today?

As many as 1 in 2 postmenopausal women are at risk for an osteoporosis-related fracture.2 Each year, about 2 million fragility fractures occur in the United States.2,3 The US Preventive Services Task Force (USPSTF) recommends bone mineral density (BMD) measurement in all women ages 65 years and older, as well as in younger postmenopausal women with certain clinical risk factors.4 The USPSTF does not make a recommendation regarding the interval for follow-up BMD testing.

Two prospective cohort studies determined that repeat BMD testing 4 to 8 years after baseline screening did not improve fracture risk prediction.5,6 Limitations of these studies included no analysis of high-risk subgroups, as well as failure to include many younger postmenopausal women in the studies.5,6 An additional longitudinal study that followed postmenopausal women for up to 15 years estimated that the interval for at least 10% of women to develop osteoporosis after initial screening was more than 15 years for women with normal BMD and about 5 years for those with moderate osteopenia.7

STUDY SUMMARY

No added predictive benefit found in 3-year repeat scan

The current study examined data from the Women’s Health Initiative Extension 1 Study, a large prospective cohort that included a broader range of postmenopausal women (N = 7419) than the previous studies. The purpose of this study was to determine if a second BMD measurement, about 3 years after the baseline BMD screening, would be useful in predicting risk for major osteoporotic fracture (MOF), compared with baseline BMD measurement alone. It analyzed data from prespecified subgroups defined by age, BMI, race/ethnicity, presence or absence of diabetes, and baseline BMD T score.1

Study participants averaged 66 years of age, with a mean BMI of 29, and 23% were non-White. In addition, 97% had either normal BMD or osteopenia (T score ≥ −2.4). Participants were excluded from the study if they had been treated with bone-active medications other than vitamin D and calcium, reported a history of MOF (fracture of the hip, spine, radius, ulna, wrist, upper arm, or shoulder) at baseline or between BMD tests, missed follow-up visits after the Year 3 BMD scan, or had missing covariate data. Participants self-reported fractures on annual patient questionnaires, and hip fractures were confirmed through medical records.

During the mean follow-up period of 9 years after the second BMD test, 139 women (1.9%) had 1 or more hip fractures, and 732 women (9.9%) had 1 or more MOFs.

Area under the receiver operating characteristic curve (AU-ROC) values for baseline BMD screening and baseline plus 3-year BMD measurement were similar in their ability to discriminate between women who had a hip fracture or MOF and women who did not. AU-ROC values communicate the usefulness of a diagnostic or screening test. An AU-ROC value of 1 would be considered perfect (100% sensitive and 100% specific), whereas an AU-ROC of 0.5 suggests a test with no ability to discriminate at all. Values between 0.7 and 0.8 would be considered acceptable, and those between 0.8 and 0.9, excellent.
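
To make these values concrete, here is a minimal sketch (synthetic, hypothetical data, not the study's) of how such a discrimination statistic can be computed from a baseline measurement and observed fracture outcomes, assuming Python with scikit-learn:

```python
# Minimal sketch with made-up data: computing an AU-ROC for a screening
# measure such as baseline total hip BMD T score. Lower T scores should
# predict fracture, so the T score is negated to form a risk score.
from sklearn.metrics import roc_auc_score

had_fracture = [1, 0, 0, 1, 0, 0, 0, 1]   # 1 = MOF during follow-up (hypothetical)
baseline_t = [-2.3, -0.5, -1.0, -2.4, 0.3, -2.2, -0.1, -2.1]  # hypothetical T scores

risk_score = [-t for t in baseline_t]     # higher value = higher predicted risk
auc = roc_auc_score(had_fracture, risk_score)
print(f"AU-ROC: {auc:.2f}")  # 1.0 = perfect discrimination; 0.5 = chance
```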

The AU-ROCs in this study were 0.71 (95% CI, 0.67-0.75) for baseline total hip BMD, 0.61 (95% CI, 0.56-0.65) for change in total hip BMD between baseline and 3-year BMD scan, and 0.73 (95% CI, 0.69-0.77) for the combined baseline total hip BMD and change in total hip BMD. For femoral neck and lumbar spine BMD, AU-ROC values demonstrated discrimination of hip fracture and MOF comparable to that of total hip BMD. The AU-ROC values among age subgroups (< 65 years, 65-74 years, and ≥ 75 years) were also similar. Associations between change in bone density and fracture risk did not change when adjusted for factors such as BMI, race/ethnicity, diabetes, or baseline BMD.

WHAT’S NEW

Results can be applied to a wider range of patients

This study found that for postmenopausal women, a repeat BMD measurement obtained 3 years after the initial assessment did not improve risk discrimination for hip fracture or MOF beyond the baseline BMD value and should not be routinely performed. Additionally, evidence from this study allows this recommendation to apply to younger postmenopausal women and a variety of high-risk subgroups.

CAVEATS

Possible bias due to self-reporting of fractures

This study suggests that for women without a diagnosis of osteoporosis at initial screening, repeat testing is unlikely to affect future risk stratification. Repeat BMD testing should still be considered when the results are likely to influence clinical management.

However, an important consideration is that fractures were self-reported in this study, introducing a possible source of bias. Additionally, although this study supports forgoing repeat screening at a 3-year interval, there is still no agreed-upon determination of when (or if) to repeat BMD screening in women without osteoporosis.

A large subset of the study population (44%) was younger than 65 years, the age at which family physicians typically begin recommending osteoporosis screening. However, AU-ROC values for fracture risk prediction were similar across age subgroups, so this should not invalidate the conclusions for the study population at large.

CHALLENGES TO IMPLEMENTATION

No challenges seen

We see no challenges in implementing this recommendation.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

References

1. Crandall CJ, Larson J, Wright NC, et al. Serial bone density measurement and incident fracture risk discrimination in postmenopausal women. JAMA Intern Med. 2020;180:1232-1240. doi: 10.1001/jamainternmed.2020.2986

2. US Preventive Services Task Force. Screening for osteoporosis: US Preventive Services Task Force recommendation statement. Ann Intern Med. 2011;154:356-364. doi: 10.7326/0003-4819-154-5-201103010-00307

3. Burge R, Dawson-Hughes B, Solomon DH, et al. Incidence and economic burden of osteoporosis-related fractures in the United States, 2005-2025. J Bone Miner Res. 2007;22:465-475. doi: 10.1359/jbmr.061113

4. US Preventive Services Task Force; Curry SJ, Krist AH, Owens DK, et al. Screening for osteoporosis to prevent fractures: US Preventive Services Task Force recommendation statement. JAMA. 2018;319:2521-2531. doi: 10.1001/jama.2018.7498

5. Hillier TA, Stone KL, Bauer DC, et al. Evaluating the value of repeat bone mineral density measurement and prediction of fractures in older women: the study of osteoporotic fractures. Arch Intern Med. 2007;167:155-160. doi: 10.1001/archinte.167.2.155

6. Berry SD, Samelson EJ, Pencina MJ, et al. Repeat bone mineral density screening and prediction of hip and major osteoporotic fracture. JAMA. 2013;310:1256-1262. doi: 10.1001/jama.2013.277817

7. Gourlay ML, Fine JP, Preisser JS, et al; Study of Osteoporotic Fractures Research Group. Bone-density testing interval and transition to osteoporosis in older women. N Engl J Med. 2012;366:225-233. doi: 10.1056/NEJMoa1107142

Author and Disclosure Information

Department of Family and Community Medicine, University of Missouri, Columbia

DEPUTY EDITOR
Shailendra Prasad, MBBS, MPH

University of Minnesota North Memorial Family Medicine Residency Program, Minneapolis

The Journal of Family Practice. 2021;70(10):510-512.

Copyright © 2021. The Family Physicians Inquiries Network. All rights reserved.

PRACTICE CHANGER

Do not routinely repeat bone density testing 3 years after initial screening in postmenopausal patients who do not have osteoporosis.

STRENGTH OF RECOMMENDATION

A: Based on several large, good-quality prospective cohort studies1

Crandall CJ, Larson J, Wright NC, et al. Serial bone density measurement and incident fracture risk discrimination in postmenopausal women. JAMA Intern Med. 2020;180:1232-1240. doi: 10.1001/jamainternmed.2020.2986


Validated scoring system identifies low-risk syncope patients

ILLUSTRATIVE CASE

A 30-year-old woman presented to the ED after she “passed out” while standing at a concert. She lost consciousness for 10 seconds. After she revived, her friends drove her to the ED. She is healthy, with no chronic medical conditions, no medication use, and no drug or alcohol use. Should she be admitted to the hospital for observation?

Syncope, a transient loss of consciousness followed by spontaneous complete recovery, accounts for 1% of ED visits.2 Approximately 10% of patients presenting to the ED with syncope will have a serious underlying condition identified, and in 3% to 5% of these patients, the serious condition will be identified only after they leave the ED.1 Most patients have a benign course, but more than half of all patients presenting to the ED with syncope will be hospitalized, costing $2.4 billion annually.2

Because of the high hospitalization rate of patients with syncope, a practical and accurate tool to risk-stratify patients is vital. Other tools, such as the San Francisco Syncope Rule, Short-Term Prognosis of Syncope, and Risk Stratification of Syncope in the Emergency Department, lack validation or are excessively complex, with extensive lab work or testing.3

The Canadian Syncope Risk Score (CSRS) was previously derived from a large, multisite consecutive cohort, and was internally validated and reported according to the Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis or Diagnosis guideline statement.4 Patients are assigned points based on clinical findings, test results, and the diagnosis given in the ED (TABLE4). The scoring system is used to stratify patients as very low (−3, −2), low (−1, 0), medium (1, 2, 3), high (4, 5), or very high (≥6) risk.4
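
The point values for individual findings come from the published CSRS table, but the score-to-stratum mapping described above is simple enough to express directly. The following is an illustrative sketch (in Python, with a hypothetical example score) of that mapping only; it does not reproduce the scoring table itself.

```python
def csrs_risk_stratum(total_score: int) -> str:
    """Map a summed Canadian Syncope Risk Score to its published risk stratum.

    Illustrative only: the points for individual clinical findings, test
    results, and ED diagnosis must be taken from the CSRS table itself.
    """
    if total_score <= -2:
        return "very low"
    if total_score <= 0:
        return "low"
    if total_score <= 3:
        return "medium"
    if total_score <= 5:
        return "high"
    return "very high"

# Hypothetical example: a patient whose findings sum to 2 points.
print(csrs_risk_stratum(2))  # -> "medium"
```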

STUDY SUMMARY

Less than 1% of very low– and low-risk patients had serious 30-day outcomes

This multisite Canadian prospective validation cohort study enrolled patients age ≥ 16 years who presented to the ED within 24 hours of syncope. Both discharged and hospitalized patients were included.1

Patients were excluded if they had loss of consciousness for > 5 minutes, mental status changes at presentation, history of current or previous seizure, or head trauma resulting in loss of consciousness. Patients requiring hospitalization secondary to trauma or those from whom an accurate history could not be obtained (eg, intoxication) were excluded, as were patients with a serious underlying condition identified during the original ED evaluation.

ED physicians confirmed patient eligibility, obtained verbal consent, and completed the data collection form. In addition, research assistants sought to identify eligible patients who were not previously enrolled by reviewing all ED visits during the study period.

To examine 30-day outcomes, researchers reviewed all available patient medical records, including administrative health records at all hospitals within the province; performed a telephone follow-up immediately after 30 days; and if no other information was found, searched the coroner’s database. Two ED physicians (with a third resolving disagreements) determined if a serious outcome occurred, including any arrhythmia, intervention to treat arrhythmia, death due to an unknown cause, myocardial infarction, structural heart disease, aortic dissection, pulmonary embolism, severe pulmonary hypertension, significant hemorrhage, or subarachnoid hemorrhage.1

A total of 4131 patients made up the validation cohort. A serious condition was identified during the initial ED visit in 160 patients (3.9%), who were excluded from the study, and 152 patients (3.7%) were lost to follow-up. Of the 3819 patients included in the final analysis, troponin was not measured in 1566 patients (41%), and an electrocardiogram was not obtained in 114 patients (3%). A serious outcome within 30 days was experienced by 139 patients (3.6%; 95% CI, 3.1%-4.3%). There was good correlation to the model-predicted serious outcome probability of 3.2% (95% CI, 2.7%-3.8%).1

Three of 1631 (0.2%) patients classified as very low risk and 9 of 1254 (0.7%) low-risk patients experienced a serious outcome, and no patients died. In the group classified as medium risk, 55 of 687 (8%) patients experienced a serious outcome, and there was 1 death. In the high-risk group, 32 of 167 (19.2%) patients experienced a serious outcome, and there were 5 deaths. In the group classified as very high risk, 40 of 78 (51.3%) patients experienced a serious outcome, and there were 7 deaths. The CSRS was able to identify very low– or low-risk patients (score of −1 or better) with a sensitivity of 97.8% (95% CI, 93.8%-99.6%) and a specificity of 44.3% (95% CI, 42.7%-45.9%).1
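
For readers less familiar with these test characteristics, the sketch below (with made-up counts, not the study's data) shows how sensitivity and specificity for a "very low– or low-risk" cutoff are calculated: sensitivity is the proportion of patients with a serious outcome whom the score places above the rule-out strata, and specificity is the proportion without a serious outcome whom it places within them.

```python
# Illustrative calculation only; the patients and outcomes here are invented.
RULE_OUT = {"very low", "low"}  # strata treated as a negative (rule-out) result

def sensitivity_specificity(strata, serious_outcome):
    """strata: CSRS risk stratum per patient; serious_outcome: True if a
    serious 30-day outcome occurred. A 'positive' result is any stratum
    above the rule-out group."""
    tp = sum(s not in RULE_OUT and o for s, o in zip(strata, serious_outcome))
    fn = sum(s in RULE_OUT and o for s, o in zip(strata, serious_outcome))
    tn = sum(s in RULE_OUT and not o for s, o in zip(strata, serious_outcome))
    fp = sum(s not in RULE_OUT and not o for s, o in zip(strata, serious_outcome))
    return tp / (tp + fn), tn / (tn + fp)

strata = ["very low", "low", "medium", "high", "medium", "very high"]
outcomes = [False, False, True, True, False, True]
sens, spec = sensitivity_specificity(strata, outcomes)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 100%, 67% here
```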

 

WHAT’S NEW

This scoring system offers a validated method to risk-stratify ED patients

Previous recommendations from the American College of Cardiology/American Heart Association suggested determining disposition of ED patients by using clinical judgment based on a list of risk factors such as age, chronic conditions, and medications. However, there was no scoring system.3 This new scoring system allows physicians to send home very low– and low-risk patients with reassurance that the likelihood of a serious outcome is less than 1%. High-risk and very high–risk patients should be admitted to the hospital for further evaluation. Most moderate-risk patients (8% risk of serious outcome but 0.1% risk of death) can also be discharged after providers have a risk/benefit discussion, including precautions for signs of arrhythmia or need for urgent return to the hospital.

CAVEATS

The study does not translate to all clinical settings

Because this study was done in EDs, the scoring system cannot necessarily be applied to urgent care or outpatient settings. However, 41% of the patients in the study did not have troponin testing performed. Therefore, physicians could consider using the scoring system in settings where this lab test is not immediately available.

This scoring system was also only validated with adult patients presenting within 24 hours of their syncopal episode. It is unknown how it may predict the outcomes of patients who present > 24 hours after syncope.

CHALLENGES TO IMPLEMENTATION

Clinicians may not be aware of the CSRS scoring system

The main challenge to implementation is practitioner awareness of the CSRS scoring system and how to use it appropriately, as there are several different syncopal scoring systems that may already be in use. Additionally, depending on the electronic health record used, the CSRS scoring system may not be embedded. Using and documenting scores may also be a challenge.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

References

1. Thiruganasambandamoorthy V, Sivilotti MLA, Le Sage N, et al. Multicenter emergency department validation of the Canadian Syncope Risk Score. JAMA Intern Med. 2020;180:737-744. doi:10.1001/jamainternmed.2020.0288

2. Probst MA, Kanzaria HK, Gbedemah M, et al. National trends in resource utilization associated with ED visits for syncope. Am J Emerg Med. 2015;33:998-1001. doi:10.1016/j.ajem.2015.04.030

3. Shen WK, Sheldon RS, Benditt DG, et al. 2017 ACC/AHA/HRS guideline for the evaluation and management of patients with syncope: executive summary: a report of the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines and the Heart Rhythm Society. J Am Coll Cardiol. 2017;70:620-663. doi:10.1016/j.jacc.2017.03.002

4. Thiruganasambandamoorthy V, Kwong K, Wells GA, et al. Development of the Canadian Syncope Risk Score to predict serious adverse events after emergency department assessment of syncope. CMAJ. 2016;188:E289-E298. doi:10.1503/cmaj.151469

Author and Disclosure Information

University of Missouri Department of Family & Community Medicine, Columbia

DEPUTY EDITOR
Katherine Hale, PharmD, BCPS, MFA

Department of Nursing, Heritage University, Toppenish, WA

The Journal of Family Practice. 2021;70(9):454-456.


Copyright © 2021. The Family Physicians Inquiries Network. All rights reserved.

PRACTICE CHANGER

Physicians should use the Canadian Syncope Risk Score (CSRS) to identify and send home very low– and low-risk patients from the emergency department (ED) after a syncopal episode.

STRENGTH OF RECOMMENDATION

A: Validated clinical decision rule based on a prospective cohort study1

Thiruganasambandamoorthy V, Sivilotti MLA, Le Sage N, et al. Multicenter emergency department validation of the Canadian Syncope Risk Score. JAMA Intern Med. 2020;180:737-744. doi:10.1001/jamainternmed.2020.0288


Monotherapy for nonvalvular A-fib with stable CAD?

ILLUSTRATIVE CASE

A 67-year-old man with a history of coronary artery stenting 7 years prior and nonvalvular atrial fibrillation (AF) that is well controlled with a beta-blocker comes in for a routine health maintenance visit. You note that the patient takes warfarin, metoprolol, and aspirin. The patient has not had any thrombotic or bleeding events in his lifetime. Does this patient need to take both warfarin and aspirin? Do the antithrombotic benefits of dual therapy outweigh the risk of bleeding?

Antiplatelet agents have long been recommended for secondary prevention of cardiovascular (CV) events in patients with ischemic heart disease (IHD). The goal is to reduce the risk of coronary artery thrombosis.2 Many patients with IHD also develop AF and are treated with oral anticoagulants (OACs) such as warfarin or direct oral anticoagulants (DOACs) to prevent thromboembolic events.

There has been a paucity of data to determine the risks and benefits of OAC monotherapy compared to OAC plus single antiplatelet therapy (SAPT). Given research that shows increased risks of bleeding and all-cause mortality when aspirin is used for primary prevention of CV disease,3,4 it is prudent to examine if the harms of aspirin outweigh its benefits for the secondary prevention of acute coronary events in patients already taking antithrombotic agents.

STUDY SUMMARY

Reduced bleeding risk, with no difference in major adverse cardiovascular events

This study by Lee and colleagues1 was a meta-analysis of 8855 patients with nonvalvular AF and stable coronary artery disease (CAD), from 6 trials comparing OAC monotherapy vs OAC plus SAPT. The meta-analysis involved 3 studies using patient registries, 2 cohort studies, and an open-label randomized trial that together spanned the period from 2002 to 2016. The longest study period was 9 years (1 study) and the shortest, 1 year (2 studies). Oral anticoagulation consisted of either vitamin K antagonist (VKA) therapy (the majority of the patients studied) or DOAC therapy (8.6% of the patients studied). SAPT was either aspirin or clopidogrel.

The primary outcome measure was major adverse CV events (MACE). Secondary outcome measures included major bleeding, stroke, all-cause mortality, and net adverse events. The definitions used by the studies for major bleeding were deemed “largely consistent” with the International Society on Thrombosis and Haemostasis major bleeding criteria, ie, fatal bleeding, symptomatic bleeding in a critical area or organ (intracranial, intraspinal, intraocular, retroperitoneal, intra-articular, pericardial, or intramuscular causing compartment syndrome), or a drop in hemoglobin (≥ 2 g/dL or requiring transfusion of ≥ 2 units of whole blood or red cells).5

There was no difference in MACE between the monotherapy and OAC plus SAPT groups (hazard ratio [HR] = 1.09; 95% CI, 0.92-1.29). Similarly, there were no differences in stroke and all-cause mortality between the groups. However, there was a significant association of higher risk of major bleeding (HR = 1.61; 95% CI, 1.38-1.87) and net adverse events (HR = 1.21; 95% CI, 1.02-1.43) in the OAC plus SAPT group compared with the OAC monotherapy group.

This study’s limitations included its low percentage of patients taking a DOAC. Also, due to variations in methods of reporting CHA2DS2-VASc and HAS-BLED scores among the studies (for risk of stroke in patients with nonrheumatic AF and for risk of bleeding in AF patients taking anticoagulants), this meta-analysis could not determine if different outcomes might be found in patients with different CHA2DS2-VASc and HAS-BLED scores.

WHAT’S NEW

OAC monotherapy benefit for patients with nonvalvular AF

This study strongly suggests that there is a large subgroup of patients with stable CAD for whom SAPT should not be prescribed as a preventive medication: patients with nonvalvular AF who are receiving OAC therapy. This study concurs with the results of the 2019 AFIRE (Atrial Fibrillation and Ischemic Events with Rivaroxaban in Patients with Stable Coronary Artery Disease) trial in Japan, in which 2236 patients with stable IHD (coronary artery bypass grafting, stenting, or cardiac catheterization > 1 year earlier) were randomized to receive rivaroxaban either alone or with an antiplatelet agent. All-cause mortality and major bleeding were lower in the monotherapy group.6

This meta-analysis calls into question the baseline recommendation from the 2012 American College of Cardiology Foundation/American Heart Association (ACCF/AHA) guideline to prescribe aspirin indefinitely for patients with stable CAD unless there is a contraindication (oral anticoagulation is not listed as a contraindication).2 The 2020 ACC Expert Consensus Decision Pathway,7 published in February 2021, stated that for patients requiring long-term anticoagulation who have completed 12 months of SAPT after percutaneous coronary intervention, anticoagulation alone “could be used long-term”; however, the 2019 meta-analysis by Lee and colleagues was not listed among its references. Its inclusion might have supported a stronger recommendation.

The new guidelines also describe clinical situations in which dual therapy could still be continued: “… if perceived thrombotic risk is high (eg, prior myocardial infarction, complex lesions, presence of select traditional cardiovascular risk factors, or extensive [atherosclerotic cardiovascular disease]), and the patient is at low bleeding risk.” The guidelines state that in this situation, “… it is reasonable to continue SAPT beyond 12 months (in line with prior ACC/AHA recommendations).”7 However, the cited study compared dual therapy (dabigatran plus an antiplatelet agent) with warfarin triple therapy; single OAC therapy was not studied.8

CAVEATS

DOAC patient population was not well represented

The study had a low percentage of patients taking a DOAC. And because the studies varied in how they reported CHA2DS2-VASc and HAS-BLED scores, this meta-analysis was unable to determine whether different scores might have produced different outcomes. However, the registry-based studies had the advantage of examining data for this population over long periods of time and included a wide variety of patients, making the recommendation likely valid.

CHALLENGES TO IMPLEMENTATION

Primary care approach may not sync with specialist practice

We see no challenges to implementation except for potential differences between primary care physicians and specialists regarding the use of antiplatelet agents in this patient population.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

References

1. Lee SR, Rhee TM, Kang DY, et al. Meta-analysis of oral anticoagulant monotherapy as an antithrombotic strategy in patients with stable coronary artery disease and nonvalvular atrial fibrillation. Am J Cardiol. 2019;124:879-885. doi: 10.1016/j.amjcard.2019.05.072

2. Fihn SD, Gardin JM, Abrams J, et al; American College of Cardiology Foundation; American Heart Association Task Force on Practice Guidelines; American College of Physicians; American Association for Thoracic Surgery; Preventive Cardiovascular Nurses Association; Society for Cardiovascular Angiography and Interventions; Society of Thoracic Surgeons. 2012 ACCF/AHA/ACP/AATS/PCNA/SCAI/STS guideline for the diagnosis and management of patients with stable ischemic heart disease: a report of the American College of Cardiology Foundation/American Heart Association Task Force on Practice Guidelines, and the American College of Physicians, American Association for Thoracic Surgery, Preventive Cardiovascular Nurses Association, Society for Cardiovascular Angiography and Interventions, and Society of Thoracic Surgeons. J Am Coll Cardiol. 2012;60:e44-e164.

3. Whitlock EP, Burda BU, Williams SB, et al. Bleeding risks with aspirin use for primary prevention in adults: a systematic review for the U.S. Preventive Services Task Force. Ann Intern Med. 2016;164:826-835. doi: 10.7326/M15-2112

4. McNeil JJ, Nelson MR, Woods RL, et al; ASPREE Investigator Group. Effect of aspirin on all-cause mortality in the healthy elderly. N Engl J Med. 2018;379:1519-1528. doi: 10.1056/NEJMoa1803955

5. Schulman S, Kearon C; Subcommittee on Control of Anticoagulation of the Scientific and Standardization Committee of the International Society on Thrombosis and Haemostasis. Definition of major bleeding in clinical investigations of antihemostatic medicinal products in non-surgical patients. J Thromb Haemost. 2005;3:692-694. doi: 10.1111/j.1538-7836.2005.01204.x

6. Yasuda S, Kaikita K, Akao M, et al; AFIRE Investigators. Antithrombotic therapy for atrial fibrillation with stable coronary disease. N Engl J Med. 2019;381:1103-1113. doi: 10.1056/NEJMoa1904143

7. Kumbhani DJ, Cannon CP, Beavers CJ, et al. 2020 ACC expert consensus decision pathway for anticoagulant and antiplatelet therapy in patients with atrial fibrillation or venous thromboembolism undergoing percutaneous coronary intervention or with atherosclerotic cardiovascular disease: a report of the American College of Cardiology Solution Set Oversight Committee. J Am Coll Cardiol. 2021;77:629-658. doi: 10.1016/j.jacc.2020.09.011

8. Berry NC, Mauri L, Steg PG, et al. Effect of lesion complexity and clinical risk factors on the efficacy and safety of dabigatran dual therapy versus warfarin triple therapy in atrial fibrillation after percutaneous coronary intervention: a subgroup analysis from the REDUAL PCI trial. Circ Cardiovasc Interv. 2020;13:e008349. doi: 10.1161/CIRCINTERVENTIONS.119.008349

Author and Disclosure Information

Department of Soldier and Family Medicine, Eisenhower Army Medical Center, Fort Gordon, GA

DEPUTY EDITOR
Shailendra Prasad, MBBS, MPH

University of Minnesota North Memorial Family Medicine Residency Program, Minneapolis

PRACTICE CHANGER

Recommend the use of a single oral anticoagulant (OAC) over combination therapy with an OAC and an antiplatelet agent for patients with nonvalvular atrial fibrillation (AF) and stable ischemic heart disease (IHD). Doing so may confer the same benefits with fewer risks.

STRENGTH OF RECOMMENDATION

A: Meta-analysis of 7 trials1

Lee SR, Rhee TM, Kang DY, et al. Meta-analysis of oral anticoagulant monotherapy as an antithrombotic strategy in patients with stable coronary artery disease and nonvalvular atrial fibrillation. Am J Cardiol. 2019;124:879-885. doi: 10.1016/j.amjcard.2019.05.072


Updated USPSTF screening guidelines may reduce lung cancer deaths

Article Type
Changed
Wed, 09/22/2021 - 11:37
Display Headline
Updated USPSTF screening guidelines may reduce lung cancer deaths

ILLUSTRATIVE CASE

A 50-year-old woman presents to your office for a well-woman exam. Her past medical history includes a 22-pack-year smoking history (she quit 5 years ago), well-controlled hypertension, and mild obesity. She has no family history of cancer, but she does have a family history of type 2 diabetes and heart disease. Besides age- and risk-appropriate laboratory tests, cervical cancer screening, breast cancer screening, and initial colon cancer screening, are there any other preventive services you would offer her?

Lung cancer is the second most common cancer in both men and women, and it is the leading cause of cancer death in the United States—regardless of gender. The American Cancer Society estimates that 235,760 people will be diagnosed with lung cancer and 131,880 people will die of the disease in 2021.2

In the 2015 National Cancer Institute report on the economic costs of cancer, direct and indirect costs of lung cancer totaled $21.1 billion annually. Lost productivity from lung cancer added another $36.1 billion in annual costs.3 The economic costs increased to $23.8 billion in 2020, with no data on lost productivity.4

Smoking tobacco is by far the primary risk factor for lung cancer, and it is estimated to account for 90% of all lung cancer cases. Compared with nonsmokers, the relative risk of lung cancer is approximately 20 times higher for smokers.5,6

Because the median age of lung cancer diagnosis is 70 years, increasing age is also considered a risk factor for lung cancer.2,7

Although lung cancer has a relatively poor prognosis—with an average 5-year survival rate of 20.5%—early-stage lung cancer is more amenable to treatment and has a better prognosis (as is true with many cancers).1

Low-dose computed tomography (LDCT) has a high sensitivity, as well as a reasonable specificity, for lung cancer detection, and there is demonstrated benefit in screening patients who are at high risk for lung cancer.8-11 In 2013, the USPSTF recommended annual lung cancer screening (B recommendation) with LDCT in adults 55 to 80 years of age who have a 30-pack-year smoking history and who currently smoke or quit within the past 15 years.1


STUDY SUMMARY

Broader eligibility for screening supports mortality benefit

This is an update to the 2013 clinical practice guideline on lung cancer screening. The USPSTF used 2 methods to provide the best possible evidence for the recommendations. The first method was a systematic review of the accuracy of screening for lung cancer with LDCT, evaluating both the benefits and harms of lung cancer screening. The systematic review examined various subgroups, the number and/or frequency of LDCT scans, and various approaches to reducing false-positive results. In addition to the systematic review, they used collaborative modeling studies to determine the optimal age for beginning and ending screening, the optimal screening interval, and the relative benefits and harms of various screening strategies. These modeling studies complemented the evidence review.

This updated guideline nearly doubles eligibility for lung cancer screening using low-dose CT scanning.

The review included 7 randomized controlled trials (RCTs), plus the modeling studies. Only the National Lung Screening Trial (NLST; N = 53,454) and the Nederlands-Leuvens Longkanker Screenings Onderzoek (NELSON) trial (N = 15,792) had adequate power to detect a mortality benefit from screening (NLST: relative risk reduction = 16%; 95% CI, 5%-25%; NELSON: incidence rate ratio = 0.75; 95% CI, 0.61-0.90) compared with no screening.

Data on screening intervals from the NLST and NELSON trials, as well as from the modeling studies, showed the greatest benefit with annual screening (statistics not reported). Evidence also showed that screening people with lighter smoking histories (< 30 pack-years) and starting at an earlier age (50 years) provided additional mortality benefit. No evidence was found for a benefit of screening past 80 years of age. The modeling studies concluded that the 2013 USPSTF screening program, with a starting age of 55 and a 30-pack-year smoking history, would reduce mortality by 9.8%, whereas changing to a starting age of 50 years, a 20-pack-year smoking history, and annual screening would increase the mortality benefit to 13%.1,11

Comparison with computer-based risk prediction models from the Cancer Intervention and Surveillance Modeling Network (CISNET) revealed insufficient evidence at this time to show that prediction model–based screening offered any benefit beyond that of the age and smoking history risk factor model.

The incidence of false-positive results was > 25% in the NLST at baseline and at 1 year. Use of a classification system such as the Lung Imaging Reporting and Data System (Lung-RADS) could reduce that from 26.6% to 12.8%.2 Another potential harm from LDCT screening is radiation exposure. Evidence from several RCTs and cohort studies showed the exposure from 1 LDCT scan to be 0.65 to 2.36 mSv, whereas the annual background radiation in the United States is 2.4 mSv. The modeling studies estimated that there would be 1 death caused by LDCT for every 18.5 cancer deaths avoided.1,11


WHAT’S NEW

Expanded age range, reduced pack-year history

Annual lung cancer screening is now recommended to begin at age 50 years for patients with a 20-pack-year history, instead of at age 55 years with a 30-pack-year history. This change would increase the number of people eligible for screening by 87%, nearly doubling it, and would include more Black patients and women, who tend to smoke fewer cigarettes than their White male counterparts. The American College of Radiology estimates that the expanded screening criteria could save between 30,000 and 60,000 lives per year.12
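As a concrete illustration of the updated criteria (assuming the 50-to-80-year age range and the quit-within-15-years rule noted in the Caveats section that follows), the following Python sketch checks eligibility from a pack-year calculation; the function names are ours and are not part of the USPSTF statement.

# Illustrative eligibility check for the updated USPSTF criteria described above.
def pack_years(packs_per_day: float, years_smoked: float) -> float:
    # Pack-years = average packs smoked per day x number of years smoked.
    return packs_per_day * years_smoked

def eligible_for_ldct(age: int, pack_yrs: float,
                      currently_smokes: bool, years_since_quit: float) -> bool:
    return (50 <= age <= 80
            and pack_yrs >= 20
            and (currently_smokes or years_since_quit <= 15))

# The patient in the illustrative case (50 years old, 22 pack-years, quit 5 years ago):
print(eligible_for_ldct(age=50, pack_yrs=22,
                        currently_smokes=False, years_since_quit=5))  # True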

CAVEATS

Criteria for upper age limit and years since quitting remain unchanged

For those patients who quit smoking, the guidelines apply only to those who have stopped smoking within the past 15 years. Furthermore, the benefit does not extend beyond age 80 or where other conditions reduce life expectancy. And, as noted earlier, modeling studies estimate that there would be 1 death caused by LDCT for every 18.5 cancer deaths avoided.1,11

CHALLENGES TO IMPLEMENTATION

Concerns about false-positives, radiation exposure may limit acceptance

The main challenge is the need for greater, more detailed dialogue between physicians and patients at higher risk for lung cancer in a time-constrained environment. Also, LDCT may not be available in some areas, and patients and physicians may have concerns regarding repeated CT radiation exposure. In addition, false-positive results increase patient stress and may adversely affect both patient and physician acceptance.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

References

1. US Preventive Services Task Force. Lung cancer: screening. Final recommendation statement. March 9, 2021. Accessed August 19, 2021. https://uspreventiveservicestaskforce.org/uspstf/recommendation/lung-cancer-screening

2. American Cancer Society. Key statistics for lung cancer. Updated January 12, 2021. Accessed August 19, 2021. www.cancer.org/cancer/lung-cancer/about/key-statistics.html

3. National Cancer Institute. Cancer Trends Progress Report—Financial Burden of Cancer Care. National Institutes of Health; 2015.

4. National Cancer Institute. Cancer Trends Progress Report—Financial Burden of Cancer Care. National Institutes of Health. Updated July 2021. Accessed August 19, 2021. https://progressreport.cancer.gov/after/economic_burden

5. Alberg AJ, Brock MV, Ford JG, et al. Epidemiology of lung cancer: diagnosis and management of lung cancer, 3rd ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest. 2013;143(5 suppl):e1S-e29S. doi: 10.1378/chest.12-2345

6. Samet JM. Health benefits of smoking cessation. Clin Chest Med. 1991;12:669-679.

7. Siegel RL, Miller KD, Jemal A. Cancer statistics, 2015. CA Cancer J Clin. 2015;65:5-29. doi: 10.3322/caac.21254

8. National Lung Screening Trial Research Team; Aberle DR, Adams AM, Berg CD, et al. Reduced lung-cancer mortality with low-dose computed tomographic screening. N Engl J Med. 2011;365:395-409. doi: 10.1056/NEJMoa1102873

9. Pinsky PF, Church TR, Izmirlian G, et al. The National Lung Screening Trial: results stratified by demographics, smoking history, and lung cancer histology. Cancer. 2013;119:3976-3983. doi: 10.1002/cncr.28326

10. de Koning HJ, van der Aalst CM, de Jong PA, et al. Reduced lung-cancer mortality with volume CT screening in a randomized trial. N Engl J Med. 2020;382:503-513. doi: 10.1056/NEJMoa1911793

11. Meza R, Jeon J, Toumazis I, et al. Evaluation of the Benefits and Harms of Lung Cancer Screening With Low-Dose Computed Tomography: A Collaborative Modeling Study for the U.S. Preventive Services Task Force. Agency for Healthcare Research and Quality; 2021.

12. American College of Radiology. Updated USPSTF lung cancer screening guidelines would help save lives. July 7, 2020. Accessed August 19, 2021. www.acr.org/Media-Center/ACR-News-Releases/2020/Updated-USPSTF-Lung-Cancer-Screening-Guidelines-Would-Help-Save-Lives

Author and Disclosure Information

Madigan Family Medicine Residency, Joint Base Lewis-McChord, WA

DEPUTY EDITOR
Corey Lyon, DO

University of Colorado, Family Medicine Residency, Denver

Issue
The Journal of Family Practice - 70(7)
Publications
Topics
Page Number
347-349
Sections
Author and Disclosure Information

Madigan Family Medicine Residency, Joint Base Lewis-McChord, WA

DEPUTY EDITOR
Corey Lyon, DO

University of Colorado, Family Medicine Residency, Denver

Author and Disclosure Information

Madigan Family Medicine Residency, Joint Base Lewis-McChord, WA

DEPUTY EDITOR
Corey Lyon, DO

University of Colorado, Family Medicine Residency, Denver

Article PDF
Article PDF

ILLUSTRATIVE CASE

A 50-year-old woman presents to your office for a well-woman exam. Her past medical history includes a 22-pack-year smoking history (she quit 5 years ago), well-controlled hypertension, and mild obesity. She has no family history of cancer, but she does have a family history of type 2 diabetes and heart disease. Besides age- and risk-appropriate laboratory tests, cervical cancer screening, breast cancer screening, and initial colon cancer screening, are there any other preventive services you would offer her?

Lung cancer is the second most common cancer in both men and women, and it is the leading cause of cancer death in the United States—regardless of gender. The American Cancer Society estimates that 235,760 people will be diagnosed with lung cancer and 131,880 people will die of the disease in 2021.2

In the 2015 National Cancer Institute report on the economic costs of cancer, direct and indirect costs of lung cancer totaled $21.1 billion annually. Lost productivity from lung cancer added another $36.1 billion in annual costs.3 The economic costs increased to $23.8 billion in 2020, with no data on lost productivity.4

Smoking tobacco is by far the primary risk factor for lung cancer, and it is estimated to account for 90% of all lung cancer cases. Compared with nonsmokers, the relative risk of lung cancer is approximately 20 times higher for smokers.5,6

Because the median age of lung cancer diagnosis is 70 years, increasing age is also considered a risk factor for lung cancer.2,7

Although lung cancer has a relatively poor prognosis—with an average 5-year survival rate of 20.5%—early-stage lung cancer is more amenable to treatment and has a better prognosis (as is true with many cancers).1

LDCT has a high sensitivity, as well as a reasonable specificity, for lung cancer detection. There is demonstrated benefit in screening patients who are at high risk for lung cancer.8-11 In 2013, the USPSTF recommended annual lung cancer screening (B recommendation) with LDCT in adults 55 to 80 years of age who have a 30-pack-year smoking history, and who currently smoke or quit within the past 15 years.1

Continue to: STUDY SUMMARY

 

 

STUDY SUMMARY

Broader eligibility for screening supports mortality benefit

This is an update to the 2013 clinical practice guideline on lung cancer screening. The USPSTF used 2 methods to provide the best possible evidence for the recommendations. The first method was a systematic review of the accuracy of screening for lung cancer with LDCT, evaluating both the benefits and harms of lung cancer screening. The systematic review examined various subgroups, the number and/or frequency of LDCT scans, and various approaches to reducing false-positive results. In addition to the systematic review, they used collaborative modeling studies to determine the optimal age for beginning and ending screening, the optimal screening interval, and the relative benefits and harms of various screening strategies. These modeling studies complemented the evidence review.

This updated guideline nearly doubles eligibility for lung cancer screening using low-dose CT scanning.

The review included 7 randomized controlled trials (RCTs), plus the modeling studies. Only the National Lung Screening Trial (NLST; N = 53,454) and the Nederlands-Leuvens Longkanker Screenings Onderzoek (NELSON) trial (N = 15,792) had adequate power to detect a mortality benefit from screening (NLST: relative risk reduction = 16%; 95% CI, 5%-25%; NELSON: incidence rate ratio = 0.75; 95% CI, 0.61-0.90) compared with no screening.

Screening intervals, from the NLST and NELSON trials as well as the modeling studies, revealed the greatest benefit from annual screening (statistics not shared). Evidence also showed that screening those with lighter smoking histories (< 30 pack-years) and at an earlier age (age 50) provided increased mortality benefit. No evidence was found for a benefit of screening past 80 years of age. The modeling studies concluded that the 2013 USPSTF screening program, using a starting age of 55 and a 30-pack-year smoking history, would reduce mortality by 9.8%, but by changing to a starting age of 50, a 20-pack-year smoking history, and annual screening, the mortality benefit was increased to 13%.1,11

Comparison with computer-based risk prediction models from the Cancer Intervention and Surveillance Modeling Network (CISNET) revealed insufficient evidence at this time to show that prediction model–based screening offered any benefit beyond that of the age and smoking history risk factor model.

The incidence of false-positive results was > 25% in the NLST at baseline and at 1 year. Use of a classification system such as the Lung Imaging Reporting and Data System (Lung-RADS) could reduce that from 26.6% to 12.8%.2 Another potential harm from LDCT screening is radiation exposure. Evidence from several RCTs and cohort studies showed the exposure from 1 LDCT scan to be 0.65 to 2.36 mSv, whereas the annual background radiation in the United States is 2.4 mSv. The modeling studies estimated that there would be 1 death caused by LDCT for every 18.5 cancer deaths avoided.1,11

Continue to: WHAT'S NEW

 

 

WHAT’S NEW

Expanded age range, reduced pack-year history

Annual lung cancer screening is now recommended to begin for patients at age 50 years with a 20-pack-year history instead of age 55 years with a 30-pack-year history. This would nearly double (87% overall) the number of people eligible for screening, and it would include more Black patients and women, who tend to smoke fewer cigarettes than their White male counterparts. The American College of Radiology estimates that the expanded screening criteria could save between 30,000 and 60,000 lives per year.12

CAVEATS

Screening criteria for upper age limit, years since smoking remain unchanged

For those patients who quit smoking, the guidelines apply only to those who have stopped smoking within the past 15 years. Furthermore, the benefit does not extend beyond age 80 or where other conditions reduce life expectancy. And, as noted earlier, modeling studies estimate that there would be 1 death caused by LDCT for every 18.5 cancer deaths avoided.1,11

CHALLENGES TO IMPLEMENTATION

Concerns about false-positives, ­radiation exposure may limit acceptance

Challenges would be based mostly on the need for greater, more detailed dialogue between physicians and patients at higher risk for lung cancer in a time-constrained environment. Also, LDCT may not be available in some areas, and patients and physicians may have concerns regarding repeated CT exposure. In addition, false-positive results increase patient stress and may adversely affect both patient and physician acceptance.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

ILLUSTRATIVE CASE

A 50-year-old woman presents to your office for a well-woman exam. Her past medical history includes a 22-pack-year smoking history (she quit 5 years ago), well-controlled hypertension, and mild obesity. She has no family history of cancer, but she does have a family history of type 2 diabetes and heart disease. Besides age- and risk-appropriate laboratory tests, cervical cancer screening, breast cancer screening, and initial colon cancer screening, are there any other preventive services you would offer her?

Lung cancer is the second most common cancer in both men and women, and it is the leading cause of cancer death in the United States—regardless of gender. The American Cancer Society estimates that 235,760 people will be diagnosed with lung cancer and 131,880 people will die of the disease in 2021.2

In the 2015 National Cancer Institute report on the economic costs of cancer, direct and indirect costs of lung cancer totaled $21.1 billion annually. Lost productivity from lung cancer added another $36.1 billion in annual costs.3 The economic costs increased to $23.8 billion in 2020, with no data on lost productivity.4

Smoking tobacco is by far the primary risk factor for lung cancer, and it is estimated to account for 90% of all lung cancer cases. Compared with nonsmokers, the relative risk of lung cancer is approximately 20 times higher for smokers.5,6

Because the median age of lung cancer diagnosis is 70 years, increasing age is also considered a risk factor for lung cancer.2,7

Although lung cancer has a relatively poor prognosis—with an average 5-year survival rate of 20.5%—early-stage lung cancer is more amenable to treatment and has a better prognosis (as is true with many cancers).1

LDCT has a high sensitivity, as well as a reasonable specificity, for lung cancer detection. There is demonstrated benefit in screening patients who are at high risk for lung cancer.8-11 In 2013, the USPSTF recommended annual lung cancer screening (B recommendation) with LDCT in adults 55 to 80 years of age who have a 30-pack-year smoking history, and who currently smoke or quit within the past 15 years.1

Continue to: STUDY SUMMARY

 

 

STUDY SUMMARY

Broader eligibility for screening supports mortality benefit

This is an update to the 2013 clinical practice guideline on lung cancer screening. The USPSTF used 2 methods to provide the best possible evidence for the recommendations. The first method was a systematic review of the accuracy of screening for lung cancer with LDCT, evaluating both the benefits and harms of lung cancer screening. The systematic review examined various subgroups, the number and/or frequency of LDCT scans, and various approaches to reducing false-positive results. In addition to the systematic review, they used collaborative modeling studies to determine the optimal age for beginning and ending screening, the optimal screening interval, and the relative benefits and harms of various screening strategies. These modeling studies complemented the evidence review.

This updated guideline nearly doubles eligibility for lung cancer screening using low-dose CT scanning.

The review included 7 randomized controlled trials (RCTs), plus the modeling studies. Only the National Lung Screening Trial (NLST; N = 53,454) and the Nederlands-Leuvens Longkanker Screenings Onderzoek (NELSON) trial (N = 15,792) had adequate power to detect a mortality benefit from screening (NLST: relative risk reduction = 16%; 95% CI, 5%-25%; NELSON: incidence rate ratio = 0.75; 95% CI, 0.61-0.90) compared with no screening.

Screening intervals, from the NLST and NELSON trials as well as the modeling studies, revealed the greatest benefit from annual screening (statistics not shared). Evidence also showed that screening those with lighter smoking histories (< 30 pack-years) and at an earlier age (age 50) provided increased mortality benefit. No evidence was found for a benefit of screening past 80 years of age. The modeling studies concluded that the 2013 USPSTF screening program, using a starting age of 55 and a 30-pack-year smoking history, would reduce mortality by 9.8%, but by changing to a starting age of 50, a 20-pack-year smoking history, and annual screening, the mortality benefit was increased to 13%.1,11

Comparison with computer-based risk prediction models from the Cancer Intervention and Surveillance Modeling Network (CISNET) revealed insufficient evidence at this time to show that prediction model–based screening offered any benefit beyond that of the age and smoking history risk factor model.

The incidence of false-positive results was > 25% in the NLST at baseline and at 1 year. Use of a classification system such as the Lung Imaging Reporting and Data System (Lung-RADS) could reduce that from 26.6% to 12.8%.2 Another potential harm from LDCT screening is radiation exposure. Evidence from several RCTs and cohort studies showed the exposure from 1 LDCT scan to be 0.65 to 2.36 mSv, whereas the annual background radiation in the United States is 2.4 mSv. The modeling studies estimated that there would be 1 death caused by LDCT for every 18.5 cancer deaths avoided.1,11

Continue to: WHAT'S NEW

 

 

WHAT’S NEW

Expanded age range, reduced pack-year history

Annual lung cancer screening is now recommended to begin for patients at age 50 years with a 20-pack-year history instead of age 55 years with a 30-pack-year history. This would nearly double (87% overall) the number of people eligible for screening, and it would include more Black patients and women, who tend to smoke fewer cigarettes than their White male counterparts. The American College of Radiology estimates that the expanded screening criteria could save between 30,000 and 60,000 lives per year.12

CAVEATS

Screening criteria for upper age limit, years since smoking remain unchanged

For those patients who quit smoking, the guidelines apply only to those who have stopped smoking within the past 15 years. Furthermore, the benefit does not extend beyond age 80 or where other conditions reduce life expectancy. And, as noted earlier, modeling studies estimate that there would be 1 death caused by LDCT for every 18.5 cancer deaths avoided.1,11

CHALLENGES TO IMPLEMENTATION

Concerns about false-positives, ­radiation exposure may limit acceptance

Challenges would be based mostly on the need for greater, more detailed dialogue between physicians and patients at higher risk for lung cancer in a time-constrained environment. Also, LDCT may not be available in some areas, and patients and physicians may have concerns regarding repeated CT exposure. In addition, false-positive results increase patient stress and may adversely affect both patient and physician acceptance.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

References

1. US Preventive Services Task Force. Lung cancer: screening. Final recommendation statement. March 9, 2021. Accessed August 19, 2021. https://uspreventiveservicestaskforce.org/uspstf/recommendation/lung-cancer-screening

2. American Cancer Society. Key statistics for lung cancer. Updated January 12, 2021. Accessed August 19, 2021. www.cancer.org/cancer/lung-cancer/about/key-statistics.html

3. National Cancer Institute. Cancer Trends Progress Report—Financial Burden of Cancer Care. National Institutes of Health; 2015.

4. National Cancer Institute. Cancer Trends Progress Report—Financial Burden of Cancer Care. National Institutes of Health. Updated July 2021. Accessed August 19, 2021. https://progressreport.cancer.gov/after/economic_burden

5. Alberg AJ, Brock MV, Ford JG, et al. Epidemiology of lung cancer: diagnosis and management of lung cancer, 3rd ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest. 2013;143(5 suppl):e1S-e29S. doi: 10.1378/chest.12-2345

6. Samet JM. Health benefits of smoking cessation. Clin Chest Med. 1991;12:669-679.

7. Siegel RL, Miller KD, Jemal A. Cancer statistics, 2015. CA Cancer J Clin. 2015;65:5-29. doi: 10.3322/caac.21254

8. National Lung Screening Trial Research Team; Aberle DR, Adams AM, Berg CD, et al. Reduced lung-cancer mortality with low-dose computed tomographic screening. N Engl J Med. 2011;365:395-409. doi: 10.1056/NEJMoa1102873

9. Pinsky PF, Church TR, Izmirlian G, et al. The National Lung Screening Trial: results stratified by demographics, smoking history, and lung cancer histology. Cancer. 2013;119:3976-3983. doi: 10.1002/cncr.28326

10. de Koning HJ, van der Aalst CM, de Jong PA, et al. Reduced lung-cancer mortality with volume CT screening in a randomized trial. N Engl J Med. 2020;382:503-513. doi: 10.1056/NEJMoa1911793

11. Meza R, Jeon J, Toumazis I, et al. Evaluation of the Benefits and Harms of Lung Cancer Screening With Low-Dose Computed Tomography: A Collaborative Modeling Study for the U.S. Preventive Services Task Force. Agency for Healthcare Research and Quality; 2021.

12. American College of Radiology. Updated USPSTF lung cancer screening guidelines would help save lives. July 7, 2020. Accessed August 19, 2021. www.acr.org/Media-Center/ACR-News-Releases/2020/Updated-USPSTF-Lung-Cancer-Screening-Guidelines-Would-Help-Save-Lives

Issue
The Journal of Family Practice - 70(7)
Page Number
347-349
Display Headline
Updated USPSTF screening guidelines may reduce lung cancer deaths
PURLs Copyright
Copyright © 2021. The Family Physicians Inquiries Network. All rights reserved.
Inside the Article

PRACTICE CHANGER

Start assessing risk and screening for lung cancer at age 50 in patients who have at least a 20-pack-year history of smoking, using low-dose computed tomography (LDCT) scanning. This practice, based on the 2021 US Preventive Services Task Force (USPSTF) guideline update, is expected to reduce annual lung cancer mortality by an additional 3% or more (increasing the proportion of deaths averted from 9.8% to 13%).

STRENGTH OF RECOMMENDATION

A: Evidence-based clinical practice guideline1

US Preventive Services Task Force. Lung cancer: screening. Final recommendation statement. March 9, 2021. Accessed August 19, 2021. https://uspreventiveservicestaskforce.org/uspstf/recommendation/lung-cancer-screening
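
For readers who want to operationalize the new criteria (for example, in a panel-audit script or an EHR reminder), the eligibility rule can be written as a simple check. The sketch below is a hypothetical illustration, not a clinical decision tool; the function name and inputs are invented, and the upper age limit of 80 years and the requirement that the patient currently smoke or have quit within the past 15 years are taken from the full 2021 USPSTF recommendation statement cited above.

```python
from typing import Optional


def eligible_for_ldct_screening(age: int,
                                pack_years: float,
                                currently_smokes: bool,
                                years_since_quit: Optional[float] = None) -> bool:
    """Rough check against the 2021 USPSTF lung cancer screening criteria.

    Illustrative sketch only -- not a clinical decision tool.
    """
    if not 50 <= age <= 80:          # recommended age range
        return False
    if pack_years < 20:              # at least a 20-pack-year smoking history
        return False
    if currently_smokes:             # current smokers qualify...
        return True
    # ...as do former smokers who quit within the past 15 years.
    return years_since_quit is not None and years_since_quit <= 15


# Example: a 55-year-old with a 30-pack-year history who quit 5 years ago.
print(eligible_for_ldct_screening(age=55, pack_years=30,
                                  currently_smokes=False, years_since_quit=5))  # True
```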

This is not the time to modify a HTN regimen

Article Type
Changed
Tue, 05/03/2022 - 15:05
Display Headline
This is not the time to modify a HTN regimen

ILLUSTRATIVE CASE

A 67-year-old man with hypertension that is well controlled on hydrochlorothiazide 25 mg po daily was admitted to the family medicine inpatient service for community-acquired pneumonia requiring antibiotic therapy and oxygen support. Despite improvement in his overall condition, his blood pressure was consistently > 160/90 mm Hg during his hospitalization. He was treated with lisinopril 10 mg po daily in addition to his home medications, which helped achieve recommended blood pressure goals.

Prior to discharge, his blood pressure was noted to be 108/62 mm Hg. He asks if it is necessary to continue this new blood pressure medicine, as his home blood pressure readings have been within the goal set by his primary care physician. Should you continue this new antihypertensive agent at discharge?

Outpatient antihypertensive medication regimens are commonly intensified at hospital discharge in response to transient short-term elevations in blood pressure during inpatient encounters for noncardiac conditions.1,2 This is typically a reflexive response during a hospitalization, even though the long-term, patient-oriented outcomes of doing so are unknown. These short-term, in-hospital blood pressure elevations may be due to numerous temporary causes, such as stress or anxiety, a pain response, agitation, a medication adverse effect, or volume overload.3

The transition from inpatient to outpatient care is a high-risk period, especially for older adults, as functional status is generally worse at hospital discharge than at the prehospitalization baseline.4 To compound this problem, adverse drug reactions are a common cause of hospitalization for older adults, and changing blood pressure medications in response to acute physiologic changes during illness may contribute to patient harm. Although observational studies of adverse drug reactions related to blood pressure medications are numerous, researchers have evaluated only those adverse drug reactions that lead to hospital admission.5-8 This study sought to evaluate the clinical outcomes associated with intensification of antihypertensive regimens at discharge among older adults.

STUDY SUMMARY

Increased risk of readmission, adverse events after intensification at discharge

This retrospective cohort study, which was conducted across multiple Veterans Health Administration (VHA) hospitals, evaluated the association between intensifying blood pressure medication regimens at hospital discharge and sustained clinical outcomes in the outpatient setting. Study participants were community-dwelling adults (98% male) ages 65 years or older who had a prehospitalization diagnosis of hypertension and were admitted for pneumonia, urinary tract infection, or venous thromboembolism over a 3-year period (n = 4056).

This first study to address outcomes related to intensification of blood pressure regimens at discharge found an increased risk of readmission and serious adverse events within 30 days.

Antihypertensive medication changes at discharge were evaluated using information pulled from VHA pharmacies, combined with clinical data merged from VHA and Medicare claims. Intensification was defined as either adding a new blood pressure medication or a dose increase of more than 20% on a previously prescribed antihypertensive medication. Patients were excluded if they were discharged with a secondary diagnosis that required modifications to a blood pressure medication (such as atrial fibrillation, acute coronary syndrome, or stroke), were hospitalized in the previous 30 days, were admitted from a skilled nursing facility, or received more than 20% of their care (including filling prescriptions) outside the VHA system.
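
To make the exposure definition concrete, the sketch below applies the study's rule (a newly added antihypertensive, or a dose increase of more than 20% for an existing one) to simple admission and discharge medication lists. The data format and function name are hypothetical illustrations, not the investigators' actual code.

```python
def regimen_intensified(admission_meds: dict, discharge_meds: dict) -> bool:
    """Return True if the discharge antihypertensive regimen was intensified.

    Intensification (per the study definition): any newly added antihypertensive,
    or a >20% dose increase for a previously prescribed one.
    Medication lists are {drug_name: daily_dose_mg} -- a hypothetical format.
    """
    for drug, discharge_dose in discharge_meds.items():
        admission_dose = admission_meds.get(drug)
        if admission_dose is None:                  # drug added during the stay
            return True
        if discharge_dose > 1.2 * admission_dose:   # >20% dose increase
            return True
    return False


# Example mirroring the illustrative case: lisinopril added to home hydrochlorothiazide.
home = {"hydrochlorothiazide": 25}
discharge = {"hydrochlorothiazide": 25, "lisinopril": 10}
print(regimen_intensified(home, discharge))  # True
```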

Primary outcomes included hospital readmission or a serious adverse event (SAE; a fall, syncope, hypotension, a serious electrolyte abnormality, or acute kidney injury) within 30 days, as well as a cardiovascular event within 1 year of hospital discharge. Secondary outcomes included the change in systolic blood pressure (SBP) within 1 year after discharge. Propensity score matching was used to create a matched-pairs cohort, balancing baseline characteristics between patients whose blood pressure regimens were intensified at hospital discharge and those whose regimens were not.
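
For readers unfamiliar with this methods step, the sketch below illustrates the general idea of propensity score matching on synthetic data: estimate each patient's probability of receiving intensification from baseline covariates, then pair each intensified patient with the untreated patient whose estimated probability is closest. The covariates, model, and greedy 1:1 matching shown here are generic assumptions for illustration, not the authors' analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic cohort: age, baseline SBP, and a flag for regimen intensification.
n = 1000
age = rng.normal(75, 6, n)
sbp = rng.normal(145, 15, n)
X = np.column_stack([age, sbp])
# Treatment assignment depends on covariates (confounding by indication).
logit = -20 + 0.05 * age + 0.10 * sbp
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

# Step 1: estimate each patient's propensity score (probability of intensification).
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# Step 2: greedy 1:1 nearest-neighbor matching on the propensity score,
# without replacement, to build a matched-pairs cohort.
treated_idx = np.where(treated)[0]
control_idx = list(np.where(~treated)[0])
pairs = []
for t in treated_idx:
    j = min(control_idx, key=lambda c: abs(ps[c] - ps[t]))  # closest control
    pairs.append((t, j))
    control_idx.remove(j)

print(f"Matched {len(pairs)} treated/control pairs")
```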

Intensification of the blood pressure regimen at hospital discharge was associated with an increased risk of 30-day hospital readmission (hazard ratio [HR] = 1.23; 95% CI, 1.07–1.42; number needed to harm [NNH] = 27) and of SAEs (HR = 1.41; 95% CI, 1.06–1.88; NNH = 63). There was no associated reduction in cardiovascular events (HR = 1.18; 95% CI, 0.99–1.40) and no difference in mean SBP within 1 year after hospital discharge between those who received intensification and those who did not (mean SBP, 134.7 vs 134.4 mm Hg; difference-in-differences estimate = 0.2 mm Hg; 95% CI, −2.0 to 2.4 mm Hg).
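
As a reminder of what the NNH values convey, the number needed to harm is the reciprocal of the absolute risk difference between groups. The sketch below back-calculates the absolute risk increases implied by the NNHs reported above and, in the other direction, shows the arithmetic with hypothetical 30-day readmission risks (the example rates are assumptions, not figures from the study).

```python
def nnh_from_risks(risk_treated: float, risk_control: float) -> float:
    """Number needed to harm = 1 / absolute risk increase."""
    return 1 / (risk_treated - risk_control)


# Reported NNHs imply these absolute risk increases:
for label, nnh in [("30-day readmission", 27), ("30-day SAE", 63)]:
    print(f"{label}: NNH {nnh} -> absolute risk increase of about "
          f"{100 / nnh:.1f} percentage points")

# In the other direction, with hypothetical 30-day readmission risks:
print(round(nnh_from_risks(risk_treated=0.215, risk_control=0.178)))  # ~27
```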

WHAT’S NEW

First study on outcomes related to HTN med changes at hospital discharge

This well-designed retrospective cohort study provides important clinical data to help guide inpatient blood pressure management decisions for patients with noncardiac conditions. At the time of publication, no clinical trials had assessed patient-oriented outcomes when antihypertensive medication regimens were intensified at hospital discharge.

CAVEATS

Study population: Primarily older men with noncardiac conditions

Selected populations benefit from intensive blood pressure control based on specific risk factors and medical conditions. In patients at high risk for cardiovascular disease, without a history of stroke or diabetes, intensive blood pressure control (SBP < 120 mm Hg) improves cardiovascular outcomes and overall survival compared with standard therapy (SBP < 140 mm Hg).9 This retrospective cohort study involved mainly elderly male patients with noncardiac conditions. The study also excluded patients with a secondary diagnosis requiring modifications to an antihypertensive regimen, such as atrial fibrillation, acute coronary syndrome, or cerebrovascular accident. Thus, the findings may not be applicable to these patient populations.

CHALLENGES TO IMPLEMENTATION

Clinicians will need to address individual needs

Physicians have to balance various antihypertensive management strategies, as competing medical specialty society guidelines recommend differing targets for optimal blood pressure control. Given the concern for medicolegal liability and potential harms of therapeutic inertia, inpatient physicians must consider whether hospitalization is the best time to alter medications for long-term outpatient blood pressure control. Finally, the decision to leave blood pressure management to outpatient physicians assumes the patient has a continuity relationship with a primary care medical home.

ACKNOWLEDGEMENT

The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center for Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources or the National Institutes of Health.

References

1. Anderson TS, Jing B, Auerbach A, et al. Clinical outcomes after intensifying antihypertensive medication regimens among older adults at hospital discharge. JAMA Intern Med. 2019;179:1528-1536.

2. Harris CM, Sridharan A, Landis R, et al. What happens to the medication regimens of older adults during and after an acute hospitalization? J Patient Saf. 2013;9:150-153.

3. Aung WM, Menon SV, Materson BJ. Management of hypertension in hospitalized patients. Hosp Pract (1995). 2015;43:101-106.

4. Covinsky KE, Palmer RM, Fortinsky RH, et al. Loss of independence in activities of daily living in older adults hospitalized with medical illnesses: increased vulnerability with age. J Am Geriatr Soc. 2003;51:451-458.

5. Omer HMRB, Hodson J, Pontefract SK, et al. Inpatient falls in older adults: a cohort study of antihypertensive prescribing pre- and post-fall. BMC Geriatr. 2018;18:58.

6. Alhawassi TM, Krass I, Pont LG. Antihypertensive-related adverse drug reactions among older hospitalized adults. Int J Clin Pharm. 2018;40:428-435.

7. Passarelli MCG, Jacob-Filho W, Figueras A. Adverse drug reactions in an elderly hospitalised population: inappropriate prescription is a leading cause. Drugs Aging. 2005;22:767-777.

8. Beckett NS, Peters R, Fletcher AE, et al; HYVET Study Group. Treatment of hypertension in patients 80 years of age or older. N Engl J Med. 2008;358:1887-1898.

9. SPRINT Research Group; Wright JT Jr, Williamson JD, Whelton PK, et al. A randomized trial of intensive versus standard blood-pressure control. N Engl J Med. 2015;373:2103-2116. Published correction appears in N Engl J Med. 2017;377:2506.

Author and Disclosure Information

University of Missouri, Columbia

DEPUTY EDITOR
Jennie B. Jarrett, PharmD, BCPS, MMedEd, FCCP

University of Illinois at Chicago

Issue
The Journal of Family Practice - 70(6)
Page Number
293-295
Display Headline
This is not the time to modify a HTN regimen
PURLs Copyright
Copyright © 2021. The Family Physicians Inquiries Network. All rights reserved.
Inside the Article

PRACTICE CHANGER

Avoid intensifying antihypertensive medication regimens at hospital discharge in older adults; such changes are associated with an increased risk of serious adverse events (SAEs) and hospital readmission within 30 days, without a reduction in serious cardiovascular events at 1 year post discharge.

STRENGTH OF RECOMMENDATION

B: Based on a large retrospective cohort study evaluating patient-oriented outcomes.1

Anderson TS, Jing B, Auerbach A, et al. Clinical outcomes after intensifying antihypertensive medication regimens among older adults at hospital discharge. JAMA Intern Med. 2019;179:1528-1536.
