RCT confirms CT scan screening catches lung cancer early


CT scan screening of older people with heavy smoking histories – using lesion volume, not diameter, as a trigger for further work-up – reduced lung cancer deaths by about a quarter in a randomized trial from the Netherlands and Belgium with almost 16,000 current and former smokers, investigators reported in the New England Journal of Medicine.

The Dutch-Belgian lung-cancer screening trial (Nederlands-Leuvens Longkanker Screenings Onderzoek [NELSON]) is “arguably the only adequately powered trial other than the” National Lung Screening Trial (NLST) in the United States to assess the role of CT scan screening among smokers, wrote University of London cancer epidemiologist Stephen Duffy, MSc, and University of Liverpool molecular oncology professor John Field, PhD, in an accompanying editorial.

NLST, which used lesion diameter, found approximately 20% lower lung cancer mortality with CT screening than with chest x-ray screening among 53,454 heavy smokers after a median follow-up of 6.5 years. The trial ultimately led the U.S. Preventive Services Task Force to recommend annual screening for people aged 55-80 years with a history of at least 30 pack-years.

European countries have considered similar programs but have hesitated “partly due to doubts fostered by the early publication of inconclusive results of a number of smaller trials in Europe. These doubts should be laid to rest,” Mr. Duffy and Dr. Field wrote.

“With the NELSON results, the efficacy of low-dose CT screening for lung cancer is confirmed. Our job is no longer to assess whether low-dose CT screening for lung cancer works; it does. Our job is to identify the target population in which it will be acceptable and cost effective,” they added.

The 15,789 NELSON participants (84% men, with a median age of 58 years and 38 pack-year history) were randomized about 1:1 to either low-dose CT scan screening at baseline and 1, 3, and 5.5 years, or to no screening.

At 10 years of follow-up, there were 5.58 lung cancer cases and 2.5 deaths per 1,000 person-years in the screened group versus 4.91 cases and 3.3 deaths per 1,000 person-years among controls. Lung-cancer mortality was 24% lower among screened subjects overall, and 33% lower among screened women. The team estimated that screening prevented about 60 lung cancer deaths.

Using volume instead of diameter “resulted in low[er] referral rates” – 2.1% with a positive predictive value of 43.5% versus 24% with a positive predictive value of 3.8% in NLST – for additional work-up, explained investigators led by H.J. de Koning, MD, PhD, of the department of public health at Erasmus University Medical Center in Rotterdam, the Netherlands.
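
As a rough, back-of-the-envelope illustration (not a calculation reported by either trial), the short Python sketch below assumes the quoted referral rates and positive predictive values can both be applied per 1,000 screening tests – an assumption that glosses over differences in how the two trials counted positive results:

def screening_yield(n_screened, referral_rate, ppv):
    referred = n_screened * referral_rate          # scans sent for further work-up
    true_positives = referred * ppv                # lung cancers among those referred
    false_positives = referred - true_positives    # work-ups in people without cancer
    return referred, true_positives, false_positives

for label, rate, ppv in [("NELSON (volume-based)", 0.021, 0.435),
                         ("NLST (diameter-based)", 0.240, 0.038)]:
    referred, tp, fp = screening_yield(1000, rate, ppv)
    print(f"{label}: {referred:.0f} referred, ~{tp:.0f} cancers, ~{fp:.0f} false positives")

On those assumptions, both strategies surface roughly nine cancers per 1,000 scans, but diameter-based management generates on the order of 230 false-positive referrals versus about a dozen with volume-based management.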

The upper limit of overdiagnosis risk – a major concern with any screening program – was 18.5% with NLST versus 8.9% with NELSON, they wrote.

In short: “Volume CT screening enabled a significant reduction of harms (e.g., false positive tests and unnecessary work-up procedures) without jeopardizing favorable outcomes,” the investigators wrote. Indeed, an ad hoc analysis suggested “more-favorable effects on lung-cancer mortality than in the NLST, despite lower referral rates for suspicious lesions” and the fact that NLST used annual screening.

“Recently,” Mr. Duffy and Dr. Field explained in their editorial, “the NELSON investigators evaluated both diameter and volume measurement to estimate lung-nodule size as an imaging biomarker for nodule management; this provided evidence that using mean or maximum axial diameter to assess nodule volume led to a substantial overestimation of nodule volume.” Direct measurement of volume “resulted in a substantial number of early-stage cancers identified at the time of diagnosis and avoided false positives from the overestimation incurred by management based on diameter.”

“The lung-nodule management system used in the NELSON trial has been advocated in the European position statement on lung-cancer screening. This will improve the acceptability of the intervention, because the rate of further investigation has been a major concern in lung cancer screening,” they wrote.

Baseline characteristics did not differ significantly between the screened and unscreened in NELSON, except for a slightly longer duration of smoking in the screened group.

The work was funded by the Netherlands Organization of Health Research and Development, among others. Mr. Duffy and Dr. de Koning didn’t report any disclosures. Dr. Field is an advisor for AstraZeneca, Epigenomics, and Nucleix, and has a research grant to his university from Janssen.
 

SOURCE: de Koning HJ et al. N Engl J Med. 2020 Jan 29. doi: 10.1056/NEJMoa1911793.


The scents-less life and the speaking mummy


 

If I only had a nose


Deaf and blind people get all the attention. Special schools, Braille, sign language, even a pinball-focused rock opera. And it is easy to see why: Those senses are kind of important when it comes to navigating the world. But what if you have to live without one of the less-cool senses? What if the nose doesn’t know?

According to research published in Clinical Otolaryngology, up to 5% of the world’s population has some sort of smell disorder, preventing them from either smelling correctly or smelling anything at all. And the effects of this on everyday life are drastic.

In a survey of 71 people with smell disorders, the researchers found that patients experience a smorgasbord of negative effects – ranging from poor hazard perception and poor sense of personal hygiene, to an inability to enjoy food and an inability to link smell to happy memories. The whiff of gingerbread on Christmas morning, the smoke of a bonfire on a summer evening – the smell-deprived miss out on them all. The negative emotions those people experience read like a recipe for your very own homemade Sith lord: sadness, regret, isolation, anxiety, anger, frustration. A path to the dark side, losing your scent is.

Speaking of fictional bad guys, this nasal-based research really could have benefited one Lord Volde ... fine, You-Know-Who. Just look at that face. That’s a man who can’t smell. You can’t tell us he wouldn’t have turned out better if only Dorothy had picked him up on the yellow brick road instead of some dumb scarecrow.
 

The sound of hieroglyphics


The Rosetta Stone revealed the meaning of Egyptian hieroglyphics and unlocked the ancient language of the Pharaohs for modern humans. But that mute stele said nothing about what those who uttered that ancient tongue sounded like.

Researchers at London’s Royal Holloway College may now know the answer. At least, a monosyllabic one.

The answer comes (indirectly) from Egyptian priest Nesyamun, a former resident of Thebes who worked at the temple of Karnak, but who now calls the Leeds City Museum home. Or, to be precise, Nesyamun’s 3,000-year-old mummified remains live on in Leeds. Nesyamun’s religious duties during his Karnak career likely required a smooth singing style and an accomplished speaking voice.

In a paper published in Scientific Reports, the British scientists say they’ve now heard the sound of the Egyptian priest’s long-silenced liturgical voice.

Working from CT scans of Nesyamun’s relatively well-preserved vocal-tract soft tissue, the scientists used a 3D-printed vocal tract and an electronic larynx to synthesize the actual sound of his voice.

And the result? Did the crooning priest of Karnak utter a Boris Karloffian curse upon those who had disturbed his millennia-long slumber? Did he deliver a rousing hymn of praise to Egypt’s ruler during the turbulent 1070s BCE, Ramses XI?

In fact, what emerged from Nesyamun’s synthesized throat was ... “eh.” Maybe “a,” as in “bad.”

Given the state of the priest’s tongue (shrunken) and his soft palate (missing), the researchers say those monosyllabic sounds are the best Nesyamun can muster in his present state. Other experts say actual words from the ancients are likely impossible.

Perhaps one day, science will indeed be able to synthesize whole words or sentences from other well-preserved residents of the distant past. May we all live to hear an unyielding Ramses II himself chew the scenery like his Hollywood doppelganger, Yul Brynner: “So let it be written! So let it be done!”
 

 

 

To beard or not to beard


People are funny, and men, who happen to be people, are no exception.

Men, you see, have these things called beards, and there are definitely more men running around with facial hair these days. A lot of women go through a lot of trouble to get rid of a lot of their hair. But men, well, we grow extra hair. Why?

That’s what Honest Amish, a maker of beard-care products, wanted to know. They commissioned OnePoll to conduct a survey of 2,000 Americans, both men and women, to learn all kinds of things about beards.

So what did they find? Facial hair confidence, that’s what. Three-quarters of men said that a beard made them feel more confident than did a bare face, and 73% said that facial hair makes a man more attractive. That number was a bit lower among women, 63% of whom said that facial hair made a man more attractive.

That doesn’t seem very funny, does it? We’re getting there.

Male respondents also were asked what they would do to get the perfect beard: 40% would be willing to spend a night in jail or give up coffee for a year, and 38% would stand in line at the DMV for an entire day. Somewhat less popular responses included giving up sex for a year (22%) – seems like a waste of all that new-found confidence – and shaving their heads (18%).

And that, we don’t mind saying, is a hair-raising conclusion.




 


Psoriasis: A look back over the past 50 years, and forward to next steps


 

Imagine a patient suffering from horrible psoriasis for decades, having failed “every available treatment.” Imagine him living all that time with “flaking, cracking, painful, itchy skin,” only to develop cirrhosis after exposure to toxic therapies.


Then imagine the experience for that patient when, 2 weeks after initiating treatment with a new interleukin-17 inhibitor, his skin clears completely.

“Two weeks later it’s all gone – it was a moment to behold,” said Joel M. Gelfand, MD, professor of dermatology and epidemiology at the University of Pennsylvania, Philadelphia, who had cared for the man for many years before a psoriasis treatment revolution of sorts took the field of dermatology by storm.

“The progress has been breathtaking – there’s no other way to describe it – and it feels like a miracle every time I see a new patient who has tough disease and I have all these things to offer them,” he continued. “For most patients, I can really help them and make a major difference in their life.”

Much of the progress in psoriasis treatment in the past 50 years unfolded over the past 2 decades, with biologics emerging for psoriasis, said Mark Lebwohl, MD, Waldman professor of dermatology and chair of the Kimberly and Eric J. Waldman department of dermatology at the Icahn School of Medicine at Mount Sinai, New York.


Dr. Lebwohl recounted some of his own experiences with psoriasis patients before the advent of treatments – particularly biologics – that have transformed practice.

There was a time when psoriasis patients had little more to turn to than the effective – but “disgusting” – Goeckerman Regimen involving cycles of UVB light exposure and topical crude coal tar application. Initially, the regimen, which was introduced in the 1920s, was used around the clock on an inpatient basis until the skin cleared, Dr. Lebwohl said.

In the 1970s, the immunosuppressive chemotherapy drug methotrexate became the first oral systemic therapy approved for severe psoriasis. For those with disabling disease, it offered some hope for relief, but only about 40% of patients achieved at least a 75% reduction in the Psoriasis Area and Severity Index score (PASI 75), he said, adding that they did so at the expense of the liver and bone marrow. “But it was the only thing we had for severe psoriasis other than light treatments.”

In the 1980s and 1990s, oral retinoids emerged as a treatment for psoriasis, and the immunosuppressive drug cyclosporine used to prevent organ rejection in some transplant patients was found to clear psoriasis in affected transplant recipients. Although they brought relief to some patients with severe, disabling disease, these also came with a high price. “It’s not that effective, and it has lots of side effects ... and causes kidney damage in essentially 100% of patients,” Dr. Lebwohl said of cyclosporine.

“So we had treatments that worked, but because the side effects were sufficiently severe, a lot of patients were not treated,” he said.

 

 

Enter the biologics era

The early 2000s brought the first two biologic approvals for psoriasis: alefacept (Amevive), a “modestly effective, but quite safe” immunosuppressive dimeric fusion protein approved in early 2003 for moderate to severe plaque psoriasis, and efalizumab (Raptiva), a recombinant humanized monoclonal antibody approved in October 2003; both were T-cell–targeted therapies. The former was withdrawn from the market voluntarily as newer agents became available, and the latter was withdrawn in 2009 because of a link with development of progressive multifocal leukoencephalopathy.

Tumor necrosis factor (TNF) blockers, which had been used effectively for rheumatoid arthritis and Crohn’s disease, emerged next, and were highly effective, much safer than the older systemic treatments, and gained “very widespread use,” Dr. Lebwohl said.

His colleague Alice B. Gottlieb, MD, PhD, was among the pioneers in the development of TNF blockers for the treatment of psoriasis. Her seminal, investigator-initiated paper on the efficacy and safety of infliximab (Remicade) monotherapy for plaque-type psoriasis published in the Lancet in 2001 helped launch the current era in which many psoriasis patients achieve 100% PASI responses with limited side effects, he said, explaining that subsequent research elucidated the role of IL-12 and -23 – leading to effective treatments like ustekinumab (Stelara), and later IL-17, which is, “in fact, the molecule closest to the pathogenesis of psoriasis.”

“If you block IL-17, you get rid of psoriasis,” he said, noting that there are now several companies with approved antibodies to IL-17. “Taltz [ixekizumab] and Cosentyx [secukinumab] are the leading ones, and Siliq [brodalumab] blocks the receptor for IL-17, so it is very effective.”

Another novel biologic – bimekizumab – is on the horizon. It blocks both IL-17A and IL-17F, and appears highly effective in psoriasis and psoriatic arthritis (PsA). “Biologics were the real start of the [psoriasis treatment] revolution,” he said. “When I started out I would speak at patient meetings and the patients were angry at their physicians; they thought they weren’t aggressive enough, they were very frustrated.”

Dr. Lebwohl described patients he would see at annual National Psoriasis Foundation meetings: “There were patients in wheelchairs, because they couldn’t walk. They would be red and scaly all over ... you could have literally swept up scale like it was snow after one of those meetings.

“You go forward to around 2010 – nobody’s in wheelchairs anymore, everybody has clear skin, and it’s become a party; patients are no longer angry – they are thrilled with the results they are getting from much safer and much more effective drugs,” he said. “So it’s been a pleasure taking care of those patients and going from a very difficult time of treating them, to a time where we’ve done a great job treating them.”

Dr. Lebwohl noted that a “large number of dermatologists have been involved with the development of these drugs and making sure they succeed, and that has also been a pleasure to see.”

Dr. Gottlieb, whom Dr. Lebwohl has described as “a superstar” in the fields of dermatology and rheumatology, is one such researcher. In an interview, she looked back on her work and the ways that it “opened the field,” led to many of her trainees also doing “great work,” and changed the lives of patients.

“It’s nice to feel that I really did change, fundamentally, how psoriasis patients are treated,” said Dr. Gottlieb, who is a clinical professor in the department of dermatology at the Icahn School of Medicine at Mount Sinai. “That obviously feels great.”

She recalled a patient – “a 6-foot-5 biker with bad psoriasis” – who “literally, the minute the door closed, he was crying about how horrible his disease was.”

“And I cleared him ... and then you get big hugs – it just feels extremely good ... giving somebody their life back,” she said.

Dr. Gottlieb has been involved in much of the work in developing biologics for psoriasis, including the ongoing work with bimekizumab for PsA as mentioned by Dr. Lebwohl.

If the phase 2 data with bimekizumab are replicated in the ongoing phase 3 trials now underway at her center, “that can really raise the bar ... so if it’s reproducible, it’s very exciting.”

“It’s exciting to have an IL-23 blocker that, at least in clinical trials, showed inhibition of radiographic progression [in PsA],” she said. “That’s guselkumab; those data are already out, and I was involved with that.”

The early work of Dr. Gottlieb and others has also “spread to other diseases,” like hidradenitis suppurativa and atopic dermatitis, she said, noting that numerous studies are underway.

Aside from curing all patients, her ultimate goal is getting to a point where psoriasis has no effect on patients’ quality of life.

“And I see it already,” she said. “It’s happening, and it’s nice to see that it’s happening in children now, too; several of the drugs are approved in kids.”


Alan Menter, MD, chairman of the division of dermatology at Baylor University Medical Center, Dallas, also a prolific researcher – and chair of the guidelines committee that published two new sets of guidelines for psoriasis treatment in 2019 – said that the field of dermatology was “late to the biologic evolution,” as many of the early biologics were first approved for PsA.

“But over the last 10 years, things have changed dramatically,” he said. “After that we suddenly leapt ahead of everybody. ... We now have 11 biologic drugs approved for psoriasis, which is more than any other disease has available.”

It’s been “highly exciting” to see this “evolution and revolution,” he commented, adding that one of the next challenges is to address the comorbidities, such as cardiovascular disease, associated with psoriasis.

“The big question now ... is if you improve skin and you improve joints, can you potentially reduce the risk of coronary artery disease,” he said. “Everybody is looking at that, and to me it’s one of the most exciting things that we’re doing.”

Work is ongoing to look at whether the IL-17s and IL-23s have “other indications outside of the skin and joints,” both within and outside of dermatology.

Like Dr. Gottlieb, Dr. Menter also mentioned the potential for hidradenitis suppurativa, and also for a condition that is rarely discussed or studied: genital psoriasis. Ixekizumab has recently been shown to work in about 75% of patients with genital psoriasis, he noted.

Another important area of research is the identification of biomarkers for predicting response and relapse, he said. For now, biomarker research has disappointed, he added, predicting that it will take at least 3-5 years before biomarkers to help guide treatment are identified.

Indeed, Dr. Gelfand, who also is director of the Psoriasis and Phototherapy Treatment Center, vice chair of clinical research, and medical director of the dermatology clinical studies unit at the University of Pennsylvania, agreed there is a need for research to improve treatment selection.

Advances are being made in genetics – with more than 80 different genes now identified as being related to psoriasis – and in medical informatics – which allow thousands of patients to be followed for years, he said, noting that this could elucidate immunopathological features that can improve treatments, predict and prevent comorbidity, and further improve outcomes.

“We also need care that is more patient centered,” he said, describing the ongoing pragmatic LITE trial of home- or office-based phototherapy for which he is the lead investigator, and other studies that he hopes will expand access to care.


Kenneth Brian Gordon, MD, chair and professor of dermatology at the Medical College of Wisconsin, Milwaukee, whose career started in the basic science immunology arena, pointed to the need to extend these benefits to patients with more-moderate disease. Like Dr. Menter, he identified psoriasis as the area in medicine that has had the greatest degree of advancement, except perhaps for hepatitis C.

He described the process not as a “bench-to-bedside” story, but as a bedside-to-bench, then “back-to-bedside” story.

It was really about taking those early T-cell–targeted biologics and anti-TNF agents from bedside to bench with the realization of the importance of the IL-23 and IL-17 pathways, and that understanding led back to the bedside with the development of the newest agents – and to a “huge difference in patients’ lives.”

“But we’ve gotten so good at treating patients with severe disease ... the question now is how to take care of those with more-moderate disease,” he said, noting that a focus on cost and better delivery systems will be needed for that population.

That research is underway, and the future looks bright – and clear.
 

 

 

“I think with psoriasis therapy and where we’ve come in the last 20 years ... we have a hard time remembering what it was like before we had biologic agents,” he said. “Our perspective has changed a lot, and sometimes we forget that.”

In fact, “psoriasis has sort of dragged dermatology into the world of modern clinical trial science, and we can now apply that to all sorts of other diseases,” he said. “The psoriasis trials were the first really well-done large-scale trials in dermatology, and I think that has given dermatology a real leg up in how we do clinical research and how we do evidence-based medicine.”

All of the doctors interviewed for this story have received funds and/or honoraria from, consulted with, are employed by, or served on the advisory boards of manufacturers of biologics. Dr. Gelfand is a copatent holder of resiquimod for treatment of cutaneous T-cell lymphoma and is deputy editor of the Journal of Investigative Dermatology.

Publications
Topics
Sections

 

Imagine a patient suffering with horrible psoriasis for decades having failed “every available treatment.” Imagine him living all that time with “flaking, cracking, painful, itchy skin,” only to develop cirrhosis after exposure to toxic therapies.

Joel M. Gelfand, MD, director of the Psoriasis and Phototherapy Treatment Center at the University of Pennsylvania, Philadelphia.
Dr. Joel Gelfand

Then imagine the experience for that patient when, 2 weeks after initiating treatment with a new interleukin-17 inhibitor, his skin clears completely.

“Two weeks later it’s all gone – it was a moment to behold,” said Joel M. Gelfand, MD, professor of dermatology and epidemiology at the University of Pennsylvania, Philadelphia, who had cared for the man for many years before a psoriasis treatment revolution of sorts took the field of dermatology by storm.

“The progress has been breathtaking – there’s no other way to describe it – and it feels like a miracle every time I see a new patient who has tough disease and I have all these things to offer them,” he continued. “For most patients, I can really help them and make a major difference in their life.”

Much of the progress in psoriasis treatment in the past 50 years unfolded over the past 2 decades, with biologics emerging for psoriasis, said Mark Lebwohl, MD, Waldman professor of dermatology and chair of the Kimberly and Eric J. Waldman department of dermatology at the Icahn School of Medicine at Mount Sinai, New York.

Dr. Mark Lebwohl with a patient.

Dr. Lebwohl recounted some of his own experiences with psoriasis patients before the advent of treatments – particularly biologics – that have transformed practice.

There was a time when psoriasis patients had little more to turn to than the effective – but “disgusting” – Goeckerman Regimen involving cycles of UVB light exposure and topical crude coal tar application. Initially, the regimen, which was introduced in the 1920s, was used around the clock on an inpatient basis until the skin cleared, Dr. Lebwohl said.

In the 1970s, the immunosuppressive chemotherapy drug methotrexate became the first oral systemic therapy approved for severe psoriasis. For those with disabling disease, it offered some hope for relief, but only about 40% of patients achieved at least a 75% reduction in the Psoriasis Area and Severity Index score (PASI 75), he said, adding that they did so at the expense of the liver and bone marrow. “But it was the only thing we had for severe psoriasis other than light treatments.”

In the 1980s and 1990s, oral retinoids emerged as a treatment for psoriasis, and the immunosuppressive drug cyclosporine used to prevent organ rejection in some transplant patients was found to clear psoriasis in affected transplant recipients. Although they brought relief to some patients with severe, disabling disease, these also came with a high price. “It’s not that effective, and it has lots of side effects ... and causes kidney damage in essentially 100% of patients,” Dr. Lebwohl said of cyclosporine.

“So we had treatments that worked, but because the side effects were sufficiently severe, a lot of patients were not treated,” he said.

 

 

Enter the biologics era

The early 2000s brought the first two approvals for psoriasis: alefacept (Amevive), a “modestly effective, but quite safe” immunosuppressive dimeric fusion protein approved in early 2003 for moderate to severe plaque psoriasis, and efalizumab (Raptiva), a recombinant humanized monoclonal antibody approved in October 2003; both were T-cell–targeted therapies. The former was withdrawn from the market voluntarily as newer agents became available, and the latter was withdrawn in 2009 because of a link with development of progressive multifocal leukoencephalopathy.

Tumor necrosis factor (TNF) blockers, which had been used effectively for RA and Crohn’s disease, emerged next, and were highly effective, much safer than the systemic treatments, and gained “very widespread use,” Dr. Lebwohl said.

Dr. Alice Gottelieb


His colleague Alice B. Gottlieb, MD, PhD, was among the pioneers in the development of TNF blockers for the treatment of psoriasis. Her seminal, investigator-initiated paper on the efficacy and safety of infliximab (Remicade) monotherapy for plaque-type psoriasis published in the Lancet in 2001 helped launch the current era in which many psoriasis patients achieve 100% PASI responses with limited side effects, he said, explaining that subsequent research elucidated the role of IL-12 and -23 – leading to effective treatments like ustekinumab (Stelara), and later IL-17, which is, “in fact, the molecule closest to the pathogenesis of psoriasis.”

“If you block IL-17, you get rid of psoriasis,” he said, noting that there are now several companies with approved antibodies to IL-17. “Taltz [ixekizumab] and Cosentyx [secukinumab] are the leading ones, and Siliq [brodalumab] blocks the receptor for IL-17, so it is very effective.”

Another novel biologic – bimekizumab – is on the horizon. It blocks both IL-17a and IL-17f, and appears highly effective in psoriasis and psoriatic arthritis (PsA). “Biologics were the real start of the [psoriasis treatment] revolution,” he said. “When I started out I would speak at patient meetings and the patients were angry at their physicians; they thought they weren’t aggressive enough, they were very frustrated.”

Dr. Lebwohl described patients he would see at annual National Psoriasis Foundation meetings: “There were patients in wheel chairs, because they couldn’t walk. They would be red and scaly all over ... you could have literally swept up scale like it was snow after one of those meetings.

“You go forward to around 2010 – nobody’s in wheelchairs anymore, everybody has clear skin, and it’s become a party; patients are no longer angry – they are thrilled with the results they are getting from much safer and much more effective drugs,” he said. “So it’s been a pleasure taking care of those patients and going from a very difficult time of treating them, to a time where we’ve done a great job treating them.”

Dr. Lebwohl noted that a “large number of dermatologists have been involved with the development of these drugs and making sure they succeed, and that has also been a pleasure to see.”

Dr. Gottlieb, who Dr. Lebwohl has described as “a superstar” in the fields of dermatology and rheumatology, is one such researcher. In an interview, she looked back on her work and the ways that her work “opened the field,” led to many of her trainees also doing “great work,” and changed the lives of patients.

“It’s nice to feel that I really did change, fundamentally, how psoriasis patients are treated,” said Dr. Gottlieb, who is a clinical professor in the department of dermatology at the Icahn School of Medicine at Mount Sinai. “That obviously feels great.”

She recalled a patient – “a 6-foot-5 biker with bad psoriasis” – who “literally, the minute the door closed, he was crying about how horrible his disease was.”

“And I cleared him ... and then you get big hugs – it just feels extremely good ... giving somebody their life back,” she said.

Dr. Gottlieb has been involved in much of the work in developing biologics for psoriasis, including the ongoing work with bimekizumab for PsA as mentioned by Dr. Lebwohl.

If the phase 2 data with bimekizumab are replicated in the ongoing phase 3 trials now underway at her center, “that can really raise the bar ... so if it’s reproducible, it’s very exciting.”

“It’s exciting to have an IL-23 blocker that, at least in clinical trials, showed inhibition of radiographic progression [in PsA],” she said. “That’s guselkumab those data are already out, and I was involved with that.”

The early work of Dr. Gottlieb and others has also “spread to other diseases,” like hidradenitis suppurativa and atopic dermatitis, she said, noting that numerous studies are underway.

Aside from curing all patients, her ultimate goal is getting to a point where psoriasis has no effect on patients’ quality of life.

“And I see it already,” she said. “It’s happening, and it’s nice to see that it’s happening in children now, too; several of the drugs are approved in kids.”

Alan Menter, MD, chairman of the Division of Dermatology at Baylor University Medical Center, Dallas
Dr. Alan Menter

Alan Menter, MD, chairman of the division of dermatology at Baylor University Medical Center, Dallas, also a prolific researcher – and chair of the guidelines committee that published two new sets of guidelines for psoriasis treatment in 2019 – said that the field of dermatology was “late to the biologic evolution,” as many of the early biologics were first approved for PsA.

“But over the last 10 years, things have changed dramatically,” he said. “After that we suddenly leapt ahead of everybody. ... We now have 11 biologic drugs approved for psoriasis, which is more than any other disease has available.”

It’s been “highly exciting” to see this “evolution and revolution,” he commented, adding that one of the next challenges is to address the comorbidities, such as cardiovascular disease, associated with psoriasis.

“The big question now ... is if you improve skin and you improve joints, can you potentially reduce the risk of coronary artery disease,” he said. “Everybody is looking at that, and to me it’s one of the most exciting things that we’re doing.”

Work is ongoing to look at whether the IL-17s and IL-23s have “other indications outside of the skin and joints,” both within and outside of dermatology.

Like Dr. Gottlieb, Dr. Menter also mentioned the potential for hidradenitis suppurativa, and also for a condition that is rarely discussed or studied: genital psoriasis. Ixekizumab has recently been shown to work in about 75% of patients with genital psoriasis, he noted.

Another important area of research is the identification of biomarkers for predicting response and relapse, he said. For now, biomarker research has disappointed, he added, predicting that it will take at least 3-5 years before biomarkers to help guide treatment are identified.

Indeed, Dr. Gelfand, who also is director of the Psoriasis and Phototherapy Treatment Center, vice chair of clinical research, and medical director of the dermatology clinical studies unit at the University of Pennsylvania, agreed there is a need for research to improve treatment selection.

Advances are being made in genetics – with more than 80 different genes now identified as being related to psoriasis – and in medical informatics – which allow thousands of patients to be followed for years, he said, noting that this could elucidate immunopathological features that can improve treatments, predict and prevent comorbidity, and further improve outcomes.

“We also need care that is more patient centered,” he said, describing the ongoing pragmatic LITE trial of home- or office-based phototherapy for which he is the lead investigator, and other studies that he hopes will expand access to care.

Kenneth Brian Gordon, MD, chair and professor of dermatology at the Medical College of Wisconsin, Milwaukee
Dr. Kenneth Brian Gordon

Kenneth Brian Gordon, MD, chair and professor of dermatology at the Medical College of Wisconsin, Milwaukee, whose career started in the basic science immunology arena, added the need for expanding benefit to patients with more-moderate disease. Like Dr. Menter, he identified psoriasis as the area in medicine that has had the greatest degree of advancement, except perhaps for hepatitis C.

He described the process not as a “bench-to-bedside” story, but as a bedside-to-bench, then “back-to-bedside” story.

It was really about taking those early T-cell–targeted biologics and anti-TNF agents from bedside to bench with the realization of the importance of the IL-23 and IL-17 pathways, and that understanding led back to the bedside with the development of the newest agents – and to a “huge difference in patient’s lives.”

“But we’ve gotten so good at treating patients with severe disease ... the question now is how to take care of those with more-moderate disease,” he said, noting that a focus on cost and better delivery systems will be needed for that population.

That research is underway, and the future looks bright – and clear.
 

 

 

“I think with psoriasis therapy and where we’ve come in the last 20 years ... we have a hard time remembering what it was like before we had biologic agents” he said. “Our perspective has changed a lot, and sometimes we forget that.”

In fact, “psoriasis has sort of dragged dermatology into the world of modern clinical trial science, and we can now apply that to all sorts of other diseases,” he said. “The psoriasis trials were the first really well-done large-scale trials in dermatology, and I think that has given dermatology a real leg up in how we do clinical research and how we do evidence-based medicine.”

All of the doctors interviewed for this story have received funds and/or honoraria from, consulted with, are employed with, or served on the advisory boards of manufacturers of biologics. Dr. Gelfand is a copatent holder of resiquimod for treatment of cutaneous T-cell lymphoma and is deputy editor of the Journal of Investigative Dermatology.

 

Imagine a patient suffering with horrible psoriasis for decades having failed “every available treatment.” Imagine him living all that time with “flaking, cracking, painful, itchy skin,” only to develop cirrhosis after exposure to toxic therapies.

Joel M. Gelfand, MD, director of the Psoriasis and Phototherapy Treatment Center at the University of Pennsylvania, Philadelphia.
Dr. Joel Gelfand

Then imagine the experience for that patient when, 2 weeks after initiating treatment with a new interleukin-17 inhibitor, his skin clears completely.

“Two weeks later it’s all gone – it was a moment to behold,” said Joel M. Gelfand, MD, professor of dermatology and epidemiology at the University of Pennsylvania, Philadelphia, who had cared for the man for many years before a psoriasis treatment revolution of sorts took the field of dermatology by storm.

“The progress has been breathtaking – there’s no other way to describe it – and it feels like a miracle every time I see a new patient who has tough disease and I have all these things to offer them,” he continued. “For most patients, I can really help them and make a major difference in their life.”

Much of the progress in psoriasis treatment in the past 50 years unfolded over the past 2 decades with the emergence of biologics, said Mark Lebwohl, MD, Waldman professor of dermatology and chair of the Kimberly and Eric J. Waldman department of dermatology at the Icahn School of Medicine at Mount Sinai, New York.

Dr. Mark Lebwohl with a patient.

Dr. Lebwohl recounted some of his own experiences with psoriasis patients before the advent of treatments – particularly biologics – that have transformed practice.

There was a time when psoriasis patients had little more to turn to than the effective – but “disgusting” – Goeckerman Regimen involving cycles of UVB light exposure and topical crude coal tar application. Initially, the regimen, which was introduced in the 1920s, was used around the clock on an inpatient basis until the skin cleared, Dr. Lebwohl said.

In the 1970s, the immunosuppressive chemotherapy drug methotrexate became the first oral systemic therapy approved for severe psoriasis. For those with disabling disease, it offered some hope for relief, but only about 40% of patients achieved at least a 75% reduction in the Psoriasis Area and Severity Index score (PASI 75), he said, adding that they did so at the expense of the liver and bone marrow. “But it was the only thing we had for severe psoriasis other than light treatments.”

In the 1980s and 1990s, oral retinoids emerged as a treatment for psoriasis, and the immunosuppressive drug cyclosporine used to prevent organ rejection in some transplant patients was found to clear psoriasis in affected transplant recipients. Although they brought relief to some patients with severe, disabling disease, these also came with a high price. “It’s not that effective, and it has lots of side effects ... and causes kidney damage in essentially 100% of patients,” Dr. Lebwohl said of cyclosporine.

“So we had treatments that worked, but because the side effects were sufficiently severe, a lot of patients were not treated,” he said.

Enter the biologics era

The early 2000s brought the first two biologic approvals for psoriasis: alefacept (Amevive), a “modestly effective, but quite safe” immunosuppressive dimeric fusion protein approved in early 2003 for moderate to severe plaque psoriasis, and efalizumab (Raptiva), a recombinant humanized monoclonal antibody approved in October 2003; both were T-cell–targeted therapies. The former was withdrawn from the market voluntarily as newer agents became available, and the latter was withdrawn in 2009 because of a link with development of progressive multifocal leukoencephalopathy.

Tumor necrosis factor (TNF) blockers, which had been used effectively for rheumatoid arthritis and Crohn’s disease, emerged next, and were highly effective, much safer than the older systemic treatments, and gained “very widespread use,” Dr. Lebwohl said.

Dr. Alice Gottlieb


His colleague Alice B. Gottlieb, MD, PhD, was among the pioneers in the development of TNF blockers for the treatment of psoriasis. Her seminal, investigator-initiated paper on the efficacy and safety of infliximab (Remicade) monotherapy for plaque-type psoriasis, published in the Lancet in 2001, helped launch the current era in which many psoriasis patients achieve 100% PASI responses with limited side effects, he said. Subsequent research elucidated the role of IL-12 and IL-23 – leading to effective treatments like ustekinumab (Stelara) – and, later, of IL-17, which is, “in fact, the molecule closest to the pathogenesis of psoriasis.”

“If you block IL-17, you get rid of psoriasis,” he said, noting that there are now several companies with approved antibodies to IL-17. “Taltz [ixekizumab] and Cosentyx [secukinumab] are the leading ones, and Siliq [brodalumab] blocks the receptor for IL-17, so it is very effective.”

Another novel biologic – bimekizumab – is on the horizon. It blocks both IL-17A and IL-17F, and appears highly effective in psoriasis and psoriatic arthritis (PsA). “Biologics were the real start of the [psoriasis treatment] revolution,” he said. “When I started out, I would speak at patient meetings and the patients were angry at their physicians; they thought they weren’t aggressive enough, they were very frustrated.”

Dr. Lebwohl described patients he would see at annual National Psoriasis Foundation meetings: “There were patients in wheelchairs, because they couldn’t walk. They would be red and scaly all over ... you could have literally swept up scale like it was snow after one of those meetings.

“You go forward to around 2010 – nobody’s in wheelchairs anymore, everybody has clear skin, and it’s become a party; patients are no longer angry – they are thrilled with the results they are getting from much safer and much more effective drugs,” he said. “So it’s been a pleasure taking care of those patients and going from a very difficult time of treating them, to a time where we’ve done a great job treating them.”

Dr. Lebwohl noted that a “large number of dermatologists have been involved with the development of these drugs and making sure they succeed, and that has also been a pleasure to see.”

Dr. Gottlieb, whom Dr. Lebwohl has described as “a superstar” in the fields of dermatology and rheumatology, is one such researcher. In an interview, she looked back on the ways her work “opened the field,” led many of her trainees to do “great work” of their own, and changed the lives of patients.

“It’s nice to feel that I really did change, fundamentally, how psoriasis patients are treated,” said Dr. Gottlieb, who is a clinical professor in the department of dermatology at the Icahn School of Medicine at Mount Sinai. “That obviously feels great.”

She recalled a patient – “a 6-foot-5 biker with bad psoriasis” – who “literally, the minute the door closed, he was crying about how horrible his disease was.”

“And I cleared him ... and then you get big hugs – it just feels extremely good ... giving somebody their life back,” she said.

Dr. Gottlieb has been involved in much of the work in developing biologics for psoriasis, including the ongoing work with bimekizumab for PsA as mentioned by Dr. Lebwohl.

If the phase 2 data with bimekizumab are replicated in the phase 3 trials now underway at her center, “that can really raise the bar ... so if it’s reproducible, it’s very exciting.”

“It’s exciting to have an IL-23 blocker that, at least in clinical trials, showed inhibition of radiographic progression [in PsA],” she said. “That’s guselkumab; those data are already out, and I was involved with that.”

The early work of Dr. Gottlieb and others has also “spread to other diseases,” like hidradenitis suppurativa and atopic dermatitis, she said, noting that numerous studies are underway.

Aside from a cure for all patients, her ultimate goal is getting to a point where psoriasis has no effect on patients’ quality of life.

“And I see it already,” she said. “It’s happening, and it’s nice to see that it’s happening in children now, too; several of the drugs are approved in kids.”

Alan Menter, MD, chairman of the Division of Dermatology at Baylor University Medical Center, Dallas
Dr. Alan Menter

Alan Menter, MD, chairman of the division of dermatology at Baylor University Medical Center, Dallas, also a prolific researcher – and chair of the guidelines committee that published two new sets of guidelines for psoriasis treatment in 2019 – said that the field of dermatology was “late to the biologic evolution,” as many of the early biologics were first approved for PsA.

“But over the last 10 years, things have changed dramatically,” he said. “After that we suddenly leapt ahead of everybody. ... We now have 11 biologic drugs approved for psoriasis, which is more than any other disease has available.”

It’s been “highly exciting” to see this “evolution and revolution,” he commented, adding that one of the next challenges is to address the comorbidities, such as cardiovascular disease, associated with psoriasis.

“The big question now ... is if you improve skin and you improve joints, can you potentially reduce the risk of coronary artery disease?” he said. “Everybody is looking at that, and to me it’s one of the most exciting things that we’re doing.”

Work is ongoing to look at whether the IL-17s and IL-23s have “other indications outside of the skin and joints,” both within and outside of dermatology.

Like Dr. Gottlieb, Dr. Menter mentioned the potential for hidradenitis suppurativa, as well as for a condition that is rarely discussed or studied: genital psoriasis. Ixekizumab has recently been shown to work in about 75% of patients with genital psoriasis, he noted.

Another important area of research is the identification of biomarkers for predicting response and relapse, he said. So far, biomarker research has disappointed, he added, predicting that it will take at least 3-5 years before biomarkers that can help guide treatment are identified.

Indeed, Dr. Gelfand, who also is director of the Psoriasis and Phototherapy Treatment Center, vice chair of clinical research, and medical director of the dermatology clinical studies unit at the University of Pennsylvania, agreed there is a need for research to improve treatment selection.

Advances are being made in genetics – with more than 80 different genes now identified as being related to psoriasis – and in medical informatics – which allow thousands of patients to be followed for years, he said, noting that this could elucidate immunopathological features that can improve treatments, predict and prevent comorbidity, and further improve outcomes.

“We also need care that is more patient centered,” he said, describing the ongoing pragmatic LITE trial of home- or office-based phototherapy for which he is the lead investigator, and other studies that he hopes will expand access to care.

Kenneth Brian Gordon, MD, chair and professor of dermatology at the Medical College of Wisconsin, Milwaukee
Dr. Kenneth Brian Gordon

Kenneth Brian Gordon, MD, chair and professor of dermatology at the Medical College of Wisconsin, Milwaukee, whose career started in basic science immunology, added that the field now needs to extend these benefits to patients with more-moderate disease. Like Dr. Menter, he identified psoriasis as the area in medicine that has seen the greatest degree of advancement, except perhaps for hepatitis C.

He described the process not as a “bench-to-bedside” story, but as a bedside-to-bench, then “back-to-bedside” story.

It was really about taking those early T-cell–targeted biologics and anti-TNF agents from bedside to bench, where the importance of the IL-23 and IL-17 pathways was realized; that understanding led back to the bedside with the development of the newest agents – and to a “huge difference in patients’ lives.”

“But we’ve gotten so good at treating patients with severe disease ... the question now is how to take care of those with more-moderate disease,” he said, noting that a focus on cost and better delivery systems will be needed for that population.

That research is underway, and the future looks bright – and clear.

“I think with psoriasis therapy and where we’ve come in the last 20 years ... we have a hard time remembering what it was like before we had biologic agents,” he said. “Our perspective has changed a lot, and sometimes we forget that.”

In fact, “psoriasis has sort of dragged dermatology into the world of modern clinical trial science, and we can now apply that to all sorts of other diseases,” he said. “The psoriasis trials were the first really well-done large-scale trials in dermatology, and I think that has given dermatology a real leg up in how we do clinical research and how we do evidence-based medicine.”

All of the doctors interviewed for this story have received funds and/or honoraria from, consulted with, been employed by, or served on the advisory boards of manufacturers of biologics. Dr. Gelfand is a co-patent holder of resiquimod for treatment of cutaneous T-cell lymphoma and is deputy editor of the Journal of Investigative Dermatology.


Are unmatched residency graduates a solution for ‘shrinking shrinks’?

Article Type
Changed
Tue, 02/04/2020 - 15:01

‘Physician associates’ could be used to expand the reach of psychiatry

For many years now, we have been lamenting the shortage of psychiatrists practicing in the United States. At this point, we must identify possible solutions.1,2 Currently, the shortage of practicing psychiatrists in the United States could be as high as 45,000.3 The major problem is that the number of psychiatry residency positions will not increase in the foreseeable future; thus, generating more psychiatrists is not an option.

Dr. Maju Mathew Koola, Stony Brook (N.Y.) University
Dr. Maju Mathew Koola

Medicare pays about $150,000 per residency slot per year. To solve the mental health access problem, $27 billion (45,000 x $150,000 x 4 years)* would be required from Medicare, which is not feasible.4 The national average starting salary for psychiatrists in 2018-2019 was about $273,000 (much lower in academic institutions), according to Merritt Hawkins, the physician recruiting firm. That salary is modest, compared with those offered in other medical specialties. For this reason, many graduates choose other, more lucrative specialties. And we know that increasing the salaries of psychiatrists alone would not lead more people to choose psychiatry. On paper, psychiatrists may work a 40-hour week, but many end up working 60 hours a week.
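
For readers who want to check the arithmetic, the $27 billion figure is simply the product of the numbers cited above:

\[
45{,}000\ \text{residency slots} \times \$150{,}000\ \text{per slot per year} \times 4\ \text{years} = \$27\ \text{billion}
\]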

To make matters worse, family medicine and internal medicine doctors generally would rather not deal with people with mental illness and do “cherry-picking and lemon-dropping.” While many patients present to primary care with mental health issues, lack of time and education in psychiatric disorders and treatment hinders these physicians. In short, the mental health field cannot count on primary care physicians.

Meanwhile, there are thousands of unmatched residency graduates. In light of those realities, perhaps psychiatry residency programs could provide these unmatched graduates with 6 months of training and use them to supplement the workforce. These medical doctors, or “physician associates,” could be paired with a few psychiatrists to do clinical and administrative work. With one in four individuals having mental health issues, and more and more people seeking help because of increasing awareness and the benefits that accompanied the Affordable Care Act (ACA), physician associates might ease the workload of psychiatrists so that they can deliver better care to more people. We must take advantage of these two trends: The surge in unmatched graduates and “shrinking shrinks,” or the decline in the psychiatric workforce pool. (The Royal College of Physicians has established a category of clinicians called physician associates,5 but they are comparable to physician assistants in the United States. As you will see, the construct I am proposing is different.)

Unmatched but not unwanted

 

The current landscape

Currently, psychiatrists are under a lot of pressure to see a certain number of patients. Patients consistently complain that psychiatrists spend a maximum of 15 minutes with them, that the visits are interrupted by phone calls, and that they are not being heard and helped. Burnout, a silent epidemic among physicians, is relatively prevalent in psychiatry.6 Hence, some psychiatrists are reducing their hours and retiring early. Psychiatry has the third-oldest workforce, with 59% of current psychiatrists aged 55 years or older.7 A better pay/work ratio and work/life balance would enable psychiatrists to enjoy more fulfilling careers.

Many psychiatrists are spending a lot of their time in research, administration, and the classroom. In addition to those issues, the United States currently has a broken mental health care system.8 Finally, the medical practice landscape has changed dramatically in recent years, and those changes undermine both the effectiveness and well-being of clinicians.


The historical landscape

Some people proudly refer to the deinstitutionalization of mental asylums and state mental hospitals in the United States. But where have these patients gone? According to a U.S. Justice Department report, 2,220,300 adults were incarcerated in U.S. federal and state prisons and county jails in 2013.9 In addition, 4,751,400 adults in 2013 were on probation or parole. The percentages of inmates in state and federal prisons and local jails with a psychiatric diagnosis were 56%, 45%, and 64%, respectively.

I work at the Maryland correctional institutions, part of the Maryland Department of Public Safety and Correctional Services. One thing that I consistently hear from several correctional officers is “had these inmates received timely help and care, they wouldn’t have ended up behind bars.” Because of the criminalization of mental illness, in 44 states, the number of people with mental illness is higher in a jail or prison than in the largest state psychiatric hospital, according to the Treatment Advocacy Center. We, as a field, bear some responsibility for many of the inmates currently in correctional facilities for crimes related to mental health problems. In Maryland, a small state, there are 30,000 inmates in jails and state and federal prisons. The average cost of a meal is $1.36; thus, $1.36 x 3 meals x 30,000 inmates = $122,400 for food alone for 1 day – and this average does not take other expenses into account. By using money and manpower wisely and taking care of individuals’ mental health problems before they commit crimes, better outcomes could be achieved.
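
For reference, the daily food-cost arithmetic cited above works out as:

\[
\$1.36\ \text{per meal} \times 3\ \text{meals per day} \times 30{,}000\ \text{inmates} = \$122{,}400\ \text{per day}
\]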

I used to work for MedOptions Inc. doing psychiatry consults at nursing homes and assisted-living facilities. Because of the shortage of psychiatrists and nurse practitioners, especially in the suburbs and rural areas, those patients could not be seen in a timely manner, even for their 3-month routine follow-ups. As my colleagues and I have written previously, many elderly individuals with major neurocognitive disorders are not on the Food and Drug Administration–approved cognitive enhancers, such as donepezil, galantamine, and memantine.10 Instead, those patients are on benzodiazepines, which are associated with cognitive impairment and increased risk of pneumonia and falls. Benzodiazepines also can cause and/or worsen disinhibited behavior. Also, in those settings, crisis situations often are addressed days to weeks later because of the doctor shortage. This situation is going to get worse, because this patient population is growing.
 

Child and geriatric psychiatry shortages

Child and geriatric psychiatrist shortages are even higher than those in general psychiatry.11 Many years of training and low salaries are a few of the reasons some choose not to do a fellowship. These residency graduates would rather join a practice at an attending salary than at a fellow’s salary, which requires an additional 1 to 2 years of training. Student loans of $100,000–$500,000 after residency also discourage some from pursuing fellowship opportunities. We need to consider models such as 2 years of residency with 2 years of a child psychiatry fellowship or 3 years of residency with 1 year of geriatric psychiatry fellowship. Working as an adult attending physician (50% of the time) and concurrently doing a fellowship (50% of the time) while receiving an attending salary might motivate more people to complete a fellowship.

In specialties such as radiology, international medical graduates (IMGs) who have completed residency training in radiology in other countries can complete a radiology fellowship in a particular area for several years and can practice in the United States as board-eligible certified MDs. Likewise, in line with the model proposed here, we could provide unmatched graduates who have no residency training with 3 to 4 years of child psychiatry and geriatric psychiatry training in addition to some adult psychiatry training.

Implementation of such a model might take care of the shortage of child and geriatric psychiatrists. In 2015, there were 56 geriatric psychiatry fellowship programs; 54 positions were filled, and 51 fellows completed training.12 “It appears that a reasonable percentage of IMGs who obtain a fellowship in geriatric psychiatry do not have an intent of pursuing a career in the field,” Marc H. Zisselman, MD, former geriatric psychiatry fellowship director and currently with the Einstein Medical Center in Philadelphia, told me in 2016. These numbers are not at all sufficient to take care of the nation’s unmet need. Hence, implementing alternate strategies is imperative.
 

Administrative tasks and care

What consumes a psychiatrist’s time and leads to burnout? The answer has to do with administrative tasks at work. Administrative tasks are not an effective use of time for an MD who has spent more than a decade in medical school, residency, and fellowship training. Although electronic medical record (EMR) systems are considered a major advancement, engaging in the process throughout the day is associated with exhaustion.

Many physicians feel that EMRs have slowed them down, and some are not well-equipped to use them in quick and efficient ways. EMRs also have led to physicians making minimal eye contact in interviews with patients. Patients often complain: “I am talking, and the doctor is looking at the computer and typing.” Patients consider this behavior to be unprofessional and rude. In a survey of 57 U.S. physicians in family medicine, internal medicine, cardiology, and orthopedics, results showed that during the work day, 27% of their time was spent on direct clinical face time with patients and 49.2% was spent on EMR and desk work. While in the examination room with patients, physicians spent 52.9% of their time on direct clinical face time and 37.0% on EMR and desk work. Outside office hours, physicians spend up to 2 hours of personal time each night doing additional computer and other clerical work.13

Several EMR software systems, such as CareLogic, Cerner, Epic, NextGen, PointClickCare, and Sunrise, are used in the United States. The U.S. Veterans Affairs Medical Centers (VAMCs) use the Computerized Patient Record System (CPRS) across the country, and VA clinicians find CPRS extremely useful when they move from one VAMC to another. Hospitals and universities, by contrast, do not share one software system, and thus, when clinicians change jobs, they find it hard to adapt to the new system.

Because psychiatrists are wasting a lot of time doing administrative tasks, they might be unable to do a good job with regard to making the right diagnoses and prescribing the best treatments. When I ask patients what they are diagnosed with, they tell me: “It depends on who you ask,” or “I’ve been diagnosed with everything.” This shows that we are not doing a good job, or that something is not right.

Currently, psychiatrists do not have the time and/or interest to make the right diagnoses and provide adequate psychoeducation for their patients. This also could be attributable to a variety of factors, including, but not limited to, time constraints, cynicism, and apathy. Time constraints also lead to the gross underutilization14 of relapse prevention strategies such as long-acting injectables and medications that can prevent suicide, such as lithium and clozapine.15

Other factors that undermine good care include not participating in continuing medical education (CME) and not staying up to date with the literature. For example, haloperidol continues to be one of the most frequently prescribed (probably the most commonly prescribed) antipsychotics, although it is clearly neurotoxic16,17 and other, safer options are available.18 Board certification and maintenance of certification (MOC) are not synonymous with good clinical practice. Many physicians are finding it hard to complete daily documentation, let alone find time for MOC. For a variety of reasons, many are not maintaining certification, and this number is likely to increase. Think about how much time is devoted to the one-on-one interview with the patient and direct patient care during the 15-minute medication check appointment and the hour-long new evaluation. In some clinics, psychiatrists are asked to see more than 25 patients in 4 hours. Some U.S.-based psychiatrists see 65 inpatients and initiate 10 new evaluations in a single day. Under those kinds of time constraints, how can we provide quality care?

A model that would address the shortage

Overall, 7,826 PGY-1 applicants were unmatched in 2019, according to data from the 2019 Main Residency Match.19 Psychiatry residency programs could give these unmatched graduates 6 months of training (an arbitrary duration) in psychiatry, which is not at all difficult with the program modules that are available.20 We could then use them as physician associates, making them major contributors to our workforce, to complete administrative and other clinical tasks.

Administrative tasks are not necessarily negative, as all psychiatrists have done administrative tasks as medical students, residents, and fellows. However, at this point, administrative tasks are not an effective use of a psychiatrist’s time. Those physician associates could be paired with two to three psychiatrists to do administrative tasks (making daytime and overnight phone calls; handling prescriptions, prior authorizations, and medication orders, especially over-the-counter and comfort medications in the inpatient units; doing chart reviews; ordering and checking laboratory tests; collecting collateral information from previous clinicians and records; printing medication education pamphlets; faxing; corresponding with insurance companies/utilization review; performing documentation; billing; and taking care of other clinical and administrative paperwork).

In addition, physician associates could collect information using rating scales such as the 9-item Patient Health Questionnaire for measurement-based care21 and the Geriatric Depression Scale, both of which are currently not used in psychiatric practice because of time constraints and lack of manpower. Keep in mind that these individuals are medical doctors and could do a good job with these kinds of tasks. Most of them already have clinical experience in the United States and know the health care system. These MDs could conduct an initial interview (what medical students, residents, and fellows do) and present the case to the attending psychiatrist. Psychiatrists could then focus on the follow-up interview; diagnoses and treatment; major medical decision making, including shared decision making (patients feel that they are not part of the treatment plan); and seeing more patients, which is a more effective use of their time. This training would give these physician associates a chance to work as doctors and make a living. These MDs have completed medical school training after passing Medical College Admission Test–equivalent exams in their countries. They have passed all steps of the U.S. Medical Licensing Examination and have received Educational Commission for Foreign Medical Graduates certification. Some have even completed residency programs in their home countries.

Some U.S. states already have implemented these kinds of programs. In Arkansas, Kansas, and Missouri,22,23 legislators have passed laws allowing unmatched graduates who have not completed a residency program to work in medically underserved areas with a collaborating physician. The collaborating physicians must directly supervise the new doctors for at least a month before the new doctors can see patients on their own. Another proposal that has been suggested to address the psychiatrist shortage is employing physician assistants to provide care.24-26

The model proposed here is comparable to postdoctoral fellow-principal investigator and resident-attending collaborative work. At hospitals, a certified nurse assistant helps patients with health care needs under the supervision of a nurse. Similarly, a physician associate could help a psychiatrist under his or her supervision. In the Sheppard Pratt Health System in Baltimore, where I worked previously, for example, nurses dictate and prepare discharge summaries for the attending physician with whom they work. These are the kinds of tasks that physician associates could do as well.

The wait time to get a new evaluation with a psychiatrist is enormous. The policy is that a new patient admitted to an inpatient unit must be seen within 24 hours. With this model, the physician associates could see patients within a few hours, take care of their most immediate needs, take a history and conduct a physical, and write an admission note for the attending psychiatrist to sign. Currently, the outpatient practice is so busy that psychiatrists do not have the time to read the discharge summaries of patients referred to them after a recent hospitalization, which often leads to poor aftercare. The physician associates could read the discharge summaries and provide pertinent information to the attending psychiatrists.

In the inpatient units and emergency departments, nurses and social workers see patients before the attending physician, present patient information to the attending psychiatrist, and document their findings. It is redundant for the physician to write the same narrative again. Rather, the physician could add an addendum to the nurse’s or social worker’s notes and sign off. This would save a lot of time.

Numerous well-designed studies support the adoption of collaborative care models as one means of providing quality psychiatric care to larger populations.27,28 The American Psychiatric Association (APA) is currently training 3,500 psychiatrists in collaborative care through the Centers for Medicare and Medicaid Services’ Transforming Clinical Practice Initiative.29,30 Despite this training and the services provided by the nurse practitioners and physician assistants, the shortage of psychiatrists has not been adequately addressed. Hence, we need to think outside the box to find other potential pragmatic solutions.

Simply increasing the hours of work or the number of nurse practitioners or physician assistants already in practice is not going to solve the problem completely. The model proposed here and previously31 is likely to improve the quality of care that we now provide. This model should not be seen as exploiting these unmatched medical graduates or as setting up a two-tiered health care system. The salary for these physicians would be a small percentage (5%-10%; these are arbitrary percentages) of the reimbursement of the attending psychiatrist. This model would not affect the salary of the attending psychiatrists; with this model, they would be able to see 25%-50% more patients (again, arbitrary percentages) with the help and support of these physician associates.

Potential barriers to implementation

There could be inherent barriers and complications to implementation of this model that are difficult to foresee at this point. Nurse practitioners (222,000 plus) and physician assistants (83,000 plus) have a fixed and structured curriculum, have national examining boards and national organizations with recertification requirements, and are licensed as independent practitioners, at least as far as CME is concerned.

Physician associates would need a standardized curriculum and examinations to validate what they have studied and learned. This process might be an important part of the credentialing of these individuals, as well as evaluation of cultural competency. If this model is to successfully lead to formation of a specific clinical group, it might need its own specific identity, national organization, national standards of competency, national certification and recertification processes, and national conference and CME or at least a subsection in a national behavioral and medical health organization, such as the APA or the American Academy of Child and Adolescent Psychiatry.

It would be desirable to “field test” the physician associate concept to clarify implementation difficulties, including the ones described above, that could arise. The cost of implementation of this program should not be of much concern; the 6-month training could be on a volunteer basis, or a small stipend might be paid by graduate medical education funding. This model could prove to be rewarding long term, save trillions of health care dollars, and allow us to provide exceptional and timely care.
 

Conclusion

Mental Health America’s 2020 State of Mental Health in America report found that more than 70% of youth with severe major depressive disorder were in need of treatment in 2017. The percentage of adults with any mental illness who did not receive treatment stood at about 57.2%.32 Meanwhile, from 1999 through 2014, the age-adjusted suicide rate in the United States increased 24%.33 More individuals are seeking help because of increased awareness.34,35 In light of the access to services afforded by the ACA, physician associates might ease the workload of psychiatrists and enable them to deliver better care to more people. We would not necessarily have to use the term “physician associate”; a better term could be adopted later. In short, let’s tap into the pools of unmatched graduates and shrinking shrinks! If this model is successful, it could be used in other specialties and countries. The stakes for our patients have never been higher.

References

1. Bishop TF et al. Health Aff. 2016;35(7):1271-7.

2. National Council Medical Director Institute. The psychiatric shortage: Causes and solutions. 2017. Washington: National Council for Behavioral Health.

3. Satiani A et al. Psychiatric Serv. 2018;69:710-3.

4. Carlat D. Psychiatric Times. 2010 Aug 3;27(8).

5. McCartney M. BMJ. 2017;359:j5022.

6. Maslach C and Leiter MP. World Psychiatry. 2016 Jun 5;15:103-11.

7. Merritt Hawkins. “The silent shortage: A white paper examining supply, demand and recruitment trends in psychiatry.” 2018.

8. Sederer LI and Sharfstein SS. JAMA. 2014 Sep 24;312:1195-6.

9. James DJ and Glaze LE. Mental health problems of prison and jail inmates. 2006 Sep. U.S. Justice Department, Bureau of Justice Statistics Special Report.

10. Koola MM et al. J Geriatr Care Res. 2018;5(2):57-67.

11. Buckley PF and Nasrallah HA. Curr Psychiatr. 2016;15:23-4.

12. American Medical Association Database. Open Residency and Fellowship Positions.

13. Sinsky C et al. Ann Intern Med. 2016;165:753-60.

14. Koola MM. Curr Psychiatr. 2017 Mar. 16(3):19-20,47,e1.

15. Koola MM and Sebastian J. HSOA J Psychiatry Depress Anxiety. 2016;(2):1-11.

16. Nasrallah HA and Chen AT. Ann Clin Psychiatry. 2017 Aug;29(3):195-202.

17. Nasrallah HA. Curr Psychiatr. 2013 Jul;7-8.

18. Chen AT and Nasrallah HA. Schizophr Res. 2019 Jun;208:1-7.

19. National Resident Matching Program, Results and Data: 2019 Main Residency Match. National Resident Matching Program, Washington, 2019.

20. Masters KJ. J Physician Assist Educ. 2015 Sep;26(3):136-43.

21. Koola MM et al. J Nerv Ment Dis. 2011;199(12):989-90.

22. “New Missouri licensing offers ‘Band-Aid’ for physician shortages.” Kansas City Business Journal. Updated 2017 May 16.

23. “After earning an MD, she’s headed back to school – to become a nurse.” STAT. 2016 Nov 8.

24. Keizer TB and Trangle MA. Acad Psychiatry. 2015 Dec;39(6):691-4.

25. Miller JG and Peterson DJ. Acad Psychiatry. 2015 Dec;39(6):685-6.

26. Smith MS. Curr Psychiatr. 2019 Sep;18(9):17-24.

27. Osofsky HJ et al. Acad Psychiatry. 2016 Oct;40(5):747-54.

28. Dreier-Wolfgramm A et al. Z Gerontol Geriatr. 2017 May;50(Suppl 2):68-77.

29. Huang H and Barkil-Oteo A. Psychosomatics. 2015 Nov-Dec;56(6):658-61.

30. Raney L et al. Fam Syst Health. 2014 Jun;32(2):147-8.

31. Koola MM. Curr Psychiatr. 2016 Dec. 15(12):33-4.

32. Mental Health America. State of Mental Health in America 2020.

33. Curtin SC et al. NCHS Data Brief. 2016 Apr;(241):1-8.

34. Kelly DL et al. Ann Intern Med. 2020;172(2):167-8.

35. Miller JP and Nasrallah HA. Curr Psychiatr. 2015;14(12):45-6.

Dr. Koola is an associate professor in the department of psychiatry and behavioral health at Stony Brook (N.Y.) University. His main area of interest is novel therapeutic discovery in the treatment of schizophrenia. He has a particular interest in improving the health care delivery system for people with psychiatric illness. Dr. Koola declared no conflicts of interest. He can be reached at maju.koola@stonybrook.edu.

*This commentary was updated 2/2/2020.


20. Masters KJ. J Physician Assist Educ. 2015 Sep;26(3):136-43.

21. Koola MM et al. J Nerv Ment Dis. 2011;199(12):989-90.

22. “New Missouri licensing offers ‘Band-Aid’ for physician shortages.” Kansas City Business Journal. Updated 2017 May 16.

23. “After earning an MD, she’s headed back to school – to become a nurse.” STAT. 2016 Nov 8.

24. Keizer TB and Trangle MA. Acad Psychiatry. 2015 Dec;39(6):691-4.

25. Miller JG and Peterson DJ. Acad Psychiatry. 2015 Dec;39(6):685-6.

26. Smith MS. Curr Psychiatr. 2019 Sep;18(9):17-24.

27. Osofsky HJ et al. Acad Psychiatry. 2016 Oct;40(5):747-54.

28. Dreier-Wolfgramm A et al. Z Gerontol Geriatr. 2017 May;50(Suppl 2):68-77.

29. Huang H and Barkil-Oteo A. Psychosomatics. 2015 Nov-Dec;56(6):658-61.

30. Raney L et al. Fam Syst Health. 2014 Jun;32(2):147-8.

31. Koola MM. Curr Psychiatr. 2016 Dec. 15(12):33-4.

32. Mental Health America. State of Mental Health in America 2020.

33. Curtin SC et al. NCHS Data Brief. 2016 Apr;(241):1-8.

34. Kelly DL et al. Ann Intern Med. 2020;172(2):167-8.

35. Miller JP and Nasrallah HA. Curr Psychiatr. 2015;14(12):45-6.

Dr. Koola is an associate professor in the department of psychiatry and behavioral health at Stony Brook (N.Y.) University. His main area of interest is novel therapeutic discovery in the treatment of schizophrenia. He has a particular interest in improving the health care delivery system for people with psychiatric illness. Dr. Koola declared no conflicts of interest. He can be reached at maju.koola@stonybrook.edu.

*This commentary was updated 2/2/2020.

For many years now, we have been lamenting the shortage of psychiatrists practicing in the United States. At this point, we must identify possible solutions.1,2 The current shortage of practicing psychiatrists in the United States could be as high as 45,000.3 The major problem is that the number of psychiatry residency positions will not increase in the foreseeable future; thus, generating more psychiatrists is not an option.


Medicare pays about $150,000 per residency slot per year. To solve the mental health access problem, $27 billion (45,000 x $150,000 x 4 years)* would be required from Medicare, which is not feasible.4 The national average starting salary for psychiatrists in 2018-2019 was about $273,000 (much lower in academic institutions), according to Merritt Hawkins, the physician recruiting firm. That salary is modest, compared with those offered in other medical specialties, so many graduates choose other, more lucrative specialties. And we know that increasing psychiatrists’ salaries alone would not lead more people to choose psychiatry. On paper, psychiatrists may be scheduled to work a 40-hour week, but many end up working 60.
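As a back-of-the-envelope check of that figure (this simply restates the arithmetic above and assumes, as the author does, that Medicare would fund each of the 45,000 additional slots for a full 4-year residency):

\[
45{,}000\ \text{slots} \times \$150{,}000\ \text{per slot per year} \times 4\ \text{years} = \$27\ \text{billion}
\]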

To make matters worse, family medicine and internal medicine doctors generally would rather not deal with people with mental illness and engage in “cherry-picking and lemon-dropping.” Although many patients present to primary care with mental health issues, these physicians are hindered by a lack of time and of training in psychiatric disorders and their treatment. In short, the mental health field cannot count on primary care physicians.

Meanwhile, there are thousands of unmatched residency graduates. In light of those realities, perhaps psychiatry residency programs could provide these unmatched graduates with 6 months of training and use them to supplement the workforce. These medical doctors, or “physician associates,” could be paired with a few psychiatrists to do clinical and administrative work. With one in four individuals having mental health issues, and more and more people seeking help because of increasing awareness and the benefits that accompanied the Affordable Care Act (ACA), physician associates might ease the workload of psychiatrists so that they can deliver better care to more people. We must take advantage of these two trends: The surge in unmatched graduates and “shrinking shrinks,” or the decline in the psychiatric workforce pool. (The Royal College of Physicians has established a category of clinicians called physician associates,5 but they are comparable to physician assistants in the United States. As you will see, the construct I am proposing is different.)

Unmatched but not unwanted

 

The current landscape

Currently, psychiatrists are under a lot of pressure to see a certain number of patients. Patients consistently complain that psychiatrists spend a maximum of 15 minutes with them, that the visits are interrupted by phone calls, and that they are not being heard and helped. Burnout, a silent epidemic among physicians, is relatively prevalent in psychiatry.6 Hence, some psychiatrists are reducing their hours and retiring early. Psychiatry has the third-oldest workforce, with 59% of current psychiatrists aged 55 years or older.7 A better pay/work ratio and work/life balance would enable psychiatrists to enjoy more fulfilling careers.

Many psychiatrists are spending a lot of their time in research, administration, and the classroom. In addition to those issues, the United States currently has a broken mental health care system.8 Finally, the medical practice landscape has changed dramatically in recent years, and those changes undermine both the effectiveness and well-being of clinicians.


The historical landscape

Some people proudly refer to the deinstitutionalization of mental asylums and state mental hospitals in the United States. But where have these patients gone? According to a U.S. Justice Department report, 2,220,300 adults were incarcerated in U.S. federal and state prisons and county jails in 2013.9 In addition, 4,751,400 adults in 2013 were on probation or parole. The percentages of inmates in state and federal prisons and local jails with a psychiatric diagnosis were 56%, 45%, and 64%, respectively.

I work at the Maryland correctional institutions, part of the Maryland Department of Public Safety and Correctional Services. One thing that I consistently hear from correctional officers is: “Had these inmates received timely help and care, they wouldn’t have ended up behind bars.” Because of the criminalization of mental illness, in 44 states a jail or prison holds more people with mental illness than the largest remaining state psychiatric hospital, according to the Treatment Advocacy Center. As a field, we bear responsibility for many of the inmates now in correctional facilities for crimes related to mental health problems. In Maryland, a small state, there are 30,000 inmates in jails and in state and federal prisons. The average cost of a meal is $1.36; at three meals a day for 30,000 inmates, that comes to $122,400 for food alone in a single day – and that figure does not take other expenses into account. By using money and manpower wisely and taking care of individuals’ mental health problems before they commit crimes, better outcomes could be achieved.

I used to work for MedOptions Inc., doing psychiatry consults at nursing homes and assisted-living facilities. Because of the shortage of psychiatrists and nurse practitioners, especially in the suburbs and rural areas, those patients could not be seen in a timely manner even for their 3-month routine follow-ups. As my colleagues and I have written previously, many elderly individuals with major neurocognitive disorders are not on the Food and Drug Administration–approved cognitive enhancers, such as donepezil, galantamine, and memantine.10 Instead, those patients are on benzodiazepines, which are associated with cognitive impairment and increased risk of pneumonia and falls. Benzodiazepines also can cause and/or worsen disinhibited behavior. In addition, in those settings, crisis situations often are addressed days to weeks later because of the doctor shortage. This situation is going to get worse, because this patient population is growing.
 

Child and geriatric psychiatry shortages

Shortages of child and geriatric psychiatrists are even more severe than the shortage in general psychiatry.11 Additional years of training and low salaries are among the reasons some choose not to pursue a fellowship. These residency graduates would rather join a practice at an attending’s salary than spend an additional 1-2 years training at a fellow’s salary. Student loans of $100,000–$500,000 after residency also discourage some from pursuing fellowship opportunities. We need to consider models such as 2 years of residency with 2 years of a child psychiatry fellowship, or 3 years of residency with 1 year of a geriatric psychiatry fellowship. Working as an adult attending physician (50% of the time) while concurrently completing a fellowship (50% of the time) at an attending salary might also motivate more people to complete a fellowship.

In specialties such as radiology, international medical graduates (IMGs) who have completed residency training in radiology in other countries can complete a radiology fellowship in a particular area for several years and can practice in the United States as board-eligible certified MDs. Likewise, in line with the model proposed here, we could provide unmatched graduates who have no residency training with 3 to 4 years of child psychiatry and geriatric psychiatry training in addition to some adult psychiatry training.

Implementation of such a model might take care of the shortage of child and geriatric psychiatrists. In 2015, there were 56 geriatric psychiatry fellowship programs; 54 positions were filled, and 51 fellows completed training.12 “It appears that a reasonable percentage of IMGs who obtain a fellowship in geriatric psychiatry do not have an intent of pursuing a career in the field,” Marc H. Zisselman, MD, former geriatric psychiatry fellowship director and currently with the Einstein Medical Center in Philadelphia, told me in 2016. These numbers are not at all sufficient to take care of the nation’s unmet need. Hence, implementing alternate strategies is imperative.
 

Administrative tasks and care

What consumes a psychiatrist’s time and leads to burnout? The answer has to do with administrative tasks at work. Administrative tasks are not an effective use of time for an MD who has spent more than a decade in medical school, residency, and fellowship training. Although electronic medical record (EMR) systems are considered a major advancement, engaging in the process throughout the day is associated with exhaustion.

Many physicians feel that EMRs have slowed them down, and some are not well-equipped to use them in quick and efficient ways. EMRs also have led to physicians making minimal eye contact in interviews with patients. Patients often complain: “I am talking, and the doctor is looking at the computer and typing.” Patients consider this behavior to be unprofessional and rude. In a survey of 57 U.S. physicians in family medicine, internal medicine, cardiology, and orthopedics, results showed that during the work day, 27% of their time was spent on direct clinical face time with patients and 49.2% was spent on EMR and desk work. While in the examination room with patients, physicians spent 52.9% of their time on direct clinical face time and 37.0% on EMR and desk work. Outside office hours, physicians spend up to 2 hours of personal time each night doing additional computer and other clerical work.13

Several EMR software systems, such as CareLogic, Cerner, Epic, NextGen, PointClickCare, and Sunrise, are used in the United States. The U.S. Veterans Affairs Medical Centers (VAMCs) use the Computerized Patient Record System (CPRS) across the country. VA clinicians find CPRS extremely useful when they move from one VAMC to another. Likewise, it would help if hospitals and universities used one shared software system, such as the CPRS, because as things stand, when clinicians change jobs, they find it hard to adapt to a new system.

Because psychiatrists waste a lot of time on administrative tasks, they might be unable to do a good job of making the right diagnoses and prescribing the best treatments. When I ask patients what they have been diagnosed with, they tell me: “It depends on who you ask,” or “I’ve been diagnosed with everything.” This shows that we are not doing a good job – or that something is not right.

Currently, many psychiatrists do not have the time and/or the interest to make the right diagnoses and provide adequate psychoeducation for their patients, a problem attributable to a variety of factors, including, but not limited to, time pressure, cynicism, and apathy. Time constraints also lead to the gross underutilization14 of relapse prevention strategies such as long-acting injectables, and of medications that can prevent suicide, such as lithium and clozapine.15

Other factors that undermine good care include not participating in continuing medical education (CME) and not staying up to date with the literature. For example, haloperidol continues to be one of the most frequently prescribed (probably the most commonly prescribed) antipsychotics, although it is clearly neurotoxic16,17 and safer options are available.18 Board certification and maintenance of certification (MOC) are not synonymous with good clinical practice. Many physicians find it hard to complete their daily documentation, let alone find time for MOC. Think about how much time is actually devoted to the one-on-one interview and direct patient care during the 15-minute medication-check appointment and the hour-long new evaluation. In some clinics, psychiatrists are asked to see more than 25 patients in 4 hours. Some U.S.-based psychiatrists see 65 inpatients and initiate 10 new evaluations in a single day. Under those kinds of time constraints, how can we provide quality care?
 

 

 

A model that would address the shortage

Overall, 7,826 PGY-1 applicants went unmatched in 2019, according to data from the 2019 Main Residency Match.19 Psychiatry residency programs could give these unmatched graduates 6 months of training (an arbitrary duration) in psychiatry, which is not at all difficult with the program modules that are available.20 These graduates could then serve as physician associates – a major addition to our workforce – completing administrative and other clinical tasks.

Administrative tasks are not necessarily negative, as all psychiatrists have done administrative tasks as medical students, residents, and fellows. However, at this point, administrative tasks are not an effective use of a psychiatrist’s time. Those physician associates could be paired with two to three psychiatrists to do administrative tasks (making daytime and overnight phone calls; handling prescriptions, prior authorizations, and medication orders, especially over-the-counter and comfort medications in the inpatient units; doing chart reviews; ordering and checking laboratory tests; collecting collateral information from previous clinicians and records; printing medication education pamphlets; faxing; corresponding with insurance companies/utilization review; performing documentation; billing; and taking care of other clinical and administrative paperwork).

In addition, physician associates could collect information using rating scales such as the 9-item Patient Health Questionnaire for measurement-based care21 and the Geriatric Depression Scale, both of which are currently not used in psychiatric practice because of time constraints and lack of manpower. Keep in mind that these individuals are medical doctors and could do a good job with these kinds of tasks. Most of them already have clinical experience in the United States and know the health care system. These MDs could conduct an initial interview (what medical students, residents, and fellows do) and present to the attending psychiatrist. Psychiatrists could then focus on the follow-up interview; diagnoses and treatment; major medical decision making, including shared decision making (patients feel that they are not part of the treatment plan); and seeing more patients, which is a more effective use of their time. This training would give these physician associates a chance to work as doctors and make a living. These MDs have completed medical school training after passing Medical College Admission Test–equivalent exams in their countries. They have passed all steps of the U.S. Medical Licensing Examination and have received Educational Commission for Foreign Medical Graduates certification. Some have even completed residency programs in their home countries.

Some U.S. states already have implemented these kinds of programs. In Arkansas, Kansas, and Missouri,22,23 legislators have passed laws allowing unmatched graduates who have not completed a residency program to work in medically underserved areas with a collaborating physician. These physicians must directly supervise the new doctors for at least a month before they can see patients on their own. Another proposal that has been suggested to address the psychiatrist shortage is employing physician assistants to provide care.24-26

The model proposed here is comparable to postdoctoral fellow-principal investigator and resident-attending collaborative work. At hospitals, a certified nurse assistant helps patients with health care needs under the supervision of a nurse. Similarly, a physician associate could help a psychiatrist under his or her supervision. In the Sheppard Pratt Health System in Baltimore, where I worked previously, for example, nurses dictate and prepare discharge summaries for the attending physician with whom they work. These are the kinds of tasks that physician associates could do as well.

The wait time to get a new evaluation with a psychiatrist is enormous. The policy is that a new patient admitted to an inpatient unit must be seen within 24 hours. With this model, the physician associates could see patients within a few hours, take care of their most immediate needs, take a history and conduct a physical, and write an admission note for the attending psychiatrist to sign. Currently, the outpatient practice is so busy that psychiatrists do not have the time to read the discharge summaries of patients referred to them after a recent hospitalization, which often leads to poor aftercare. The physician associates could read the discharge summaries and provide pertinent information to the attending psychiatrists.

In the inpatient units and emergency departments, nurses and social workers see patients before the attending physician, present patient information to the attending psychiatrist, and document their findings. It is redundant for the physician to write the same narrative again. Rather, the physician could add an addendum to the nurse’s or social worker’s notes and sign off. This would save a lot of time.

Numerous well-designed studies support the adoption of collaborative care models as one means of providing quality psychiatric care to larger populations.27,28 The American Psychiatric Association (APA) is currently training 3,500 psychiatrists in collaborative care through the Centers for Medicare and Medicaid Services’ Transforming Clinical Practice Initiative.29,30 Despite this training and the services provided by the nurse practitioners and physician assistants, the shortage of psychiatrists has not been adequately addressed. Hence, we need to think outside the box to find other potential pragmatic solutions.

Simply increasing the hours of work or the number of nurse practitioners or physician assistants already in practice is not going to solve the problem completely. The model proposed here and previously31 is likely to improve the quality of care that we now provide. This model should not be seen as exploiting these unmatched medical graduates or as setting up a two-tiered health care system. The salary for these physicians would come from a small percentage (5%-10%; these are arbitrary percentages) of the attending psychiatrist’s reimbursement. This model would not affect the salary of attending psychiatrists; with this model, they would be able to see 25%-50% more patients (again, arbitrary percentages) with the help and support of these physician associates.
 

 

 

Potential barriers to implementation

There could be inherent barriers and complications to implementation of this model that are difficult to foresee at this point. Nurse practitioners (more than 222,000) and physician assistants (more than 83,000) have fixed, structured curricula; national examining boards and national organizations with recertification requirements; and licensure as independent practitioners, at least as far as CME is concerned.

Physician associates would need a standardized curriculum and examinations to validate what they have studied and learned. This process might be an important part of the credentialing of these individuals, as well as evaluation of cultural competency. If this model is to successfully lead to formation of a specific clinical group, it might need its own specific identity, national organization, national standards of competency, national certification and recertification processes, and national conference and CME or at least a subsection in a national behavioral and medical health organization, such as the APA or the American Academy of Child and Adolescent Psychiatry.

It would be desirable to “field test” the physician associate concept to clarify implementation difficulties, including the ones described above, that could arise. The cost of implementation of this program should not be of much concern; the 6-month training could be on a volunteer basis, or a small stipend might be paid by graduate medical education funding. This model could prove to be rewarding long term, save trillions of health care dollars, and allow us to provide exceptional and timely care.
 

Conclusion

The 2020 Mental Health America annual State of Mental Health in America report found that more than 70% of youth with severe major depressive disorder were in need of treatment in 2017. The percentage of adults with any mental illness who did not receive treatment stood at about 57.2%.32 Meanwhile, from 1999 through 2014, the age-adjusted suicide rate in the United States increased 24%.33 More individuals are seeking help because of increased awareness.34,35 In light of the access to services afforded by the ACA, physician associates might ease the workload of psychiatrists and enable them to deliver better care to more people. We would not necessarily have to use the term “physician associate” and could generate better terminologies later. In short, let’s tap into the pools of unmatched graduates and shrinking shrinks! If this model is successful, it could be used in other specialties and countries. The stakes for our patients have never been higher.

References

1. Bishop TF et al. Health Aff. 2016;35(7):1271-7.

2. National Council Medical Director Institute. The psychiatric shortage: Causes and solutions. 2017. Washington: National Council for Behavioral Health.

3. Satiani A et al. Psychiatric Serv. 2018;69:710-3.

4. Carlat D. Psychiatric Times. 2010 Aug 3;27(8).

5. McCartney M. BMJ. 2017;359:j5022.

6. Maslach C and Leiter MP. World Psychiatry. 2016 Jun 5;15:103-11.

7. Merritt Hawkins. “The silent shortage: A white paper examining supply, demand and recruitment trends in psychiatry.” 2018.

8. Sederer LI and Sharfstein SS. JAMA. 2014 Sep 24;312:1195-6.

9. James DJ and Glaze LE. Mental health problems of prison and jail inmates. 2006 Sep. U.S. Justice Department, Bureau of Justice Statistics Special Report.

10. Koola MM et al. J Geriatr Care Res. 2018;5(2):57-67.

11. Buckley PF and Nasrallah HA. Curr Psychiatr. 2016;15:23-4.

12. American Medical Association Database. Open Residency and Fellowship Positions.

13. Sinsky C et al. Ann Intern Med. 2016;165:753-60.

14. Koola MM. Curr Psychiatr. 2017 Mar;16(3):19-20,47,e1.

15. Koola MM and Sebastian J. HSOA J Psychiatry Depress Anxiety. 2016;(2):1-11.

16. Nasrallah HA and Chen AT. Ann Clin Psychiatry. 2017 Aug;29(3):195-202.

17. Nasrallah HA. Curr Psychiatr. 2013 Jul;7-8.

18. Chen AT and Nasrallah HA. Schizophr Res. 2019 Jun;208:1-7.

19. National Resident Matching Program, Results and Data: 2019 Main Residency Match. National Resident Matching Program, Washington, 2019.

20. Masters KJ. J Physician Assist Educ. 2015 Sep;26(3):136-43.

21. Koola MM et al. J Nerv Ment Dis. 2011;199(12):989-90.

22. “New Missouri licensing offers ‘Band-Aid’ for physician shortages.” Kansas City Business Journal. Updated 2017 May 16.

23. “After earning an MD, she’s headed back to school – to become a nurse.” STAT. 2016 Nov 8.

24. Keizer TB and Trangle MA. Acad Psychiatry. 2015 Dec;39(6):691-4.

25. Miller JG and Peterson DJ. Acad Psychiatry. 2015 Dec;39(6):685-6.

26. Smith MS. Curr Psychiatr. 2019 Sep;18(9):17-24.

27. Osofsky HJ et al. Acad Psychiatry. 2016 Oct;40(5):747-54.

28. Dreier-Wolfgramm A et al. Z Gerontol Geriatr. 2017 May;50(Suppl 2):68-77.

29. Huang H and Barkil-Oteo A. Psychosomatics. 2015 Nov-Dec;56(6):658-61.

30. Raney L et al. Fam Syst Health. 2014 Jun;32(2):147-8.

31. Koola MM. Curr Psychiatr. 2016 Dec;15(12):33-4.

32. Mental Health America. State of Mental Health in America 2020.

33. Curtin SC et al. NCHS Data Brief. 2016 Apr;(241):1-8.

34. Kelly DL et al. Ann Intern Med. 2020;172(2):167-8.

35. Miller JP and Nasrallah HA. Curr Psychiatr. 2015;14(12):45-6.

Dr. Koola is an associate professor in the department of psychiatry and behavioral health at Stony Brook (N.Y.) University. His main area of interest is novel therapeutic discovery in the treatment of schizophrenia. He has a particular interest in improving the health care delivery system for people with psychiatric illness. Dr. Koola declared no conflicts of interest. He can be reached at maju.koola@stonybrook.edu.

*This commentary was updated 2/2/2020.


Celecoxib oral solution treats migraine effectively in randomized trial

Article Type
Changed
Thu, 12/15/2022 - 15:45

An oral solution of celecoxib is more effective than placebo for the acute treatment of migraine, according to trial results published in the January issue of Headache.


Two hours after treatment, a significantly greater proportion of patients who received the liquid solution, known as DFN-15, had freedom from pain and freedom from their most bothersome accompanying symptom – nausea, photophobia, or phonophobia – compared with patients who received placebo. The pain freedom rates were 35.6% with celecoxib oral solution and 21.7% with placebo. The rates of freedom from the most bothersome symptom were 57.8% with celecoxib oral solution and 44.8% with placebo.

About 9% of patients who received celecoxib oral solution had treatment-emergent adverse events related to the study drug, the most common of which were dysgeusia (4.2%) and nausea (3.2%). In comparison, about 6% of patients who received placebo had treatment-emergent adverse events. There were no serious treatment-emergent adverse events.

“DFN‐15 has the potential to become a reliable and convenient acute therapeutic option for patients with migraine,” said lead author Richard B. Lipton, MD, and colleagues. Dr. Lipton is affiliated with the Albert Einstein College of Medicine in New York.
 

Assessing celecoxib in migraineurs

Evidence-based guidelines recommend nonsteroidal anti-inflammatory drugs (NSAIDs), including aspirin, diclofenac, ibuprofen, and naproxen, as effective acute migraine treatments, but these medications may increase the risk of adverse gastrointestinal events, the authors said. Celecoxib, a selective cyclooxygenase (COX)-2 inhibitor, is indicated for the treatment of acute pain in patients with ankylosing spondylitis, osteoarthritis, primary dysmenorrhea, and rheumatoid arthritis. Although it produces analgesia similar to other NSAIDs, among patients with osteoarthritis and rheumatoid arthritis, celecoxib is associated with significantly lower risk of gastrointestinal events, compared with naproxen and ibuprofen, and significantly lower risk of renal events, compared with ibuprofen.

Researchers have studied an oral capsule form of celecoxib (Celebrex, Pfizer) as an acute treatment for migraine in an open-label study that compared celecoxib with naproxen sodium. “While preliminary results suggest comparable efficacy but better tolerability than widely used and guideline-recommended NSAIDs, celecoxib is not currently approved for migraine,” the authors said.

Compared with the oral capsule formulation, the oral liquid solution DFN-15 has a faster median time to peak concentration under fasting conditions (within 1 hour vs. 2.5 hours), which “could translate into more rapid onset of pain relief,” the authors said. In addition, DFN-15 may have greater bioavailability, which could lower dose requirements and improve safety and tolerability. To compare the efficacy, tolerability, and safety of 120-mg DFN-15 with placebo for the acute treatment of migraine, researchers conducted a randomized, double-blind, placebo-controlled study.
 

Participants used single-dose bottles

Researchers randomized 622 patients 1:1 to DFN-15 or placebo, and 567 treated a migraine during the trial. Patients had a mean age of 40 years, and 87% were female. Patients had episodic migraine with or without aura, no signs of medication overuse, and two to eight migraine attacks per month. For the trial, patients treated a single migraine attack of moderate to severe intensity within 1 hour of onset. “Each subject was given a single‐dose bottle of DFN‐15 120 mg or matching placebo containing 4.8 mL liquid,” Dr. Lipton and colleagues said. “They were instructed to drink the entire contents of the bottle to ensure complete consumption of study medication.”

Freedom from pain and freedom from the most bothersome symptom at 2 hours were the coprimary endpoints. “DFN‐15 was also significantly superior to placebo on multiple secondary 2‐hour endpoints, including freedom from photophobia, pain relief, change in functional disability from baseline, overall and 24‐hour satisfaction with treatment, and use of rescue medication,” they reported.

“A new COX‐2 inhibitor that is effective and rapidly absorbed could provide an important new option for a wide range of patients,” the authors said. “Though cross‐study comparisons are problematic, the current results for DFN‐15 indicate that its efficacy is similar to that of NSAIDs and small‐molecule calcitonin gene‐related peptide receptor antagonists (gepants), based on placebo‐subtracted rates [of] pain freedom in acute treatment trials (14%‐21%). DFN‐15 may also be useful among triptan users, who are at elevated risk of medication‐overuse headache and for whom TEAEs within 24 hours postdose are common. ... The form and delivery system of DFN‐15 – a ready‐to‐use solution in a 4.8‐mL single‐use bottle – may support patient adherence.”
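For context, the placebo-subtracted pain-freedom rate in this trial can be read off the figures reported above; it falls at the low end of the 14%-21% range the authors cite:

\[
35.6\% - 21.7\% = 13.9\% \approx 14\%
\]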

The trial had robust placebo response rates, which may have been influenced by “the novelty of a ready‐made oral solution, which has not been previously tested for the acute treatment of migraine,” the authors noted. In addition, the trial does not address the treatment of mild pain or treatment across multiple attacks.

The trial was supported by Dr. Reddy’s Laboratories, manufacturer of DFN-15. Two authors are employed by and own stock in Dr. Reddy’s. Dr. Lipton and a coauthor disclosed research support from and consulting for Dr. Reddy’s.
 

SOURCE: Lipton RB et al. Headache. 2020;60(1):58-70. doi: 10.1111/head.13663.



Esophageal length ratio predicts hiatal hernia recurrence

Article Type
Changed
Thu, 01/30/2020 - 11:37

A new ratio based on manometric esophageal length in relation to patient height could offer an objective means of preoperatively identifying shortened esophagus, which could improve surgical planning and outcomes with hiatal hernia repair, according to investigators.

In a retrospective analysis, patients with a lower manometric esophageal length-to-height (MELH) ratio had a higher rate of hiatal hernia recurrence, reported lead author Pooja Lal, MD, of the Cleveland Clinic, and colleagues.

A short esophagus increases tension at the gastroesophageal junction, which may necessitate a lengthening procedure in addition to hiatal hernia repair, the investigators wrote in the Journal of Clinical Gastroenterology. As lengthening may require additional expertise, preoperative knowledge of a short esophagus is beneficial; however, until this point, short esophagus could only be identified intraoperatively. Since previous attempts to define short esophagus were confounded by patient height, the investigators devised the MELH ratio to account for this variable.

The investigators evaluated data from 245 patients who underwent hiatal hernia repair by Nissen fundoplication, of whom 157 also underwent esophageal lengthening with a Collis gastroplasty. The decision to perform a Collis gastroplasty was made intraoperatively if a patient did not have at least 2-3 cm of intra-abdominal esophageal length with minimal tension.

For all patients, the MELH ratio was determined by dividing manometric esophageal length by patient height (both in centimeters).
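For readers who want to apply the ratio, here is a minimal sketch of the calculation; the function name, the 175-cm example height, and the use of a 0.12 cutoff (the threshold the investigators suggest later in the article) are illustrative assumptions rather than part of the study protocol:

```python
def melh_ratio(esophageal_length_cm: float, height_cm: float) -> float:
    """Manometric esophageal length-to-height (MELH) ratio; both measurements in centimeters."""
    return esophageal_length_cm / height_cm


# Illustrative example: manometric esophageal length of 20.2 cm in a patient 175 cm tall (assumed height)
ratio = melh_ratio(20.2, 175.0)            # ~0.115
short_esophagus_suspected = ratio < 0.12   # cutoff suggested by Lal et al. for considering a lengthening procedure
print(f"MELH ratio = {ratio:.3f}; shortened esophagus suspected: {short_esophagus_suspected}")
```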

On average, patients who needed a Collis gastroplasty had a shorter esophagus (20.2 vs. 22.4 cm; P less than .001) and a lower MELH ratio (0.12 vs. 0.13; P less than .001).

Multivariable hazard regression showed that, regardless of surgical approach, for every 0.01-unit increase in MELH ratio, the risk of hernia recurrence decreased by 33% (hazard ratio, 0.67; P less than .001). In contrast, regardless of MELH ratio, repair without Collis gastroplasty was associated with a roughly 500% increased risk of recurrence (HR, 6.1; P less than .001). Over 5 years, the benefit of Collis gastroplasty translated to significantly lower rates of both hernia recurrence (18% vs. 55%; P less than .001) and reoperation for recurrence (0% vs. 10%; P less than .001).
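To see how those percentages follow from the reported hazard ratios (a simple conversion that treats the hazard ratio as a relative change in risk):

\[
(1 - 0.67) \times 100\% = 33\%\ \text{lower risk per 0.01-unit increase in MELH ratio}
\]
\[
(6.1 - 1) \times 100\% = 510\% \approx 500\%\ \text{higher risk without Collis gastroplasty}
\]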

“We suggest that surgeons and gastroenterologists calculate the MELH ratio before repair of a hiatal hernia, and be cognizant of patients with a shortened esophagus,” the investigators concluded. “An esophageal lengthening procedure such as a Collis gastroplasty may reduce the risk of hernia recurrence and reoperation for recurrence, especially for patients with a MELH ratio less than 0.12.”

The investigators reported no conflicts of interest.

SOURCE: Lal P et al. J Clin Gastroenterol. 2020 Jan 20. doi: 10.1097/MCG.0000000000001316.



Genetic factor linked to impaired memory after heading many soccer balls

Article Type
Changed
Thu, 12/15/2022 - 15:45

Adult soccer players who frequently head the ball may have a heightened risk of memory impairment if they are carriers of the APOE e4 allele, according to authors of a recent longitudinal study. Worse verbal memory was linked to high levels of ball heading among those players who were APOE e4–positive, compared with those who were APOE e4–negative, according to the authors, led by Liane E. Hunter, PhD, of the Gruss Magnetic Resonance Imaging Center at Albert Einstein College of Medicine, New York.

These findings, while preliminary, do raise the possibility that “safe levels for soccer heading” could be proposed to protect players from harm or that APOE e4–positive players might be advised to limit their exposure to head impacts, Dr. Hunter and coauthors wrote in a report in JAMA Neurology.

However, the findings should “in no way” be used to justify APOE testing to make clinical decisions regarding the safety of playing soccer, said Sarah J. Banks, PhD, of the University of California, San Diego, and Jesse Mez, MD, of Boston University in a related editorial (doi: 10.1001/jamaneurol.2019.4451). “Like most good science, the study provides an important, but incremental, step to understanding gene-environment interactions in sports,” Dr. Banks and Dr. Mez wrote in their editorial.

While there are some studies tying APOE e4 to poorer neuropsychiatric performance in boxers and U.S. football players, there are no such studies looking at the role of APOE e4 in soccer players exposed to repetitive “subconcussive” ball heading, according to Dr. Hunter and coresearchers. Accordingly, they sought to analyze APOE e4 and neuropsychological performance in relation to ball heading in 352 adult amateur soccer players enrolled in the Einstein Soccer Study between November 2013 and January 2018. About three-quarters of the players were male, and the median age at enrollment was 23 years.

The players completed a computer-based questionnaire designed to estimate their exposure to soccer heading at enrollment and at follow-up visits every 3-6 months. To test verbal memory at each visit, players were asked to memorize a 12-item grocery list, and then asked to recall the items 20 minutes later.

High levels of heading were linked to poorer performance on the verbal memory task, a finding similar to one previously reported, the investigators said.

There was no overall association of APOE e4 and heading with performance on the shopping list task, according to investigators. By contrast, there was a 4.1-fold greater deficit in verbal memory for APOE e4–positive players with high heading exposure, compared with those with low exposure, the investigators reported. Likewise, there was an 8.5-fold greater deficit in verbal memory for APOE e4–positive players with high versus moderate heading exposure.

That said, the absolute difference in performance was “subtle” and difficult to interpret in the context of a cross-sectional study, Dr. Banks and Dr. Mez said in their editorial.

In absolute terms, the mean decrease in scores on the 13-point shopping list task between the high and low heading exposure was 1.13 points greater for the APOE e4–positive group, compared with the APOE e4–negative group, and the decrease between the high and moderate heading exposure groups was 0.98 points greater, according to the report.

“The effect size of our interaction is relatively small,” Dr. Hunter and colleagues acknowledged in their report. “However, similar to the widely cited model of disease evolution in Alzheimer disease, our findings may be evidence of early subclinical effects, which could accumulate in APOE e4–positive players over a protracted time frame and ultimately be associated with overt clinical dysfunction.”

Several study authors said they had received grants from the National Institutes of Health and affiliated institutes, the Migraine Research Foundation, and the National Headache Foundation. They reported disclosures related to Amgen, Avanir, Biohaven Holdings, Biovision, Boston Scientific, Eli Lilly, eNeura Therapeutics, GlaxoSmithKline, Merck, and Pfizer, among others.

SOURCE: Hunter LE et al. JAMA Neurol. 2020 Jan 27. doi: 10.1001/jamaneurol.2019.4828.


ID Blog: Wuhan coronavirus – just a stop on the zoonotic highway

Article Type
Changed
Tue, 03/17/2020 - 10:09

Emerging viruses that spread to humans from an animal host are commonplace and represent some of the deadliest diseases known. Given the details of the Wuhan coronavirus (2019-nCoV) outbreak, including the genetic profile of the disease agent, the hypothesis of a snake origin was the first to be raised in the peer-reviewed literature.

Wuhan seafood market, closed after the new coronavirus was detected there for the first time in 2020. (SISTEMA 12/Wikimedia Commons/CC BY-SA 4.0)

It is a highly controversial origin story, however, given that mammals have been the sources of all other such zoonotic coronaviruses, as well as a host of other zoonotic diseases.

An animal source for emerging infections such as the 2019-nCoV is the default hypothesis, because “around 60% of all infectious diseases in humans are zoonotic, as are 75% of all emerging infectious diseases,” according to a United Nations report. The report goes on to say that, “on average, one new infectious disease emerges in humans every 4 months.”

To appreciate the emergence and nature of 2019-nCoV, it is important to examine the history of zoonotic outbreaks of other such diseases, especially with regard to the “mixing-vessel” phenomenon, which has been noted in closely related coronaviruses, including SARS and MERS, as well as the widely disparate HIV, Ebola, and influenza viruses.
 

Mutants in the mixing vessel

The mixing-vessel phenomenon is conceptually easy but molecularly complex. A single animal is coinfected with two related viruses; the virus genomes recombine (virus “sex”) in that animal to form a new variant of virus. Such new mutant viruses can be more or less infective, more or less deadly, and more or less able to jump the species or even genus barrier. An emerging viral zoonosis can occur when a human being is exposed to one of these new viruses (either from the origin species or from an intermediate species) that is also capable of infecting human cells. Such exposure can occur from close proximity to animal waste or body fluids, as in the farm environment, or from wildlife pets or the capturing and slaughtering of wildlife for food, as is proposed in the case of the Wuhan seafood market scenario. In fact, the scientists who postulated a snake intermediary as the potential mixing vessel also stated that 2019‐nCoV appears to be a recombinant virus between a bat coronavirus and an origin‐unknown coronavirus.

Coronaviruses in particular have a history of moving from animal to human hosts (and even back again), and their detailed genetic pattern and taxonomy can reveal the animal origin of these diseases.
 

Going batty

Bats, in particular, have been shown to be a reservoir species for both alphacoronaviruses and betacoronaviruses. Given their ecology and behavior, they have been found to play a key role in transmitting coronaviruses between species. A highly pertinent example of this is the SARS coronavirus, which was shown to have likely originated in Chinese horseshoe bats. The SARS virus, which is genetically closely related to the new Wuhan coronavirus, first infected humans in the Guangdong province of southern China in 2002.

 

 

Scientists speculate that the virus was then either transmitted directly to humans from bats, or passed through an intermediate host species, with SARS-like viruses isolated from Himalayan palm civets found in a live-animal market in Guangdong. The virus infection was also detected in other animals (including a raccoon dog, Nyctereutes procyonoides) and in humans working at the market.

The MERS coronavirus is a betacoronavirus that was first reported in Saudi Arabia in 2012. It turned out to be far more deadly than either SARS or the Wuhan virus (at least based on current estimates of the new coronavirus’s behavior). The MERS genotype was found to be closely related to MERS-like viruses in bats in Saudi Arabia, Africa, Europe, and Asia. Studies done on the cell receptor for MERS showed an apparently conserved viral receptor in both bats and humans. And an identical strain of MERS was found in bats in a cave near the workplace of the first known human patient.

Baby Egyptian fruit bat (Rousettus aegyptiacus), a known carrier species of the deadly Marburg virus. (Wikimedia Commons/Mickey Samuni-Blank)

However, in many of the other locations of the outbreak in the Middle East, there appeared to be limited contact between bats and humans, so scientists looked for another vector species, perhaps one acting as an intermediate. A high seroprevalence of MERS-CoV or a closely related virus was found in camels across the Arabian Peninsula and parts of eastern and northern Africa, while tests for MERS antibodies were negative in other likely livestock or pet species, including chickens, cows, goats, horses, and sheep.

In addition, the MERS-related CoV carried by camels was genetically highly similar to that detected in humans, as demonstrated in one particular outbreak on a farm in Qatar where the genetic sequences of MERS-CoV in the nasal swabs from 3 of 14 seropositive camels were similar to those of 2 human cases on the same farm. Similar genomic results were found in MERS-CoV from nasal swabs from camels in Saudi Arabia.
 

Other mixing-vessel zoonoses

HIV, the viral cause of AIDS, provides an almost-textbook origin story of the rise of a zoonotic supervillain. The virus was genetically traced to a chimpanzee-to-human origin, but the story turned out to be more complicated than that. The virus first emerged in the 1920s in Africa, in what is now the Democratic Republic of the Congo, well before its rise to a global pandemic in the 1980s.

Researchers believe the chimpanzee virus is a hybrid of the simian immunodeficiency viruses (SIVs) naturally infecting two different monkey species: the red-capped mangabey (Cercocebus torquatus) and the greater spot-nosed monkey (Cercopithecus nictitans). Chimpanzees kill and eat monkeys, which is likely how they acquired the monkey viruses. The viruses hybridized in a chimpanzee; the hybrid virus then spread through the chimpanzee population and was later transmitted to humans who captured and slaughtered chimps for meat (becoming exposed to their blood). This was the most likely origin of HIV-1.

HIV-1 also illustrates one of the major risks of zoonotic infections: such viruses can continue to mutate in their human hosts, raising the risk of greater virulence and interfering with the development of a universally effective vaccine. Since its transmission to humans, for example, many subtypes of HIV-1 have developed, with genetic differences of up to 20% even within the same subtype.

Colorized transmission electron micrograph (TEM) revealing some of the ultrastructural morphology displayed by an Ebola virus virion. (CDC/Frederick A. Murphy)

Ebolavirus, first detected in 1976, is another case of bats being the potential culprit. Genetic analysis has shown that African fruit bats are likely involved in the spread of the virus and may be its reservoir host. Further evidence of this was found in the most recent human-infecting Bombali variant of the virus, which was identified in samples from bats collected from Sierra Leone.

Pigs, too, can become infected with Zaire ebolavirus, leading to the fear that they could serve as a mixing vessel for it and other filoviruses. Pigs have their own forms of Ebola-like viruses, which are not currently transmissible to humans but could provide a potential mixing-vessel reservoir.

Emergent influenzas

The Western world has been most affected by these highly mutable, multispecies zoonotic viruses. The 1957 and 1968 flu pandemics contained a mixture of gene segments from human and avian influenza viruses. “What is clear from genetic analysis of the viruses that caused these past pandemics is that reassortment (gene swapping) occurred to produce novel influenza viruses that caused the pandemics. In both of these cases, the new viruses that emerged showed major differences from the parent viruses,” according to the Centers for Disease Control and Prevention.

Influenza also shows, however, that not all zoonoses are the result of a mixing-vessel phenomenon: Evidence indicates that the catastrophic 1918 pandemic likely resulted from a bird influenza virus directly infecting humans and pigs at about the same time, without reassortment, according to the CDC.
 

Building a protective infrastructure

The first 2 decades of the 21st century saw a huge increase in efforts to develop an infrastructure to monitor and potentially prevent the spread of new zoonoses. As part of a global effort led by the United Nations, the U.S. Agency for International Development (USAID) developed the PREDICT program in 2009 “to strengthen global capacity for detection and discovery of zoonotic viruses with pandemic potential. Those include coronaviruses, the family to which SARS and MERS belong; paramyxoviruses, like Nipah virus; influenza viruses; and filoviruses, like the ebolavirus.”

PREDICT funding to the EcoHealth Alliance led to discovery of the likely bat origins of the Zaire ebolavirus during the 2013-2016 outbreak. And throughout the existence of PREDICT, more than 145,000 animals and people were surveyed in areas of likely zoonotic outbreaks, leading to the detection of more than “1,100 unique viruses, including zoonotic diseases of public health concern such as Bombali ebolavirus, Zaire ebolavirus, Marburg virus, and MERS- and SARS-like coronaviruses,” according to PREDICT partner, the University of California, Davis.

PREDICT-2 was launched in 2014 with the continuing goals of “identifying and better characterizing pathogens of known epidemic and unknown pandemic potential; recognizing animal reservoirs and amplification hosts of human-infectious viruses; and efficiently targeting intervention action at human behaviors which amplify disease transmission at critical animal-animal and animal-human interfaces in hotspots of viral evolution, spillover, amplification, and spread.”

However, in October 2019, the Trump administration cut all funding to the PREDICT program, leading to its shutdown. In a New York Times interview, Peter Daszak, president of the EcoHealth Alliance, stated: “PREDICT was an approach to heading off pandemics, instead of sitting there waiting for them to emerge and then mobilizing.”

Ultimately, in addition to its human cost, the current Wuhan coronavirus outbreak can be looked at as an object lesson – a test of the pandemic surveillance and control systems currently in place, and a practice run for the next and potentially deadlier zoonotic outbreaks to come. Perhaps it is also a reminder that cutting resources to detect zoonoses at their source in their animal hosts – before they enter the human chain – is not the most prudent of ideas.

mlesney@mdedge.com

Mark Lesney is the managing editor of MDedge.com/IDPractioner. He has a PhD in plant virology and a PhD in the history of science, with a focus on the history of biotechnology and medicine. He has served as an adjunct assistant professor in the department of biochemistry and molecular & cellular biology at Georgetown University, Washington.


Serum keratin 18 promising as AAH biomarker

Article Type
Changed
Thu, 01/30/2020 - 11:36

Outperforms MELD, ABIC

Serum keratin 18, an epithelial protein released from dying hepatocytes, identifies patients with severe acute alcoholic hepatitis (AAH) at high risk for death, according to an investigation of 173 subjects.

Standard biomarker scores – the Model for End-stage Liver Disease (MELD); the age, serum bilirubin, International Normalized Ratio, and serum creatinine (ABIC) score; and others – predict prognosis and severity of alcoholic liver disease, but they don’t reflect “the magnitude of cell death nor the form of cell death (apoptosis/necrosis), which may be important in distinguishing various forms of liver injury” and guiding therapy, explained investigators led by Vatsalya Vatsalya, MD, of the division of gastroenterology, hepatology, and nutrition at the University of Louisville (Ky.).

It’s important, for instance, to identify people with alcoholic cirrhosis but not active hepatitis, as they “would likely not benefit from anti-inflammatory agents such as steroids or [interleukin]-1 receptor antagonists, but would incur their side effects.” For those and other reasons, “new biomarkers are needed for diagnosing AAH, assessing the degree of hepatocyte death, and predicting mortality,” they said (Clin Gastroenterol Hepatol. 2019 Dec 4. doi: 10.1016/j.cgh.2019.11.050).

Keratin 18 – both the cleaved form (K18M30) and the uncleaved protein (K18M65) – has been suggested before as a marker for AAH, so the investigators took a closer look.

They analyzed serum from 57 people with severe AAH (MELD score above 20), 27 people with moderate AAH (MELD score 12-19), 34 with nonalcoholic steatohepatitis, 17 healthy controls, and 38 people with alcohol use disorder and either mild or no liver injury.

Overall, 51.9% of moderate AAH cases and 38.9% of severe cases had K18M65 levels between 641 and 2,000 IU/L; 25.9% of moderate and 61.1% of severe cases had K18M65 levels greater than 2,000 IU/L. All severe cases had levels above 641 IU/L. Serum levels of K18 also identified patients who died within 90 days with greater accuracy than did MELD, ABIC, and other scores, the investigators said.
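
For illustration, the K18M65 bands reported above can be expressed as a simple lookup. The cutoffs (641 and 2,000 IU/L) are the ones cited in the report, but the helper function itself is hypothetical and not part of the study:

def k18m65_band(k18m65_iu_per_l):
    # Cutoffs from the report: 641 and 2,000 IU/L
    if k18m65_iu_per_l > 2000:
        return "greater than 2,000 IU/L"
    if k18m65_iu_per_l > 641:
        return "641-2,000 IU/L"
    return "641 IU/L or lower"

print(k18m65_band(1500))  # prints: 641-2,000 IU/L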

The K18M65:ALT [alanine aminotransferase] ratio distinguished AAH from nonalcoholic steatohepatitis with a sensitivity of 0.971 and specificity of 0.829. Findings were similar for the K18M30:ALT ratio.

Levels of K18M65 and K18M30 increased significantly as liver disease worsened, as did the degree of necrosis as indicated by the K18M65:K18M30 ratio. Meanwhile, although K18 levels correlated with MELD scores, levels of ALT, aspartate aminotransferase (AST), and the ratio of AST:ALT did not.

“There is a stronger association between serum level of keratin 18 and amount of hepatocyte death and liver disease severity than for other biomarkers,” the team concluded.

Patients were in their mid-40s, on average, and there were more men than women.

The National Institutes of Health supported the work, and the investigators had no disclosures.


Performing gender-reaffirming surgery: Guidelines for the general ob.gyn.

Article Type
Changed
Wed, 01/29/2020 - 16:11

According to the DSM-5, gender dysphoria in adolescents and adults “involves a difference between one’s experienced/expressed gender and assigned gender, and significant distress or problems functioning. It lasts at least 6 months,” and several other criteria must be met.1 Many patients with gender dysphoria also identify as transgender. A “transition” or “transitioning” is a process by which individuals come to inhabit their gender identity.2 A gender transition may take many forms, and only some people will choose to include medical assistance in their transition process. Although this article will not address these concerns, it should be noted that many people in the transgender and gender nonconforming community would object to the concepts of gender dysphoria and gender transition because they rely on a binary model of gender that may exclude individuals who see themselves as something other than “man or woman.”

There are both medical and surgical options for medical assistance in a gender transition. This article will focus on the surgical care of patients assigned female at birth who are seeking masculinizing surgical therapy. Many writers will discuss “gender-affirming” surgery, but we will use the term “gender-reaffirming” surgery because transgender patients have already affirmed their own genders and do not require surgery to inhabit this affirmation. Surgical options might include bilateral mastectomy, hysterectomy, bilateral salpingo-oophorectomy (BSO), metoidioplasty (surgical formation of a neophallus with existing genital tissue), or phalloplasty. There currently is no single surgical subspecialty that encompasses training in all forms of gender-reaffirming surgical therapies. In some areas of the country, centers of excellence have given rise to multidisciplinary teams that combine the skill sets of surgical subspecialists to provide a streamlined approach to gender-reaffirming surgery. Because of the scarcity of these integrated centers, most patients seeking gender-reaffirming surgeries will need to find individual subspecialists whose surgical training focuses on one area of the body. For example, patients seeking all possible surgical options may need a breast surgeon to perform their mastectomy, an ob.gyn. to perform their hysterectomy and BSO, a urologist to perform their metoidioplasty, and a plastic surgeon to perform their phalloplasty. In these scenarios, a general ob.gyn. may be consulted to perform a gender-reaffirming hysterectomy with BSO.

There are many reasons why transgender men might desire hysterectomy/BSO as part of their transition. Removal of the uterus and cervix eliminates concerns surrounding menstruation, pregnancy, and cervical cancer screening, all of which may add to their experience of gender dysphoria. Furthermore, removal of the ovaries may simplify long-term hormonal therapy with testosterone by eliminating the need for estrogen suppression. Lastly, a hysterectomy/BSO is a lower-risk and more cost-effective masculinizing surgery, compared with metoidioplasty or phalloplasty.

While the technical aspect of performing a hysterectomy/BSO certainly is within the scope of training for a general ob.gyn., there are several nuances of which providers should be aware when planning gender-reaffirming surgery for a transgender man. During the preoperative planning phase, it is of utmost importance to provide an environment of safety so that the focus of the preop visit is not clouded by communication mishaps between office staff and the patient. These barriers can be avoided by implementing office intake forms that give patients the opportunity to inform the health care team of their chosen name and personal pronouns upon registration for the visit.

A pelvic exam is commonly performed by ob.gyns. to determine the surgical approach for a hysterectomy/BSO. When approaching transgender male patients for preoperative pelvic exams, it is important to be mindful that this type of exam may trigger gender dysphoria. While pelvic exams should be handled in a sensitive fashion regardless of a patient’s gender identity, a patient who is a transgender man may benefit from some added steps in discussing the pelvic exam. One approach is to acknowledge that these exams and discussions may be especially triggering of gender dysphoria, and to ask whether the patient would prefer certain words to be used or not used in reference to their anatomy. As with any patient, the provider should explain the purpose of the examination and offer the patient some control over the exam, such as by assisting with insertion of the speculum or by designating a “safe word” that signals the provider to stop or pause the exam. In some cases, patients may not be able to tolerate the pelvic exam while awake because of the degree of gender dysphoria the exam would induce. Providers might consider noninvasive imaging studies to help with surgical planning if they need more information before scheduling the operation, or they may offer a staged procedure with an exam under anesthesia prior to the definitive surgery.

In conclusion, performing a gender-reaffirming hysterectomy/BSO requires thoughtful preparation to ensure a safe surgical environment for this vulnerable population. Care should be taken to plan the operation with a culturally sensitive approach.

Dr. Joyner is an assistant professor at Emory University and the director of gynecologic services in the Gender Center at Grady Memorial Hospital, both in Atlanta. Dr. Joyner identifies as a cisgender female and uses she/hers/her as her personal pronouns. Dr. Joey Bahng is a PGY-1 resident physician in Emory University’s gynecology & obstetrics residency program. Dr. Bahng identifies as nonbinary and uses they/them/their as their personal pronouns. Dr. Joyner and Dr. Bahng reported no relevant financial disclosures.

References

1. American Psychiatric Association. What is Gender Dysphoria? https://www.psychiatry.org/patients-families/gender-dysphoria/what-is-gender-dysphoria

2. UCSF Transgender Care. Transition Roadmap. https://transcare.ucsf.edu/transition-roadmap

