Cannabis Use Linked to Brain Thinning in Adolescents
Cannabis use may be linked to thinning of the cerebral cortex in adolescents, research in mice and humans suggested.
The multilevel study demonstrated that tetrahydrocannabinol (THC), an active substance in cannabis, causes shrinkage of dendritic arborization — the neurons’ network of antennae that play a critical role in communication between brain cells.
The connection between dendritic arborization and cortical thickness was hinted at in an earlier study by Tomáš Paus, MD, PhD, professor of psychiatry and addictology at the University of Montreal, Quebec, Canada, and colleagues, who found that cannabis use in early adolescence was associated with lower cortical thickness in boys with a high genetic risk for schizophrenia.
“We speculated at that time that the differences in cortical thickness might be related to differences in dendritic arborization, and our current study confirmed it,” Paus said.
That confirmation came in the mouse part of the study, when coauthor Graciela Piñeyro, MD, PhD, also of the University of Montreal, counted the dendritic branches of mice exposed to THC and compared the total with the number of dendritic branches in unexposed mice. “What surprised me was finding that THC in the mice was targeting the same type of cells and structures that Dr. Paus had predicted would be affected from the human studies,” she said. “Structurally, they were mostly the neurons that contribute to synapses in the cortex, and their branching was reduced.”
Paus explained that in humans, a decrease in input from the affected dendrites “makes it harder for the brain to learn new things, interact with people, cope with new situations, et cetera. In other words, it makes the brain more vulnerable to everything that can happen in a young person’s life.”
The study was published online on October 9 in the Journal of Neuroscience.
Of Mice, Men, and Cannabis
Although associations between cannabis use by teenagers and variations in brain maturation have been well studied, the cellular and molecular underpinnings of these associations were unclear, according to the authors.
To investigate further, they conducted this three-step study. First, they exposed adolescent male mice to THC or a synthetic cannabinoid (WIN 55,212-2) and assessed differentially expressed genes, spine numbers, and the extent of dendritic complexity in the frontal cortex of each mouse.
Next, using MRI, they examined differences in cortical thickness in 34 brain regions in 140 male adolescents who experimented with cannabis before age 16 years and 327 who did not.
Then, they again conducted experiments in mice and found that 13 THC-related genes correlated with variations in cortical thickness. Virtual histology revealed that these 13 genes were coexpressed with cell markers of astrocytes, microglia, and a type of pyramidal cell enriched in genes that regulate dendritic expression.
Similarly, the WIN-related genes correlated with differences in cortical thickness and showed coexpression patterns with the same three cell types.
Furthermore, the affected genes were also found in humans, particularly in the thinner cortical regions of the adolescents who experimented with cannabis.
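The correlation step described above, relating each THC-related gene's regional expression profile to regional differences in cortical thickness across the 34 brain regions, can be sketched as follows. All values here are synthetic stand-ins for illustration only; they are not the study's data, and the analysis is a simplified caricature of virtual histology.

```python
import numpy as np

rng = np.random.default_rng(0)

n_regions = 34   # cortical regions examined in the MRI arm
n_genes = 13     # THC-related genes identified in the mouse arm

# Synthetic stand-ins: per-region group difference in cortical thickness
# (cannabis users minus non-users) and each gene's regional expression.
thickness_diff = rng.normal(0.0, 0.05, n_regions)
expression = rng.normal(0.0, 1.0, (n_genes, n_regions))

# Correlate each gene's expression profile with the thickness-difference
# map across regions, in the spirit of virtual-histology analyses.
for g in range(n_genes):
    r = np.corrcoef(expression[g], thickness_diff)[0, 1]
    print(f"gene {g:2d}: r = {r:+.2f}")
```

In the study itself, genes whose expression maps tracked the thickness differences were then checked for coexpression with markers of astrocytes, microglia, and pyramidal cells.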
By acting on microglia, THC seems to promote the removal of synapses and, eventually, the reduction of the dendritic tree in mice, Piñeyro explained. That’s important not only because a similar mechanism may be at work in humans but also because “we now might have a model to test different types of cannabis products to see which ones are producing the greatest effect on neurons and therefore greater removal of synapses through the microglia. This could be a way of testing drugs that are out in the street to see which would be the most or least dangerous to the synapses in the brain.”
‘Significant Implications’
Commenting on the study, Yasmin Hurd, PhD, Ward-Coleman chair of translational neuroscience at the Icahn School of Medicine at Mount Sinai and director of the Addiction Institute of Mount Sinai in New York City, said, “These findings are in line with previous results, so they are feasible. This study adds more depth by showing that cortical genes that were differentially altered by adolescent THC correlated with cannabis-related changes in cortical thickness based on human neuroimaging data.” Hurd did not participate in the research.
“The results emphasize that consumption of potent cannabis products during adolescence can impact cortical function, which has significant implications for decision-making and risky behavior as well. It also can increase vulnerability to psychiatric disorders such as schizophrenia.”
Although a mouse model is “not truly the same as the human condition, the fact that the animal model also showed evidence of the morphological changes indicative of reduced cortical thickness, [like] the humans, is strong,” she said.
Additional research could include women and assess potential sex differences, she added.
Ronald Ellis, MD, PhD, an investigator in the Center for Medicinal Cannabis Research at the University of California, San Diego School of Medicine, said, “The findings are plausible and extend prior work showing evidence of increased risk for psychotic disorders later in life in adolescents who use cannabis.” Ellis did not participate in the research.
“Future studies should explore how these findings might vary across different demographic groups, which could provide a more inclusive understanding of how cannabis impacts the brain,” he said. “Additionally, longitudinal studies to track changes in the brain over time could help to establish causal relationships more robustly.
“The take-home message to clinicians at this point is to discuss cannabis use history carefully and confidentially with adolescent patients to better provide advice on its potential risks,” he concluded.
Paus added that he would tell patients, “If you’re going to use cannabis, don’t start early. If you have to, then do so in moderation. And if you have family history of mental illness, be very careful.”
No funding for the study was reported. Paus, Piñeyro, Hurd, and Ellis declared having no relevant financial relationships.
A version of this article appeared on Medscape.com.
FROM THE JOURNAL OF NEUROSCIENCE
Cosmetic Dermatology Product Recalls Still Common, Analysis Finds
TOPLINE:
Between 2011 and 2023, the US Food and Drug Administration (FDA) reported recalls of 334 cosmetic dermatology products in the United States, affecting over 77 million units, predominantly due to bacterial contamination.
METHODOLOGY:
- Researchers conducted a cross-sectional analysis of the FDA Enforcement Report database for cosmetic dermatology products from 2011 to 2023.
- Cosmetic products are articles “intended for body cleaning or beauty enhancement,” as defined by the FDA.
- Recalls were categorized by product type, reason for the recall, microbial contaminant, inorganic contaminant, distribution, and risk classification.
TAKEAWAY:
- During the study period, 334 voluntary and manufacturer-initiated recalls of cosmetic products were reported, affecting 77,135,700 units.
- A total of 297 recalls (88.9%) were categorized as Class II, indicating that exposure could cause “medically reversible health consequences.” The median recall duration was 307 days.
- Hygiene and cleaning products accounted for most of the recalls (51.5%). Makeup gels, soaps, shampoos, tattoo ink, wipes, and lotions were the most recalled product categories. Nearly 51% of the products were distributed internationally.
- Microbial and inorganic contamination accounted for 76.8% and 10.2% of the recalls (the two most common reasons for the recall), respectively, with bacteria (80%) the most common contaminating pathogen (primarily Pseudomonas and Burkholderia species).
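A quick arithmetic check confirms that the figures above are internally consistent. The “other” share of recall reasons is inferred here simply as the remainder after the two most common causes; the breakdown of that remainder is an assumption, not something the digest reports.

```python
# Figures quoted in the TAKEAWAY bullets above.
total_recalls = 334
class_ii = 297          # recalls classified as Class II
microbial_share = 0.768  # microbial contamination
inorganic_share = 0.102  # inorganic contamination

# 297 / 334 should round to the quoted 88.9%.
class_ii_pct = round(class_ii / total_recalls * 100, 1)
print(f"Class II share: {class_ii_pct}%")

# Remaining recall reasons (labeling violations, etc. -- assumed grouping).
other_share = 1.0 - microbial_share - inorganic_share
print(f"Other reasons: {other_share:.1%}")
```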
IN PRACTICE:
With 77 million units recalled by the FDA over 12 years, cosmetic recalls have remained common, the authors concluded, adding that “dermatologists should be key voices in pharmacovigilance given scientific expertise and frontline experience managing products and associated concerns.” Dermatologists, they added, “should also be aware of FDA enforcement reports for recall updates given that average recall termination took approximately 1 year.”
SOURCE:
The study was led by Kaushik P. Venkatesh, MBA, MPH, Harvard Medical School, Boston, and was published online on October 29 in the Journal of the American Academy of Dermatology.
LIMITATIONS:
The study’s limitations include the potential underreporting of Class III recalls (products that are unlikely to cause any adverse health reaction but violate FDA labeling or manufacturing laws) and lack of complete information on contaminants.
DISCLOSURES:
No information on funding was provided in the study. No conflicts of interest were reported.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Novel Treatment Promising for Cutaneous Lupus in Phase 2 Trial
TOPLINE:
Iberdomide, an oral cereblon modulator, showed promise for the treatment of cutaneous lupus erythematosus (CLE) in a phase 2 trial, particularly in subacute and chronic cases.
METHODOLOGY:
- Researchers conducted a randomized phase 2 trial to evaluate the efficacy and safety of iberdomide in 288 patients with CLE (mean age, 45 years; 97% women). Iberdomide is a cereblon modulator that degrades Ikaros and Aiolos, two transcription factors involved in immune cell development and homeostasis that have been implicated in genetic predisposition to systemic lupus.
- CLE Disease Area and Severity Index Activity (CLASI-A) endpoints included mean percent change from baseline and ≥ 50% reduction from baseline (CLASI-50), which were evaluated in all patients with baseline CLASI-A scores ≥ 8 and by CLE subtypes (acute, subacute, and chronic).
- At baseline, 56% of patients had acute CLE, 29% had chronic CLE, and 16% had subacute CLE; 28% of patients had a baseline CLASI-A score ≥ 8.
- Patients were randomly assigned to receive oral iberdomide (0.45 mg, 0.30 mg, 0.15 mg, or placebo daily) for 24 weeks while continuing standard lupus medications. At week 24, patients on placebo were rerandomized to iberdomide 0.45 mg or 0.30 mg once a day, while those on iberdomide continued their assigned dose through week 52.
TAKEAWAY:
- Among patients with baseline CLASI-A ≥ 8, the mean change in CLASI-A score from baseline at week 24 was −66.7% for those on iberdomide 0.45 mg vs −54.2% for placebo (P = .295).
- At week 24, patients with subacute CLE showed a significantly greater mean percent change from baseline in CLASI-A with iberdomide 0.45 mg vs placebo (−90.5% vs −51.2%; P = .007), while no significant differences were observed with the 0.45-mg dose vs placebo in patients with chronic or acute CLE.
- Overall, CLASI-50 responses were not significantly different among those on 0.45 mg vs placebo (55.6% vs 44.6%). The proportions of patients achieving CLASI-50 at week 24 were significantly greater for iberdomide 0.45 mg vs placebo for those with subacute CLE (91.7% vs 52.9%; P = .035) and chronic CLE (62.1% vs 27.8%; P = .029), but not for those with baseline CLASI-A ≥ 8 (66.7% vs 50%).
- More than 80% of patients had treatment-emergent adverse events (TEAEs), which were mostly mild to moderate. Over 2 years, the most common were urinary tract infections, upper respiratory tract infections, neutropenia, and nasopharyngitis. TEAEs leading to iberdomide discontinuation in one or more patients were neutropenia (n = 7), rash (n = 7), increased hepatic enzymes (n = 4), and deep vein thrombosis (n = 3).
IN PRACTICE:
“Data from this phase 2 trial of iberdomide in patients with SLE suggest that a greater proportion of patients with subacute or chronic CLE who received the higher dose of 0.45 mg iberdomide achieved CLASI-50 vs placebo. For the overall population, CLASI-50 response was not significantly different between treatment groups at week 24, partly due to a high placebo response that may have been driven by patients with acute CLE,” the authors wrote.
SOURCE:
The study was led by Victoria P. Werth, MD, of the University of Pennsylvania and the Veterans Administration Medical Center, both in Philadelphia, and was published online in the Journal of the American Academy of Dermatology.
LIMITATIONS:
The study included small patient subgroups for different CLE subtypes, which may affect the generalizability of the findings. CLE subtype was determined by the investigator without additional photographic adjudication. Additionally, the use of background lupus medications could have influenced the placebo group’s response, limiting the ability to observe the treatment effect of iberdomide monotherapy.
DISCLOSURES:
The study was funded by Bristol-Myers Squibb. Six authors reported being employed by Bristol-Myers Squibb, and several others reported consultancy and research support from various sources including Bristol-Myers Squibb.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
TOPLINE:
particularly in subacute and chronic cases.
METHODOLOGY:
- Researchers conducted a randomized phase 2 trial to evaluate the efficacy and safety of iberdomide in 288 patients with CLE (mean age, 45 years; 97% women). Iberdomide is a cereblon modulator, which results in degradation of two transcription factors of immune cell development and homeostasis — Ikaros and Aiolos — that have been implicated in the genetic predisposition of systemic lupus.
- CLE Disease Area and Severity Index Activity (CLASI-A) endpoints included mean percent change from baseline and ≥ 50% reduction from baseline (CLASI-50), which were evaluated in all patients with baseline CLASI-A scores ≥ 8 and by CLE subtypes (acute, subacute, and chronic).
- At baseline, 56% of patients had acute CLE, 29% had chronic CLE, and 16% had subacute CLE; 28% of patients had a baseline CLASI-A score ≥ 8.
- Patients were randomly assigned to receive oral iberdomide (0.45 mg, 0.30 mg, 0.15 mg, or placebo daily) for 24 weeks while continuing standard lupus medications. At week 24, patients on placebo were rerandomized to iberdomide 0.45 mg or 0.30 mg once a day, while those on iberdomide continued their assigned dose through week 52.
TAKEAWAY:
- Among patients with baseline CLASI-A ≥ 8, the mean change in CLASI-A score from baseline at week 24 was −66.7% for those on iberdomide 0.45 mg vs −54.2% for placebo (P = .295).
- At week 24, patients with subacute CLE showed a significantly greater mean percent change from baseline in CLASI-A with iberdomide 0.45 mg vs placebo (−90.5% vs −51.2%; P = .007), while no significant differences were observed with the 0.45-mg dose vs placebo in patients with chronic or acute CLE.
- Overall, CLASI-50 responses were not significantly different among those on 0.45 mg vs placebo (55.6% vs 44.6%). The proportions of patients achieving CLASI-50 at week 24 were significantly greater for iberdomide 0.45 mg vs placebo for those with subacute CLE (91.7% vs 52.9%; P = .035) and chronic CLE (62.1% vs 27.8%; P = .029), but not for those with baseline CLASI-A ≥ 8 (66.7% vs 50%).
- More than 80% of patients had treatment-emergent adverse events (TEAEs), which were mostly mild to moderate. Over 2 years, the most common were urinary tract infections, upper respiratory tract infections, neutropenia, and nasopharyngitis. TEAEs leading to iberdomide discontinuation in one or more patients were neutropenia (n = 7), rash (n = 7), increased hepatic enzymes (n = 4), and deep vein thrombosis (n = 3).
IN PRACTICE:
“Data from this phase 2 trial of iberdomide in patients with SLE suggest that a greater proportion of patients with subacute or chronic CLE who received the higher dose of 0.45 mg iberdomide achieved CLASI-50 vs placebo. For the overall population, CLASI-50 response was not significantly different between treatment groups at week 24, partly due to a high placebo response that may have been driven by patients with acute CLE,” the authors wrote.
SOURCE:
The study was led by Victoria P. Werth, MD, of the University of Pennsylvania and the Veterans Administration Medical Center, both in Philadelphia, and was published online in the Journal of the American Academy of Dermatology.
LIMITATIONS:
The study included small patient subgroups for different CLE subtypes, which may affect the generalizability of the findings. CLE subtype was determined by the investigator without additional photographic adjudication. Additionally, the use of background lupus medications could have influenced the placebo group’s response, limiting the ability to observe the treatment effect of iberdomide monotherapy.
DISCLOSURES:
The study was funded by Bristol-Myers Squibb. Six authors reported being employed by Bristol-Myers Squibb, and several others reported consultancy and research support from various sources including Bristol-Myers Squibb.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Virtual Yoga Classes Improve Chronic Low Back Pain
TOPLINE:
Virtual yoga classes significantly reduced chronic low back pain intensity and improved back-related function in health system employees. Improvements were sustained at 24 weeks, with reduced pain medication use and better sleep quality.
METHODOLOGY:
- A single-blinded, 24-week, 2-arm, randomized clinical trial was conducted from May 3, 2022, through May 23, 2023, comparing live-streamed yoga classes with a wait-list control among adults with chronic low back pain.
- A total of 140 participants aged 18-64 years with chronic low back pain were recruited from the Cleveland Clinic Employee Health Plan.
- Inclusion criteria included a mean low back pain intensity score of at least 4 on an 11-point numerical rating scale and back pain that interfered with daily activities on about half of days or more.
- The intervention consisted of 12 consecutive weekly, 60-minute, virtual, live-streamed hatha yoga group classes.
- Coprimary outcomes were mean pain intensity in the previous week on the 11-point numerical rating scale and back-related function as assessed using the 23-point modified Roland Morris Disability Questionnaire at 12 weeks.
TAKEAWAY:
- Participants in the virtual yoga group showed greater reductions in mean pain intensity at 12 weeks (mean change, –1.5 points; P < .001) and 24 weeks (mean change, –2.3 points; P < .001) compared to the wait-list control group.
- Back-related function improved significantly in the virtual yoga group at 12 weeks (mean change, –2.8 points; P < .001) and 24 weeks (mean change, –4.6 points; P < .001), compared with the control group.
- At 24 weeks, use of any analgesic medication during the past week was 21.2 percentage points lower among virtual yoga participants than in the control group.
- Sleep quality improved more in the virtual yoga group at 12 weeks (mean change, 0.4 points; P = .008) and 24 weeks (mean change, 0.4 points; P = .005), compared with the control group.
IN PRACTICE:
“Given the demonstrated noninferiority of yoga to physical therapy, structured virtual yoga programs and physical therapy are reasonable choices for patients with [chronic low back pain] depending on accessibility, cost, and patient preference. These findings support the call by the National Academy of Medicine for increased evidence-based pain treatments that can be disseminated via technology-based platforms,” wrote the authors of the study.
SOURCE:
The study was led by Hallie Tankha, PhD, of the Cleveland Clinic in Ohio. It was published online on November 1, 2024, in JAMA Network Open.
LIMITATIONS:
The study had a low adherence rate, with only 36.6% of participants attending at least 50% of the yoga classes. There was also a higher rate of missing data in the yoga group compared to the control group. The study did not include a longer-term follow-up assessment beyond 24 weeks.
DISCLOSURES:
This study was supported by grants from Cleveland Clinic Healthcare Delivery and Implementation Science Center. One coauthor disclosed receiving personal fees from the Blue Cross Blue Shield Association. Eric Roseen, DC, PhD, reported receiving grants from the National Institutes of Health National Center for Complementary and Integrative Health. One coauthor disclosed receiving personal fees from UpToDate and grants from NCCIH related to yoga and tai chi for treatment of pain. Additional disclosures are noted in the original article.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
JIA Treatment Has Increasingly Involved New DMARDs Since 2001
TOPLINE:
The use of newer biologic or targeted synthetic disease-modifying antirheumatic drugs (b/tsDMARDs) for treating juvenile idiopathic arthritis (JIA) rose sharply from 2001 to 2022, while the use of conventional synthetic DMARDs (csDMARDs) plummeted, with adalimumab becoming the most commonly used b/tsDMARD.
METHODOLOGY:
- Researchers performed a serial cross-sectional study using Merative MarketScan Commercial Claims and Encounters data from 2000 to 2022 to describe recent trends in DMARD use for children with JIA in the United States.
- They identified 20,258 new episodes of DMARD use among 13,696 children with JIA (median age, 14 years; 67.5% girls) who newly initiated at least one DMARD.
- Participants were required to have ≥ 365 days of continuous healthcare and pharmacy eligibility prior to the index date, defined as the date of DMARD initiation.
TAKEAWAY:
- The use of csDMARDs declined from 89.5% to 43.2% between 2001 and 2022 (P < .001 for trend), whereas the use of bDMARDs increased from 10.5% to 50.0% over the same period (P < .001).
- Methotrexate was the most commonly used DMARD throughout the study period; however, as with other csDMARDs, its use declined from 42.1% in 2001 to 21.5% in 2022 (P < .001).
- Use of the tumor necrosis factor (TNF) inhibitor adalimumab doubled from 7% in 2007 to 14% in 2008 and rose further to 20.5% by 2022; adalimumab also became the predominant b/tsDMARD used after csDMARD monotherapy, accounting for 77.8% of prescriptions following csDMARDs in 2022.
- Although the use of individual TNF inhibitors increased, their overall share declined in recent years as the use of newer b/tsDMARDs, such as ustekinumab and secukinumab, increased.
IN PRACTICE:
“These real-world treatment patterns give us insight into how selection of therapies for JIA has evolved with increasing availability of effective agents and help prepare for future studies on comparative DMARD safety and effectiveness,” the authors wrote.
SOURCE:
The study was led by Priyanka Yalamanchili, PharmD, MS, Center for Pharmacoepidemiology and Treatment Science, Institute for Health, Rutgers University, New Brunswick, New Jersey, and was published online October 22, 2024, in Arthritis & Rheumatology.
LIMITATIONS:
The dependence on commercial claims data may have limited the generalizability of the findings to other populations, such as those with public insurance or without insurance. The study did not have access to demographic data of the participants to investigate the presence of disparities in the use of DMARDs. Moreover, the lack of clinical details about the patients with JIA, including disease severity and specialty of prescribers, may have affected the interpretation of the results.
DISCLOSURES:
The study was supported by funding from the National Institute of Arthritis and Musculoskeletal and Skin Diseases and several other institutes of the National Institutes of Health, as well as the Rheumatology Research Foundation and the Juvenile Diabetes Research Foundation. No conflicts of interest were reported by the authors.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
No Benefit to High-Dose IV Vs Oral Steroids in Giant Cell Arteritis
TOPLINE:
In patients with giant cell arteritis (GCA), intravenous methylprednisolone does not improve visual acuity compared with oral glucocorticoids alone and increases the risk for diabetes within the first year. Survival rates do not differ between the two treatments.
METHODOLOGY:
- Researchers conducted a population-based retrospective study at three centers in Sweden to assess the clinical characteristics, treatment-related toxicity, and mortality in patients with GCA who were receiving high-dose intravenous methylprednisolone.
- A total of 419 patients with biopsy-confirmed GCA (mean age at diagnosis, 75 years; 69% women) diagnosed from 2004 to 2019 were included.
- Patients were treated with either intravenous methylprednisolone (n = 111) at a dose of 500-1000 mg per day for 3 consecutive days or oral glucocorticoids alone (n = 308).
- Ischemic visual complications considered to indicate visual involvement were confirmed by an ophthalmologist, and data on visual acuity were collected from ophthalmologic clinic records at initial consultations and follow-up at 3-18 months.
TAKEAWAY:
- Despite a tendency toward improvement, no significant difference in visual acuity was observed with intravenous methylprednisolone compared with oral glucocorticoids.
- Patients treated with intravenous methylprednisolone had a higher risk for newly diagnosed diabetes within a year of GCA diagnosis (odds ratio [OR], 2.59; P = .01).
- The risk for diabetes remained elevated even after adjustment for the cumulative oral glucocorticoid dose at 3 months (adjusted OR, 3.30; P = .01).
- Survival rates did not significantly differ between the treatment groups over a mean follow-up of 6.6 years.
IN PRACTICE:
“In this study on the use of intravenous methylprednisolone treatment in GCA, we found no evidence of a beneficial effect in improving visual acuity or enabling more rapid tapering of the oral glucocorticoid dose,” the authors wrote. “The use of IVMP [intravenous methylprednisolone] was associated with an increased risk of diabetes during the first year compared with oral GC [glucocorticoid], raising questions about the value of IVMP in GCA treatment.”
SOURCE:
The study, led by Hampus Henningson, Department of Clinical Sciences, Rheumatology, Lund University, Lund, Sweden, was published online in Rheumatology.
LIMITATIONS:
The retrospective nature of the study may have resulted in missing data and difficulty in accurately quantifying the cumulative glucocorticoid doses. The study did not validate the diagnoses of comorbidities but relied solely on diagnostic codes.
DISCLOSURES:
This study was supported by the Swedish Research Council, Swedish Rheumatism Association, Swedish Medical Society, Alfred Österlund’s Foundation, and King Gustaf V’s 80-Year Foundation. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Novel Intervention Slows Cognitive Decline in At-Risk Adults
Cognitive remediation combined with transcranial direct current stimulation (tDCS) slowed cognitive decline in older adults with remitted major depressive disorder (rMDD), mild cognitive impairment (MCI), or both, new research suggests.
The cognitive remediation intervention included a series of progressively difficult computer-based and facilitator-monitored mental exercises designed to sharpen cognitive function.
Researchers found that using cognitive remediation with tDCS slowed decline in executive function and verbal memory more than other cognitive functions. The effect was stronger among people with rMDD versus those with MCI and in those at low genetic risk for Alzheimer’s disease.
“We have developed a novel intervention, combining two interventions that if used separately have a weak effect but together have substantial and clinically meaningful effect of slowing the progression of cognitive decline,” said study author Benoit H. Mulsant, MD, chair of the Department of Psychiatry, University of Toronto, Ontario, Canada, and senior scientist at the Center for Addiction and Mental Health, also in Toronto.
The findings were published online in JAMA Psychiatry.
High-Risk Group
Research shows that older adults with MDD or MCI are at high risk for cognitive decline and dementia. Evidence also suggests that depression in early or mid-life significantly increases the risk for dementia in late life, even if the depression has been in remission for decades.
A potential mechanism underlying this increased risk for dementia could be impaired cortical plasticity, or the ability of the brain to compensate for damage.
The PACt-MD trial included 375 older adults with rMDD, MCI, or both (mean age, 72 years; 62% women) at five academic hospitals in Toronto.
Participants received either cognitive remediation plus tDCS or sham intervention 5 days per week for 8 weeks (acute phase), followed by 5-day “boosters” every 6 months.
tDCS was administered by trained personnel and involved active stimulation for 30 minutes at the beginning of each cognitive remediation group session. The intervention targets the prefrontal cortex, a critical region for cognitive compensation in normal cognitive aging.
The sham group received a weakened version of cognitive remediation, with exercises that did not get progressively more difficult. For the sham stimulation, the current flowed at full intensity for only 54 seconds before and after 30-second ramp-up and ramp-down phases, to create a blinding effect, the authors noted.
A geriatric psychiatrist followed all participants throughout the study, conducting assessments at baseline, month 2, and yearly for 3-7 years (mean follow-up, 48.3 months).
Participants’ depressive symptoms were evaluated at baseline and at all follow-ups, and participants underwent neuropsychological testing to assess six cognitive domains: processing speed, working memory, executive functioning, verbal memory, visual memory, and language.
To get a norm for the cognitive tests, researchers recruited a comparator group of 75 subjects similar in age, gender, and years of education, with no neuropsychiatric disorder or cognitive impairment. They completed the same assessments but not the intervention.
Study participants and assessors were blinded to treatment assignment.
Slower Cognitive Decline
Participants in the intervention group had a significantly slower decline in cognitive function, compared with those in the sham group (adjusted z score difference [active – sham] at month 60, 0.21; P = .006). This is equivalent to slowing cognitive decline by about 4 years, researchers reported. The intervention also showed a positive effect on executive function and verbal memory.
“If I can push dementia from 85 to 89 years and you die at 86, in practice, I have prevented you from ever developing dementia,” Mulsant said.
The efficacy of cognitive remediation plus tDCS in rMDD could be tied to enhanced neuroplasticity, said Mulsant.
The treatment worked well in people with a history of depression, regardless of MCI status, but was not as effective for people with just MCI, researchers noted. The intervention also did not work as well among people at genetic risk for Alzheimer’s disease.
“We don’t believe we have discovered an intervention to prevent dementia in people who are at high risk for Alzheimer disease, but we have discovered an intervention that could prevent dementia in people who have an history of depression,” said Mulsant.
These results suggest the pathways to dementia among people with MCI and rMDD are different, he added.
Because previous research showed either treatment alone demonstrated little efficacy, researchers said the new results indicate that there may be a synergistic effect of combining the two.
The ideal amount of treatment and optimal age for initiation still need to be determined, said Mulsant. The study did not include a comparator group without rMDD or MCI, so the observed cognitive benefits might be specific to people with these high-risk conditions. Another study limitation is lack of diversity in terms of ethnicity, race, and education.
Promising, Important Findings
Commenting on the research, Badr Ratnakaran, MD, assistant professor and division director of geriatric psychiatry at Carilion Clinic–Virginia Tech Carilion School of Medicine, Roanoke, said the results are promising and important because there are so few treatment options for the increasing number of older patients with depression and dementia.
The side-effect profile of the combined treatment is better than that of many pharmacologic treatments, Ratnakaran noted. As more research like this emerges, he predicts that cognitive remediation and tDCS will become more readily available.
“This is telling us that the field of psychiatry, and also dementia, is progressing beyond your usual pharmacotherapy treatments,” said Ratnakaran, who also is chair of the American Psychiatric Association’s Council on Geriatric Psychiatry.
The study received support from the Canada Brain Research Fund of Brain Canada, Health Canada, the Chagnon Family, and the Centre for Addiction and Mental Health Discovery Fund. Mulsant reported holding and receiving support from the Labatt Family Chair in Biology of Depression in Late-Life Adults at the University of Toronto; being a member of the Center for Addiction and Mental Health Board of Trustees; research support from Brain Canada, Canadian Institutes of Health Research, Center for Addiction and Mental Health Foundation, Patient-Centered Outcomes Research Institute, and National Institutes of Health; and nonfinancial support from Capital Solution Design and HappyNeuron. Ratnakaran reported no relevant conflicts.
A version of this article appeared on Medscape.com.
FROM JAMA PSYCHIATRY
Cannabis Often Used as a Substitute for Traditional Medications
Nearly two thirds of patients with rheumatic conditions switched from medications such as nonsteroidal anti-inflammatory drugs (NSAIDs) and opioids to medical cannabis, and substitution was associated with greater self-reported improvement in symptoms than nonsubstitution.
METHODOLOGY:
- Researchers conducted a secondary analysis of a cross-sectional survey to investigate the prevalence of switching to medical cannabis from traditional medications in patients with rheumatic conditions from the United States and Canada.
- The survey included questions on current and past medical cannabis use, sociodemographic characteristics, medication taken and substituted, substance use, and patient-reported outcomes.
- Of the 1727 patients who completed the survey, 763 patients (mean age, 59 years; 84.1% women) reported current use of cannabis and were included in this analysis.
- Participants were asked whether they had substituted any medications with medical cannabis and were subgrouped accordingly.
- They also reported any changes in symptoms after initiating cannabis, the current and anticipated duration of medical cannabis use, methods of ingestion, cannabinoid content, and frequency of use.
TAKEAWAY:
- Overall, 62.5% reported substituting medical cannabis for certain medications, including NSAIDs (54.7%), opioids (48.6%), sleep aids (29.6%), muscle relaxants (25.2%), benzodiazepines (15.5%), and gabapentinoids (10.5%).
- The most common reasons given for substituting medical cannabis were fewer side effects (39%), better symptom control (27%), and fewer adverse effects (12%).
- Participants who substituted medical cannabis reported significant improvements in symptoms such as pain, sleep, joint stiffness, muscle spasms, and inflammation, and in overall health, compared with those who did not substitute it for medications.
- The substitution group was more likely to use inhalation methods (smoking and vaporizing) than the nonsubstitution group; they also used medical cannabis more frequently and preferred products containing delta-9-tetrahydrocannabinol.
IN PRACTICE:
“The changing legal status of cannabis has allowed a greater openness with more people willing to try cannabis for symptom relief. These encouraging results of medication reduction and favorable effect of [medical cannabis] require confirmation with more rigorous methods. At this time, survey information may be seen as a signal for effect, rather than sound evidence that could be applicable to those with musculoskeletal complaints in general,” the authors wrote.
SOURCE:
The study was led by Kevin F. Boehnke, PhD, University of Michigan Medical School, Ann Arbor, and was published online in ACR Open Rheumatology.
LIMITATIONS:
The cross-sectional nature of the study limited the determination of causality between medical cannabis use and symptom improvement. Moreover, the anonymous and self-reported nature of the survey at a single timepoint may have introduced recall bias. The sample predominantly consisted of older, White females, which may have limited the generalizability of the findings to other demographic groups.
DISCLOSURES:
Some authors received grant support from the National Institute on Drug Abuse and the National Institute of Arthritis and Musculoskeletal and Skin Diseases. Some others received payments, honoraria, grant funding, consulting fees, and travel support, and reported other ties with pharmaceutical companies and other institutions.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Nearly two thirds of patients with rheumatic conditions switched to medical cannabis from medications such as nonsteroidal anti-inflammatory drugs (NSAIDs) and opioids, with the substitution being associated with greater self-reported improvement in symptoms than nonsubstitution.
METHODOLOGY:
- Researchers conducted a secondary analysis of a cross-sectional survey to investigate the prevalence of switching to medical cannabis from traditional medications in patients with rheumatic conditions from the United States and Canada.
- The survey included questions on current and past medical cannabis use, sociodemographic characteristics, medication taken and substituted, substance use, and patient-reported outcomes.
- Of the 1727 patients who completed the survey, 763 patients (mean age, 59 years; 84.1% women) reported current use of cannabis and were included in this analysis.
- Participants were asked if they had substituted any medications with medical cannabis and were sub-grouped accordingly.
- They also reported any changes in symptoms after initiating cannabis, the current and anticipated duration of medical cannabis use, methods of ingestion, cannabinoid content, and frequency of use.
TAKEAWAY:
- Overall, 62.5% reported substituting medical cannabis for certain medications, including NSAIDs (54.7%), opioids (48.6%), sleep aids (29.6%), muscle relaxants (25.2%), benzodiazepines (15.5%), and gabapentinoids (10.5%).
- The most common reasons given for substituting medical cannabis were fewer side effects (39%), better symptom control (27%), and fewer adverse effects (12%).
- Participants who substituted medical cannabis reported significant improvements in symptoms such as pain, sleep, joint stiffness, muscle spasms, and inflammation, and in overall health, compared with those who did not substitute it for medications.
- The substitution group was more likely to use inhalation methods (smoking and vaporizing) than the nonsubstitution group; they also used medical cannabis more frequently and preferred products containing delta-9-tetrahydrocannabinol.
IN PRACTICE:
“The changing legal status of cannabis has allowed a greater openness with more people willing to try cannabis for symptom relief. These encouraging results of medication reduction and favorable effect of [medical cannabis] require confirmation with more rigorous methods. At this time, survey information may be seen as a signal for effect, rather than sound evidence that could be applicable to those with musculoskeletal complaints in general,” the authors wrote.
SOURCE:
The study was led by Kevin F. Boehnke, PhD, University of Michigan Medical School, Ann Arbor, and was published online in ACR Open Rheumatology.
LIMITATIONS:
The cross-sectional nature of the study limited the determination of causality between medical cannabis use and symptom improvement. Moreover, the anonymous and self-reported nature of the survey at a single timepoint may have introduced recall bias. The sample predominantly consisted of older, White females, which may have limited the generalizability of the findings to other demographic groups.
DISCLOSURES:
Some authors received grant support from the National Institute on Drug Abuse and the National Institute of Arthritis and Musculoskeletal and Skin Diseases. Some others received payments, honoraria, grant funding, consulting fees, and travel support, and reported other ties with pharmaceutical companies and other institutions.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
TOPLINE:
Nearly two thirds of patients with rheumatic conditions switched to medical cannabis from medications such as nonsteroidal anti-inflammatory drugs (NSAIDs) and opioids, and substitution was associated with greater self-reported symptom improvement than nonsubstitution.
METHODOLOGY:
- Researchers conducted a secondary analysis of a cross-sectional survey to investigate the prevalence of switching to medical cannabis from traditional medications in patients with rheumatic conditions from the United States and Canada.
- The survey included questions on current and past medical cannabis use, sociodemographic characteristics, medication taken and substituted, substance use, and patient-reported outcomes.
- Of the 1727 patients who completed the survey, 763 patients (mean age, 59 years; 84.1% women) reported current use of cannabis and were included in this analysis.
- Participants were asked if they had substituted any medications with medical cannabis and were sub-grouped accordingly.
- They also reported any changes in symptoms after initiating cannabis, the current and anticipated duration of medical cannabis use, methods of ingestion, cannabinoid content, and frequency of use.
TAKEAWAY:
- Overall, 62.5% reported substituting medical cannabis for certain medications, including NSAIDs (54.7%), opioids (48.6%), sleep aids (29.6%), muscle relaxants (25.2%), benzodiazepines (15.5%), and gabapentinoids (10.5%).
- The most common reasons given for substituting medical cannabis were fewer side effects (39%), better symptom control (27%), and fewer adverse effects (12%).
- Participants who substituted medical cannabis reported significant improvements in symptoms such as pain, sleep, joint stiffness, muscle spasms, and inflammation, and in overall health, compared with those who did not substitute it for medications.
- The substitution group was more likely to use inhalation methods (smoking and vaporizing) than the nonsubstitution group; they also used medical cannabis more frequently and preferred products containing delta-9-tetrahydrocannabinol.
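The headline percentages imply rough subgroup sizes that the summary does not state. The sketch below is a reader-side back-of-the-envelope calculation from the reported figures (763 current users; 62.5% substituting; 54.7% and 48.6% of substituters replacing NSAIDs and opioids), not counts taken from the paper.

```python
# Approximate counts implied by the article's percentages.
# These are estimates derived by the reader, not figures from the study.
n_users = 763                               # current medical cannabis users
substituters = round(n_users * 0.625)       # 62.5% reported substituting
nsaid_subs = round(substituters * 0.547)    # 54.7% of substituters replaced NSAIDs
opioid_subs = round(substituters * 0.486)   # 48.6% replaced opioids

print(substituters, nsaid_subs, opioid_subs)  # roughly 477, 261, 232
```

Because each percentage is reported to one decimal place, the derived counts can be off by one or two people; they are useful only for a sense of scale.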
IN PRACTICE:
“The changing legal status of cannabis has allowed a greater openness with more people willing to try cannabis for symptom relief. These encouraging results of medication reduction and favorable effect of [medical cannabis] require confirmation with more rigorous methods. At this time, survey information may be seen as a signal for effect, rather than sound evidence that could be applicable to those with musculoskeletal complaints in general,” the authors wrote.
SOURCE:
The study was led by Kevin F. Boehnke, PhD, University of Michigan Medical School, Ann Arbor, and was published online in ACR Open Rheumatology.
LIMITATIONS:
The cross-sectional nature of the study limited the determination of causality between medical cannabis use and symptom improvement. Moreover, the anonymous and self-reported nature of the survey at a single timepoint may have introduced recall bias. The sample predominantly consisted of older, White females, which may have limited the generalizability of the findings to other demographic groups.
DISCLOSURES:
Some authors received grant support from the National Institute on Drug Abuse and the National Institute of Arthritis and Musculoskeletal and Skin Diseases. Some others received payments, honoraria, grant funding, consulting fees, and travel support, and reported other ties with pharmaceutical companies and other institutions.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Maternal BMI and Eating Disorders Tied to Mental Health in Kids
TOPLINE:
Children of mothers who had obesity or eating disorders before or during pregnancy may face higher risks for neurodevelopmental and psychiatric disorders.
METHODOLOGY:
- Researchers conducted a population-based cohort study to investigate the association of maternal eating disorders and high prepregnancy body mass index (BMI) with psychiatric disorder and neurodevelopmental diagnoses in offspring.
- They used Finnish national registers to assess all live births from 2004 through 2014, with follow-up until 2021.
- Data of 392,098 mothers (mean age, 30.15 years) and 649,956 offspring (48.86% girls) were included.
- Maternal eating disorders and prepregnancy BMI were the main exposures; 1.60% of mothers had a history of eating disorders, 5.89% were underweight, and 53.13% had obesity.
- Diagnoses of children were identified and grouped by ICD-10 codes of mental, behavioral, and neurodevelopmental disorders, mood disorders, anxiety disorders, sleep disorders, attention-deficit/hyperactivity disorder, and conduct disorders, among several others.
TAKEAWAY:
- By 7-17 years of age, 16.43% of offspring had been diagnosed with a neurodevelopmental or psychiatric disorder.
- Maternal eating disorders were associated with psychiatric disorders in the offspring, with the largest effect sizes observed for sleep disorders (hazard ratio [HR], 2.36) and social functioning and tic disorders (HR, 2.18; P < .001 for both).
- The offspring of mothers with severe prepregnancy obesity had a more than twofold increased risk for intellectual disabilities (HR, 2.04; 95% CI, 1.83-2.28); being underweight before pregnancy was also linked to many psychiatric disorders in offspring.
- The occurrence of adverse birth outcomes along with maternal eating disorders or high BMI further increased the risk for neurodevelopmental and psychiatric disorders in the offspring.
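A reported hazard ratio with its confidence interval can be sanity-checked by recovering the standard error on the log scale. The sketch below does this for the intellectual disability estimate above (HR, 2.04; 95% CI, 1.83-2.28), assuming a standard Wald interval; it is a reader-side check, not an analysis from the paper.

```python
import math

# Assuming a Wald 95% CI on the log scale, the standard error of log(HR)
# is the CI width in log units divided by 2 * 1.96.
hr, lo, hi = 2.04, 1.83, 2.28
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
z = math.log(hr) / se  # approximate z statistic for HR != 1

print(f"SE(log HR) = {se:.3f}, z = {z:.1f}")  # SE ~ 0.056, z ~ 12.7
```

A z statistic this large is consistent with the very narrow confidence interval: with roughly 650,000 offspring, even modest associations are estimated precisely.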
IN PRACTICE:
“The findings underline the risk of offspring mental illness associated with maternal eating disorders and prepregnancy BMI and suggest the need to consider these exposures clinically to help prevent offspring mental illness,” the authors wrote.
SOURCE:
This study was led by Ida A.K. Nilsson, PhD, of the Department of Molecular Medicine and Surgery at the Karolinska Institutet in Stockholm, Sweden, and was published online in JAMA Network Open.
LIMITATIONS:
A limitation of the study was the relatively short follow-up time, which restricted the inclusion of late-onset psychiatric disorder diagnoses, such as schizophrenia spectrum disorders. Paternal data and genetic information, which may have influenced the interpretation of the data, were not available. Another potential bias was that mothers with eating disorders may have been more perceptive to their child’s eating behavior, leading to greater access to care and diagnosis for these children.
DISCLOSURES:
This work was supported by the Swedish Research Council, the regional agreement on medical training and clinical research between Region Stockholm and the Karolinska Institutet, the Swedish Brain Foundation, and other sources. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Can Fish Skin Grafts Heal Diabetic Foot Ulcers?
TOPLINE:
Intact fish skin grafts healed a greater proportion of complex diabetic foot ulcers by 16 weeks than standard wound care and were associated with faster healing, with wound infections occurring at similar rates in both groups.
METHODOLOGY:
- Standard wound care for diabetic foot ulcers involves vascular assessment, surgical debridement, use of appropriate dressings, infection management, and glycemic control; however, standard care is typically associated with poor outcomes.
- Researchers conducted a multicenter clinical trial in 15 tertiary care centers with diabetic foot units across France, Italy, Germany, and Sweden to evaluate the efficacy and safety of intact fish skin grafts over standard-of-care practices in treating complex diabetic foot ulcers.
- A total of 255 patients aged 18 years or older with diabetes and lower limb wounds penetrating to the tendon, capsule, bone, or joint were randomly assigned to receive either an intact fish skin graft or standard wound care for 14 weeks.
- The primary endpoint was the percentage of wounds achieving complete closure by 16 weeks.
- Wound healing was also assessed at 20 and 24 weeks.
TAKEAWAY:
- The proportion of wounds healed at 16 weeks was higher with intact fish skin grafts than with standard of care (44.0% vs 26.4%; adjusted odds ratio [aOR], 2.58; 95% CI, 1.48-4.56).
- The fish skin grafts continued to be more effective than standard wound care practices at weeks 20 (aOR, 2.15; 95% CI, 1.27-3.70) and 24 (aOR, 2.19; 95% CI, 1.31-3.70).
- The mean time to healing was 17.31 weeks for the intact fish skin graft group and 19.37 weeks for the standard-of-care group; intact fish skin grafts were also associated with faster healing times than standard wound care (hazard ratio, 1.59; 95% CI, 1.07-2.36).
- Target wound infections were the most common adverse events, occurring in a similar number of patients in both groups.
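The trial reports a covariate-adjusted odds ratio of 2.58 for healing at 16 weeks. As a crude cross-check, the unadjusted odds ratio implied by the raw healing proportions (44.0% vs 26.4%) can be computed directly; it is expected to differ somewhat from the adjusted estimate, since the model accounts for covariates. This is a reader-side calculation, not a figure from the paper.

```python
# Unadjusted odds ratio from the reported healing proportions alone.
p_graft, p_soc = 0.440, 0.264       # healed at 16 weeks: graft vs standard of care
odds_graft = p_graft / (1 - p_graft)
odds_soc = p_soc / (1 - p_soc)
or_unadj = odds_graft / odds_soc

print(f"unadjusted OR = {or_unadj:.2f}")  # ~2.19, vs adjusted OR 2.58
```

That the crude ratio (about 2.2) sits inside the reported adjusted CI of 1.48-4.56 is a basic consistency check on the summary figures.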
IN PRACTICE:
“Our trial demonstrated treatment of complex diabetic foot ulcers with intact fish skin grafts achieved a significantly greater proportion of diabetic foot ulcers healed at 16 weeks than standard of care, and was associated with increased healing at 20 and 24 weeks. That these results were achieved in non-superficial UT [University of Texas diabetic wound classification system] grade 2 and 3 diabetic foot ulcers and included ischemic and/or infected diabetic foot ulcers is of importance,” the authors wrote.
SOURCE:
The study was led by Dured Dardari, MD, PhD, Centre Hospitalier Sud Francilien, Corbeil-Essonnes, France, and was published online in NEJM Evidence.
LIMITATIONS:
No limitations were discussed for this study.
DISCLOSURES:
The study was funded by European Commission Fast Track to Innovation Horizon 2020 and Kerecis. Two authors reported being employees with or without stock options at Kerecis, and other authors reported having ties with many sources including Kerecis.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.