Poorly Controlled Asthma Generates Greenhouse Gas Emissions Equal to Those From More Than 124,000 Homes


Asthma is not well controlled in about half of patients with the disease in the UK and Europe, increasing the risk of hospital admission and severe illness, and increasing healthcare costs. 

Now, the authors of a new study have reported that poorly controlled asthma is also associated with a carbon footprint eight times that of well-controlled asthma, with the excess equivalent to the greenhouse gas emissions produced by more than 124,000 UK homes each year.

The study was published in the journal Thorax and is part of the Healthcare-Based Environmental Cost of Treatment (CARBON) programme, which aims to provide a broader understanding of the carbon footprint associated with respiratory care. 

John Bell, BMBCh, medical director of BioPharmaceuticals Medical, AstraZeneca, and co-author of the study, said that he was surprised by the scale to which poorly controlled asthma contributed to the overall carbon footprint of asthma care. “This suggests that suboptimal asthma care is not just a public health issue, but also one which has environmental consequences,” he said.

Improving the care of asthma patients could help the NHS meet its net zero target, the authors suggested.
 

SABA – Largest Contributor to Asthma-Related Greenhouse Gases

Healthcare is a major contributor to greenhouse gas emissions. In 2020, the NHS set an ambitious target of reducing its carbon footprint by 80% over the next 15 years, with the aim of reaching net zero by 2045.

To estimate the environmental footprint of asthma care in the UK, the researchers retrospectively analyzed anonymized data from 236,506 people with asthma submitted to the Clinical Practice Research Datalink between 2008 and 2019. 

Greenhouse gas (GHG) emissions, measured as carbon dioxide equivalent (CO2e), were then estimated for asthma-related medication use, healthcare resource utilization, and severe exacerbations.

Well-controlled asthma was defined as no episodes of severe worsening of symptoms and fewer than three prescriptions of short-acting beta-agonist (SABA) reliever inhalers in a year. Poorly controlled asthma was defined as three or more SABA canister prescriptions or one or more episodes of severe worsening of symptoms in a year.

Almost one in two patients with asthma (47.3%) were categorized as having poorly controlled disease. 

The researchers estimated that, when scaled to the entire UK asthma population, the overall carbon footprint attributable to asthma care was 750,540 tonnes CO2e/year, with poorly controlled asthma contributing excess GHG emissions of 303,874 tonnes CO2e/year. 
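
The household comparison can be sanity-checked with simple arithmetic. The sketch below is illustrative only; the per-home emissions factor is an assumption chosen to fall in the range typically quoted for average annual UK household energy use, not a value taken from the study.

```python
# Back-of-the-envelope check of the household equivalence quoted in the article.
# The per-home factor is an assumption, not a figure from the study.
excess_emissions_t = 303_874      # tonnes CO2e/year attributed to poorly controlled asthma (reported)
assumed_home_emissions_t = 2.4    # tonnes CO2e/year per average UK home (assumption)

homes_equivalent = excess_emissions_t / assumed_home_emissions_t
print(f"Excess emissions correspond to roughly {homes_equivalent:,.0f} homes per year")
# With this assumed factor the result is about 127,000 homes, consistent with the
# "more than 124,000 homes" figure reported in the article.
```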

“Poorly controlled asthma generated three-fold higher greenhouse gas emissions per capita compared with well-controlled asthma, when taking into account GHG emissions related to all aspects of asthma care, including routine prescribing and management,” Dr. Bell explained. It also generated an eight-fold higher excess per capita carbon footprint compared with well-controlled asthma.

SABA relievers represented the largest contributors to per capita asthma-related GHG emissions, accounting for more than 60% of overall GHG emissions and more than 90% of excess GHG emissions. The remainder was mostly due to healthcare resource utilization, such as GP and hospital visits, required to treat severe worsening symptoms.

The researchers acknowledged various limitations to their findings, including that the study results were largely descriptive in nature and that factors other than the level of asthma symptom control, such as prescribing patterns, may also have contributed to high SABA use.
 

Couple Optimized Patient Outcomes With Environmental Targets

Because inappropriate SABA use emerged as the single largest contributor to asthma care-related GHG emissions, improving this aspect of care could achieve substantial carbon emissions savings and help the NHS meet its net zero target, the authors explained. 

This improvement could include adoption of the Global Initiative for Asthma (GINA) treatment strategy, which since 2019 has no longer recommended SABAs used alone as the preferred reliever for acute asthma symptoms, the authors wrote. 

However, the National Institute for Health and Care Excellence (NICE) asthma guidelines still recommend SABA alone as a reliever therapy.

On the other hand, the Primary Care Respiratory Society (PCRS) highlights on its website that the Medicines and Healthcare Products Regulatory Agency (MHRA) has approved a dual inhaled corticosteroid/formoterol combination for use as a reliever therapy in people aged 12 and over.

“In the UK, this new therapy option does not yet sit within an approved national guideline as NICE last updated its treatment pathway in 2020. We await a new national asthma guideline but do not anticipate this new joint approach between NICE, BTS [British Thoracic Society], and SIGN [Scottish Intercollegiate Guidelines Network] to publish until 2024,” the society wrote.

Dr. Bell explained that the carbon footprint of asthma care increased with higher socio-economic deprivation. “Thus, targeting suboptimal care to areas of higher deprivation could help improve patient outcomes and address health inequities, with the additional benefit of reducing the overall carbon footprint of asthma care,” he said.

This coupling of optimized patient outcomes with environmental targets to decrease GHG emissions could be extended to other chronic progressive diseases, particularly those associated with multi-morbidities, the authors wrote.

Dr. Andy Whittamore, MBBS, clinical lead at Asthma + Lung UK, who was not involved in the research, said: “This study highlights that high levels of uncontrolled asthma not only put thousands of people at risk of life-threatening asthma attacks, but also have a detrimental effect on the environment. It’s important to point out that people shouldn’t stop taking their inhalers because they are worried about the environment. The best thing for the environment is to keep your asthma under control,” he emphasized.

Please refer to the study for full author disclosures. Dr. Hicks has disclosed no relevant financial relationships.

A version of this article appeared on Medscape UK.


Magnesium sulfate shown to reduce risk of cerebral palsy in premature babies


A program to increase the use of magnesium sulfate to reduce the risk of cerebral palsy is effective, say researchers. Giving magnesium sulfate to women at risk of premature birth can reduce the risk of a child having cerebral palsy by a third, and costs just £1 per dose.

However, the authors of the new observational study, published in Archives of Disease in Childhood – Fetal and Neonatal Edition, pointed out that in 2017 only around two-thirds (64%) of eligible women were being given magnesium sulfate in England, Scotland, and Wales, with “wide regional variations.”

To address this, in 2014 the PReCePT (Preventing Cerebral Palsy in Pre Term labor) quality improvement toolkit was developed by both parents and staff with the aim of supporting all maternity units in England to improve maternity staff awareness and increase the use of magnesium sulfate in mothers at risk of giving birth at 30 weeks’ gestation or under. PReCePT provided practical tools and training to support hospital staff to give magnesium sulfate to eligible mothers.

The pilot study in 2015, which involved five maternity units, found an increase in uptake from 21% to 88% associated with the PReCePT approach. Subsequently, in 2018, NHS England funded the National PReCePT Programme, which scaled up the intervention for national roll-out and provided the PReCePT quality toolkit – which includes preterm labor proforma, staff training presentations, parent information leaflet, posters for the unit, and a learning log – to each maternity unit.
 

Improvement ‘over and above’ expectation

In the first evaluation of a universally implemented U.K. national perinatal quality improvement program to increase administration of an evidence-based drug, researchers led by the University of Bristol, England, set out to assess the effectiveness and cost-effectiveness of the National PReCePT Programme in increasing use of magnesium sulfate in preterm births.

Using data from the U.K. National Neonatal Research Database for the year before and the year after PReCePT was implemented in maternity units in England, the researchers performed a before-and-after study that involved 137 maternity units within NHS England. Participants were babies born at 30 weeks’ gestation or under admitted to neonatal units in England, and the main outcome measure was magnesium sulfate uptake before and after the implementation of the National PReCePT Programme. In addition, implementation and lifetime costs were estimated.

During the first year after implementation of the program, uptake increased by an average of 6.3 percentage points (to 83.1%) across all maternity units in England, which the authors explained was “over and above” the increase that would be expected over time as the practice spread organically. The researchers also found that, after adjusting for variations in when maternity units started the program, the increase in use of magnesium sulfate was 9.5 percentage points. “By May 2020, on average 86.4% of eligible mothers were receiving magnesium sulfate,” they said.

Professor John Macleod, NIHR ARC West Director, professor in clinical epidemiology and primary care, University of Bristol, and principal investigator of the evaluation, said: “Our in-depth analysis has been able to demonstrate that the PReCePT program is both effective and cost-effective. The program has increased uptake of magnesium sulfate, which we know is a cost-effective medicine to prevent cerebral palsy, much more quickly than we could have otherwise expected.”

From a societal and lifetime perspective, the health gains and cost savings associated with the National PReCePT Programme generated a “net monetary benefit of £866 per preterm baby,” with the probability of the program being cost-effective being “greater than 95%,” the authors highlighted.
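
Net monetary benefit is a standard health-economics summary: the monetary value of the health gained (QALYs multiplied by a willingness-to-pay threshold) minus the incremental cost. The sketch below illustrates the calculation with hypothetical inputs; only the £866-per-baby result itself comes from the study.

```python
# Minimal sketch of a net monetary benefit (NMB) calculation, the metric behind the
# "£866 per preterm baby" figure. All inputs below are hypothetical placeholders;
# the study's own QALY gains, costs, and threshold are reported in the paper.

def net_monetary_benefit(qaly_gain: float, incremental_cost: float,
                         willingness_to_pay: float = 20_000.0) -> float:
    """NMB = (QALYs gained x willingness to pay per QALY) - incremental cost."""
    return qaly_gain * willingness_to_pay - incremental_cost

# Hypothetical example: a small per-baby QALY gain against a modest per-baby cost.
print(f"Illustrative NMB: £{net_monetary_benefit(0.05, 200.0):,.0f} per baby")  # £800 with these inputs
```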

The researchers also estimated that the program’s first year could be associated with a lifetime saving to society of £3 million – which accounts for the costs of the program, of administering the treatment, of cerebral palsy to society over a lifetime, and the associated health gains of avoiding cases. “This is across all the extra babies the program helped get access to the treatment during the first year,” they said.

The authors highlighted that in the five pilot sites, the improved use of magnesium sulfate has been “sustained over the years” since PReCePT was implemented. As the program costs were mostly in the first year of implementation, longer-term national analysis may show that PReCePT is “even more cost-effective over a longer period,” they postulated.
 

 

 

Accelerate uptake

Uptake of new evidence or guidelines is often “slow” because of practical barriers, lack of knowledge, and the need for behavior change, and can “take decades to become embedded” in perinatal clinical practice, the authors noted, which in turn comes at a “high clinical and economic cost.”

Karen Luyt, professor in neonatal medicine, University of Bristol, said: “The PReCePT national quality improvement program demonstrates that a collaborative and coordinated perinatal implementation program supporting every hospital in England can accelerate the uptake of new evidence-based treatments into routine practice, enabling equitable health benefits to babies and ultimately reductions in lifetime societal costs.”

The authors said the PReCePT model “may serve as a blueprint for future interventions to improve perinatal care.”

Professor Lucy Chappell, chief executive officer of the National Institute for Health and Care Research, said: “This important study shows the impact of taking a promising intervention that had been shown to work in a research setting and scaling it up across the country. Giving magnesium sulfate to prevent cerebral palsy in premature babies is a simple, inexpensive intervention that can make such a difference to families and the health service.”

Prof. Macleod added: “We are pleased to have played a part in helping get this cheap yet effective treatment to more babies.”

This work was jointly funded by the National Institute for Health and Care Research Applied Research Collaboration West and the AHSN Network funded by NHS England. The Health Foundation funded the health economics evaluation. The authors declare that the study management group has no competing financial, professional, or personal interests that might have influenced the study design or conduct.

A version of this article first appeared on Medscape UK.


Hypertension linked to risk of severe COVID


U.K. researchers have established that hypertension is associated with a 22% greater risk of severe COVID-19, with the odds of severe COVID-19 unaffected by medication type.

Hypertension “appears to be one of the commonest comorbidities in COVID-19 patients”, explained the authors of a new study, published in PLOS ONE. The authors highlighted that previous research had shown that hypertension was more prevalent in severe and fatal cases compared with all cases of COVID-19.

They pointed out, however, that whether hypertensive individuals have a higher risk of severe COVID-19, compared with nonhypertensives, and whether the absolute level of systolic blood pressure or the type of antihypertensive medication is related to this risk, remained “unclear.”

To try to answer these questions, the research team, led by University of Cambridge researchers, analyzed data from 16,134 individuals who tested positive for COVID-19 (mean age 65.3 years, 47% male, 90% white). Of these, 40% had been diagnosed with essential hypertension at the analysis baseline, and 22% of those with hypertension developed severe COVID-19.

Systolic blood pressure (SBP) was categorized by 10–mm Hg ranges, starting from < 120 mm Hg up to 180+ mm Hg, with the reference category defined as 120-129 mm Hg, based on data from the SPRINT study, which demonstrated that intensive SBP lowering to below 120 mm Hg, as compared with the traditional threshold of 140 mm Hg, was beneficial. Diastolic blood pressure was categorized by 10–mm Hg ranges, starting from < 60 mm Hg up to 100+ mm Hg with 80-90 mm Hg being the reference category.
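
In practice, this categorization amounts to binning blood pressure readings into 10-mm Hg groups. Below is a minimal sketch of such binning, assuming the cut-points described above; it is illustrative and not code from the study.

```python
# Illustrative binning of systolic blood pressure into the 10-mm Hg categories described
# above (reference category 120-129 mm Hg); not the study's own code.

def sbp_category(sbp_mmhg: float) -> str:
    """Assign a systolic blood pressure reading to a 10-mm Hg category."""
    if sbp_mmhg < 120:
        return "<120"
    if sbp_mmhg >= 180:
        return "180+"
    lower = int(sbp_mmhg // 10) * 10    # e.g. 154 -> 150
    return f"{lower}-{lower + 9}"

for reading in (118, 125, 154, 183):
    print(reading, "->", sbp_category(reading))   # <120, 120-129, 150-159, 180+
```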

In their analyses the researchers adjusted for age, sex, body mass index, ethnicity, smoking status, diabetes status, socioeconomic status, and inflammation (C-reactive protein [CRP]), as these were proposed as potential confounders. To assess the direct effect of hypertension on COVID-19, they also adjusted for intermediate variables, including cardiovascular comorbidities and stroke, on the causal pathway between hypertension and severe COVID-19.
 

Majority of effect of hypertension on severe COVID-19 was direct

The unadjusted odds ratio of the association between hypertension and severe COVID-19 was 2.33 (95% confidence interval, 2.16-2.51), the authors emphasized. They found that, after adjusting for all confounding variables, hypertension was associated with 22% higher odds of severe COVID-19 (OR, 1.22; 95% CI, 1.12-1.33), compared with normotension.
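
For readers unfamiliar with the metric, an unadjusted odds ratio and its 95% confidence interval can be computed directly from a 2x2 table, as in the sketch below. The counts are invented for illustration; the study's estimates come from its own data and, for the adjusted figure, from regression models.

```python
# Illustrative unadjusted odds ratio with a 95% Wald confidence interval from a 2x2 table.
# Counts are invented; they do not reproduce the study's data.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int):
    """a/b: exposed cases/non-cases; c/d: unexposed cases/non-cases."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    return odds_ratio, lower, upper

# Hypothetical counts: severe vs non-severe COVID-19 in hypertensive vs normotensive groups
or_, lo, hi = odds_ratio_ci(a=1400, b=5100, c=1100, d=8500)
print(f"OR {or_:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```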

Individuals with severe COVID-19 were marginally older, more likely to be male, and more deprived, the authors said. “They were also more likely to be hypertensive, compared with individuals without severe COVID-19, and a greater proportion of individuals with severe COVID-19 had cardiovascular comorbidities.”

The majority of the effect of hypertension on development of severe COVID-19 was “direct,” they said. However, a modest proportion of the effect was mediated via cardiovascular comorbidities such as peripheral vascular disease, myocardial infarction, coronary heart disease, arrhythmias, and stroke. Of note, those with a history of stroke had a 47% higher risk of severe COVID-19 and those with a history of other cardiovascular comorbidities had a 30% higher risk of severe COVID-19, the authors commented.
 

J-shaped relationship

Of the total of 6,517 (40%) individuals who had a diagnosis of essential hypertension at baseline, 67% were treated (41% with monotherapy, 59% with combination therapy), and 33% were untreated.

The proportion of individuals with severe COVID-19 was similar across medication groups: ACE inhibitors, 34%; angiotensin receptor blockers (ARBs), 36%; and “other” medications, 34%.

In hypertensive individuals receiving antihypertensive medications, there was a “J-shaped relationship” between the level of blood pressure and the risk of severe COVID-19 when using a systolic blood pressure of 120-129 mm Hg as the reference: 150-159 mm Hg versus 120-129 mm Hg (OR, 1.91; 95% CI, 1.44-2.53) and 180+ mm Hg versus 120-129 mm Hg (OR, 1.93; 95% CI, 1.06-3.51).

The authors commented that there was no evidence of a higher risk of severe COVID-19 until systolic blood pressure “exceeded 150 mm Hg.”

They said it was an interesting finding that “very well-controlled” systolic blood pressure < 120 mm Hg was associated with a 40% (OR, 1.40; 95% CI, 1.11-1.78) greater odds of severe COVID-19. “This may be due to reverse causality, where low systolic blood pressure levels may indicate poorer health, such that the occurrence of severe COVID-19 may be related to underlying disease rather than the level of SBP per se,” they suggested.

The J-shaped association observed remained after multiple adjustments, including presence of known cardiovascular comorbidities, which suggested a possible “real effect” of low SBP on severe COVID-19, “at least in treated hypertensive individuals.”

Their analyses also identified that, compared with a “normal” diastolic blood pressure (80-90 mm Hg), having a diastolic blood pressure higher than 90 mm Hg was associated with higher odds of severe COVID-19.

The association between hypertension and COVID-19 was “amplified” if the individuals were treated and their BP remained uncontrolled, the authors pointed out.

There did not appear to be any difference in the risk of severe COVID-19 between individuals taking ACE inhibitors and those taking ARBs or other antihypertensive medications, the authors said.
 

Better understanding of underlying mechanisms needed

Individuals with hypertension who tested positive for COVID-19 had “over twice” the risk of developing severe COVID-19, compared with nonhypertensive individuals, the authors said.

They highlighted that their findings also suggest that there are “further effects” influencing the severity of COVID-19 beyond a “dichotomous” diagnosis of hypertension.

“Individuals with a higher-than-target systolic blood pressure may be less healthy, less active, suffering more severe hypertension, or have developed drug-resistant hypertension, all suggesting that the effects of hypertension have already had detrimental physiological effects on the cardiovascular system, which in turn may offer some explanation for the higher risk of severe COVID-19 with uncontrolled SBP,” they explained.

“Hypertension is an important risk factor for COVID-19,” reiterated the authors, who emphasized that a better understanding of the underlying mechanisms driving this increased risk is warranted in case of “more severe strains or other viruses” in the future.

The authors have declared no competing interests.

A version of this article first appeared on Medscape UK.

Publications
Topics
Sections

U.K. researchers have established that hypertension is associated with a 22% greater risk of severe COVID-19, with the odds of severe COVID-19 unaffected by medication type.

Hypertension “appears to be one of the commonest comorbidities in COVID-19 patients”, explained the authors of a new study, published in PLOS ONE. The authors highlighted that previous research had shown that hypertension was more prevalent in severe and fatal cases compared with all cases of COVID-19.

They pointed out, however, that whether hypertensive individuals have a higher risk of severe COVID-19, compared with nonhypertensives, and whether the absolute level of systolic blood pressure or the type of antihypertensive medication is related to this risk, remained “unclear.”

To try to answer these questions, the research team, led by University of Cambridge researchers, analyzed data from 16,134 individuals who tested positive for COVID-19 (mean age 65.3 years, 47% male, 90% white), 40% were diagnosed with essential hypertension at the analysis baseline – 22% of whom had developed severe COVID-19.

Systolic blood pressure (SBP) was categorized by 10–mm Hg ranges, starting from < 120 mm Hg up to 180+ mm Hg, with the reference category defined as 120-129 mm Hg, based on data from the SPRINT study, which demonstrated that intensive SBP lowering to below 120 mm Hg, as compared with the traditional threshold of 140 mm Hg, was beneficial. Diastolic blood pressure was categorized by 10–mm Hg ranges, starting from < 60 mm Hg up to 100+ mm Hg with 80-90 mm Hg being the reference category.

In their analyses the researchers adjusted for age, sex, body mass index, ethnicity, smoking status, diabetes status, socioeconomic status, and inflammation (C-reactive protein [CRP]), as these were proposed as potential confounders. To assess the direct effect of hypertension on COVID-19, they also adjusted for intermediate variables, including cardiovascular comorbidities and stroke, on the causal pathway between hypertension and severe COVID-19.
 

Majority of effect of hypertension on severe COVID-19 was direct

The unadjusted odds ratio of the association between hypertension and severe COVID-19 was 2.33 (95% confidence interval, 2.16-2.51), the authors emphasized. They found that, after adjusting for all confounding variables, hypertension was associated with 22% higher odds of severe COVID-19 (OR, 1.22; 95% CI, 1.12-1.33), compared with normotension.

Individuals with severe COVID-19 were marginally older, more likely to be male, and more deprived, the authors said. “They were also more likely to be hypertensive, compared with individuals without severe COVID-19, and a greater proportion of individuals with severe COVID-19 had cardiovascular comorbidities.”

The majority of the effect of hypertension on development of severe COVID-19 was “direct,” they said. However, a modest proportion of the effect was mediated via cardiovascular comorbidities such as peripheral vascular disease, MI, coronary heart disease, arrhythmias, and stroke. Of note, those with a history of stroke had a 47% higher risk of severe COVID-19 and those with a history of other cardiovascular comorbidities had a 30% higher risk of severe COVID-19, the authors commented.
 

J-shaped relationship

Of the total of 6,517 (40%) individuals who had a diagnosis of essential hypertension at baseline, 67% were treated (41% with monotherapy, 59% with combination therapy), and 33% were untreated.

There were similar numbers of severe COVID-19 in each medication group: ACE inhibitors, 34%; angiotensin receptor blockers (ARBs), 36%; and “other” medications 34%.

In hypertensive individuals receiving antihypertensive medications, there was a “J-shaped relationship” between the level of blood pressure and risk of severe COVID-19 when using a systolic blood pressure level of 120-129 mm Hg as a reference – 150-159 mm Hg versus 120-129 mm Hg (OR 1.91; 95% CI, 1.44-2.53), > 180+ mm Hg versus 120-129 mm Hg (OR 1.93; 95% CI, 1.06-3.51).

The authors commented that there was no evidence of a higher risk of severe COVID-19 until systolic blood pressure “exceeded 150 mm Hg.”

They said it was an interesting finding that “very well-controlled” systolic blood pressure < 120 mm Hg was associated with a 40% (OR, 1.40; 95% CI, 1.11-1.78) greater odds of severe COVID-19. “This may be due to reverse causality, where low systolic blood pressure levels may indicate poorer health, such that the occurrence of severe COVID-19 may be related to underlying disease rather than the level of SBP per se,” they suggested.

The J-shaped association observed remained after multiple adjustments, including presence of known cardiovascular comorbidities, which suggested a possible “real effect” of low SBP on severe COVID-19, “at least in treated hypertensive individuals.”

Their analyses also identified that, compared with a “normal” diastolic blood pressure (80-90 mm Hg), having a diastolic blood pressure higher than 90 mm Hg was associated with higher odds of severe COVID-19.

The association between hypertension and COVID-19 was “amplified” if the individuals were treated and their BP remained uncontrolled, the authors pointed out.

There did not appear to be any difference in the risk of severe COVID-19 between individuals taking ACE inhibitors and those taking ARBs or other antihypertensive medications, the authors said.
 

Better understanding of underlying mechanisms needed

Individuals with hypertension who tested positive for COVID-19 had “over twice” the risk of developing severe COVID-19, compared with nonhypertensive individuals, the authors said.

They highlighted that their findings also suggest that there are “further effects” influencing the severity of COVID-19 beyond a “dichotomous” diagnosis of hypertension.

“Individuals with a higher-than-target systolic blood pressure may be less healthy, less active, suffering more severe hypertension, or have developed drug-resistant hypertension, all suggesting that the effects of hypertension have already had detrimental physiological effects on the cardiovascular system, which in turn may offer some explanation for the higher risk of severe COVID-19 with uncontrolled SBP,” they explained.

“Hypertension is an important risk factor for COVID-19,” reiterated the authors, who emphasized that a better understanding of the underlying mechanisms driving this increased risk is warranted in case of “more severe strains or other viruses” in the future.

The authors have declared no competing interests.

A version of this article first appeared on Medscape UK.

U.K. researchers have established that hypertension is associated with a 22% greater risk of severe COVID-19, with the odds of severe COVID-19 unaffected by medication type.

Hypertension “appears to be one of the commonest comorbidities in COVID-19 patients”, explained the authors of a new study, published in PLOS ONE. The authors highlighted that previous research had shown that hypertension was more prevalent in severe and fatal cases compared with all cases of COVID-19.

They pointed out, however, that whether hypertensive individuals have a higher risk of severe COVID-19, compared with nonhypertensives, and whether the absolute level of systolic blood pressure or the type of antihypertensive medication is related to this risk, remained “unclear.”

To try to answer these questions, the research team, led by University of Cambridge researchers, analyzed data from 16,134 individuals who tested positive for COVID-19 (mean age 65.3 years, 47% male, 90% white), 40% were diagnosed with essential hypertension at the analysis baseline – 22% of whom had developed severe COVID-19.

Systolic blood pressure (SBP) was categorized by 10–mm Hg ranges, starting from < 120 mm Hg up to 180+ mm Hg, with the reference category defined as 120-129 mm Hg, based on data from the SPRINT study, which demonstrated that intensive SBP lowering to below 120 mm Hg, as compared with the traditional threshold of 140 mm Hg, was beneficial. Diastolic blood pressure was categorized by 10–mm Hg ranges, starting from < 60 mm Hg up to 100+ mm Hg with 80-90 mm Hg being the reference category.

In their analyses the researchers adjusted for age, sex, body mass index, ethnicity, smoking status, diabetes status, socioeconomic status, and inflammation (C-reactive protein [CRP]), as these were proposed as potential confounders. To assess the direct effect of hypertension on COVID-19, they also adjusted for intermediate variables, including cardiovascular comorbidities and stroke, on the causal pathway between hypertension and severe COVID-19.
 

Majority of effect of hypertension on severe COVID-19 was direct

The unadjusted odds ratio of the association between hypertension and severe COVID-19 was 2.33 (95% confidence interval, 2.16-2.51), the authors emphasized. They found that, after adjusting for all confounding variables, hypertension was associated with 22% higher odds of severe COVID-19 (OR, 1.22; 95% CI, 1.12-1.33), compared with normotension.

Individuals with severe COVID-19 were marginally older, more likely to be male, and more deprived, the authors said. “They were also more likely to be hypertensive, compared with individuals without severe COVID-19, and a greater proportion of individuals with severe COVID-19 had cardiovascular comorbidities.”

The majority of the effect of hypertension on development of severe COVID-19 was “direct,” they said. However, a modest proportion of the effect was mediated via cardiovascular comorbidities such as peripheral vascular disease, MI, coronary heart disease, arrhythmias, and stroke. Of note, those with a history of stroke had a 47% higher risk of severe COVID-19 and those with a history of other cardiovascular comorbidities had a 30% higher risk of severe COVID-19, the authors commented.
 

J-shaped relationship

Of the total of 6,517 (40%) individuals who had a diagnosis of essential hypertension at baseline, 67% were treated (41% with monotherapy, 59% with combination therapy), and 33% were untreated.

There were similar numbers of severe COVID-19 in each medication group: ACE inhibitors, 34%; angiotensin receptor blockers (ARBs), 36%; and “other” medications 34%.

In hypertensive individuals receiving antihypertensive medications, there was a “J-shaped relationship” between the level of blood pressure and risk of severe COVID-19 when using a systolic blood pressure level of 120-129 mm Hg as a reference – 150-159 mm Hg versus 120-129 mm Hg (OR 1.91; 95% CI, 1.44-2.53), > 180+ mm Hg versus 120-129 mm Hg (OR 1.93; 95% CI, 1.06-3.51).

The authors commented that there was no evidence of a higher risk of severe COVID-19 until systolic blood pressure “exceeded 150 mm Hg.”

They said it was an interesting finding that “very well-controlled” systolic blood pressure < 120 mm Hg was associated with 40% greater odds of severe COVID-19 (OR, 1.40; 95% CI, 1.11-1.78). “This may be due to reverse causality, where low systolic blood pressure levels may indicate poorer health, such that the occurrence of severe COVID-19 may be related to underlying disease rather than the level of SBP per se,” they suggested.

The J-shaped association observed remained after multiple adjustments, including presence of known cardiovascular comorbidities, which suggested a possible “real effect” of low SBP on severe COVID-19, “at least in treated hypertensive individuals.”

Their analyses also identified that, compared with a “normal” diastolic blood pressure (80-90 mm Hg), having a diastolic blood pressure higher than 90 mm Hg was associated with higher odds of severe COVID-19.

The association between hypertension and COVID-19 was “amplified” if the individuals were treated and their BP remained uncontrolled, the authors pointed out.

There did not appear to be any difference in the risk of severe COVID-19 between individuals taking ACE inhibitors and those taking ARBs or other antihypertensive medications, the authors said.
 

Better understanding of underlying mechanisms needed

Individuals with hypertension who tested positive for COVID-19 had “over twice” the risk of developing severe COVID-19, compared with nonhypertensive individuals, the authors said.

They highlighted that their findings also suggest that there are “further effects” influencing the severity of COVID-19 beyond a “dichotomous” diagnosis of hypertension.

“Individuals with a higher-than-target systolic blood pressure may be less healthy, less active, suffering more severe hypertension, or have developed drug-resistant hypertension, all suggesting that the effects of hypertension have already had detrimental physiological effects on the cardiovascular system, which in turn may offer some explanation for the higher risk of severe COVID-19 with uncontrolled SBP,” they explained.

“Hypertension is an important risk factor for COVID-19,” reiterated the authors, who emphasized that a better understanding of the underlying mechanisms driving this increased risk is warranted in case of “more severe strains or other viruses” in the future.

The authors have declared no competing interests.

A version of this article first appeared on Medscape UK.

A hormone that can predict male long-term health

Article Type
Changed
Fri, 11/11/2022 - 08:28

A new discovery could help predict the long-term health of men after a vital role of a hormone was identified, say researchers. Insulin-like peptide 3 (INSL3) is a constitutive hormone secreted in men by the mature Leydig cells of the testes, explained the authors of the new study, published in Frontiers in Endocrinology.

“It is an accurate biomarker for Leydig cell functional capacity, reflecting their total cell number and differentiation status,” they said.

“The holy grail of aging research is to reduce the fitness gap that appears as people age,” said Ravinder Anand-Ivell, PhD, associate professor in endocrinology and reproductive physiology at the University of Nottingham (England), and study coauthor. Understanding why some people are more likely to develop disability and disease as they age is “vital” so that interventions can be found to ensure people not only live a long life but also a healthy life as they age, she highlighted.

The European team of researchers, led by scientists from the University of Nottingham, set out to determine whether INSL3 could serve as a biomarker to predict hypogonadism and age-related morbidity, and whether it could do so in a similar way to testosterone.

For the study, the researchers analyzed blood samples from the European Male Aging Study (EMAS) cohort to assess circulating INSL3 and its cross-sectional and longitudinal relationships to hypogonadism – defined by testosterone less than 10.5 nmol/L – and to a range of age-related morbidities, using correlation and regression analysis.

The EMAS cohort of community-dwelling men comprises more than 3,000 men, aged 40-79 years at the time of recruitment, from eight centers in Europe. Men were recruited from 2003 to 2004 and again 4-5 years later for a second phase of the study. In both phases, blood was collected for hormonal measurements, and subjects were assessed for anthropometric parameters and asked to complete questionnaires relating to their health, lifestyle, and diet.
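
The study's statistical code is not reproduced here, but the general shape of such an analysis can be sketched: flag hypogonadism with the 10.5 nmol/L testosterone threshold and relate INSL3 to a morbidity outcome with a regression adjusted for age. The toy Python example below runs on synthetic data; all variable names, units, and effect sizes are invented and are not the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy stand-ins for EMAS-style variables (all names and values hypothetical).
rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "age": rng.uniform(40, 79, n),
    "insl3": rng.normal(1.0, 0.4, n),          # illustrative scale only
    "testosterone": rng.normal(15, 5, n),      # nmol/L
})
df["hypogonadal"] = (df["testosterone"] < 10.5).astype(int)   # threshold from the study
# Illustrative morbidity flag, made (weakly) more likely at low INSL3 and older age.
p = 1 / (1 + np.exp(-(-4 + 0.05 * (df.age - 60) - 1.0 * (df.insl3 - 1))))
df["morbidity"] = rng.binomial(1, p)

# Cross-sectional association of INSL3 with a morbidity, adjusted for age.
fit = smf.logit("morbidity ~ insl3 + age", data=df).fit(disp=0)
print(np.exp(fit.params["insl3"]))   # odds ratio per unit INSL3 (toy data)
```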
 

Hormone levels remain constant

The results showed that, unlike testosterone, which fluctuates throughout a man’s life, INSL3 remains consistent, with the level at puberty remaining largely the same throughout a man’s life, decreasing only slightly into old age. “This makes it the first clear and reliable predictive biomarker of age-related morbidity as compared with any other measurable parameters,” explained the researchers.

They also discovered that the level of INSL3 in blood “correlates with a range of age-related conditions,” such as bone weakness, sexual dysfunction, diabetes, and cardiovascular disease. 

They emphasized that the discovery of the consistent nature of this hormone is “very significant.” It means that a man with high INSL3 when young will still have high INSL3 when he is older, but someone with low INSL3 already at a young age will have low INSL3 when older, “making him more likely to acquire typical age-related illnesses.”

Dr. Anand-Ivell commented that the hormone discovery was an “important step” and will pave the way for not only helping people individually but also helping to “ease the care crisis we face as a society.”
 

Exciting possibilities for predicting age

The study also showed that the normal male population, even when young and relatively healthy, still shows an almost 10-fold variation between individuals in the concentration of INSL3 in the blood, the authors reported.

The authors highlighted that the study’s strengths are the large and comprehensive dataset provided by the EMAS cohort, together with the accuracy of the hormonal parameters measured. The weaknesses, they explained, are the self-reported nature of some of the morbidity parameters, as well as the relatively short longitudinal follow-up, averaging only 4.3 years.

Richard Ivell, University of Nottingham, and lead author, explained that, now that the important role of INSL3 in predicting disease, and how it varies among men, has been established, the team is looking to investigate which factors have the most influence on the level of INSL3 in the blood. “Preliminary work suggests early life nutrition may play a role, but many other factors such as genetics or exposure to some environmental endocrine disruptors may play a part.”

The study findings open up “exciting possibilities for predicting age-related illnesses and finding ways to prevent the onset of these diseases with early intervention,” the authors enthused.

The study was initiated and supported by the European 5th Framework, and the German Research Council provided funding for the INSL3 analysis. The authors declared no conflicts of interest.

Dr. Hicks has disclosed no relevant financial relationships. A version of this article first appeared on Medscape UK.

Best anticoagulant for minimizing bleeding risk identified

Article Type
Changed
Wed, 11/02/2022 - 14:52

A commonly prescribed direct oral anticoagulant (DOAC) has the lowest risk of bleeding, say researchers. Used to prevent strokes in those with atrial fibrillation (AFib), DOACs have recently become more common than warfarin, the previous standard treatment, as they do not require as much follow-up monitoring – which was “particularly valuable” during the COVID-19 pandemic – and have “less risk” of side effects, highlighted the authors of a new study, published in Annals of Internal Medicine.

However, the authors explained that, although current guidelines recommend using DOACs over warfarin in patients with AFib, “head-to-head trial data do not exist to guide the choice of DOAC.” They therefore set out to fill this evidence gap with a large-scale comparison of all four DOACs – apixaban, dabigatran, edoxaban, and rivaroxaban – in routine clinical practice.

Wallis Lau, PhD, University College London, and co–lead author, said: “Direct oral anticoagulants have been prescribed with increasing frequency worldwide in recent years, but evidence comparing them directly has been limited.”
 

One drug stood out

For the multinational population-based cohort study, the researchers compared the efficacy and risk of side effects for the four most common DOACs. They reviewed data – from five standardized electronic health care databases that covered 221 million people in the United Kingdom, France, Germany, and the United States – on 527,226 patients who had been newly diagnosed with AFib between 2010 and 2019 and who had received a new DOAC prescription. The study included 281,320 apixaban users, 61,008 dabigatran users, 12,722 edoxaban users, and 172,176 rivaroxaban users.

Database-specific hazard ratios of ischemic stroke or systemic embolism, intracranial hemorrhage, gastrointestinal bleeding, and all-cause mortality between DOACs were estimated using a Cox regression model stratified by propensity score and pooled using a random-effects model.
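
The report does not specify the pooling formula beyond “random-effects model”; a standard choice for combining database-specific estimates is DerSimonian-Laird pooling of log hazard ratios. The sketch below implements that calculation in Python on invented per-database hazard ratios, purely to illustrate the mechanics.

```python
import numpy as np

# Hypothetical database-specific hazard ratios and 95% CIs (illustrative values only).
hr = np.array([0.80, 0.75, 0.85, 0.70, 0.78])
ci_low = np.array([0.68, 0.60, 0.70, 0.55, 0.66])
ci_high = np.array([0.94, 0.94, 1.03, 0.89, 0.92])

# Work on the log scale; back out standard errors from the CI width.
y = np.log(hr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1 / se**2                                  # inverse-variance (fixed-effect) weights

# DerSimonian-Laird estimate of between-database heterogeneity (tau^2).
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects weights and the pooled hazard ratio with its 95% CI.
w_re = 1 / (se**2 + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"Pooled HR {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f})")
```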

In total, 9,530 ischemic stroke or systemic embolism events, 841 intracranial hemorrhage events, 8,319 gastrointestinal bleeding events, and 1,476 deaths were identified over the study follow-up. The researchers found that all four drugs were comparable on outcomes for ischemic stroke, intracranial hemorrhage, and all-cause mortality.

However, they identified a difference in the risk of gastrointestinal bleeding, which they highlighted “is one of the most common and concerning side effects of DOACs.”

“Apixaban stood out as having lower risk of gastrointestinal bleeding,” said the authors, with a 19%-28% lower risk when compared directly with each of the other three DOACs. Specifically, apixaban use was associated with lower risk for gastrointestinal bleeding than use of dabigatran (HR, 0.81; 95% confidence interval, 0.70-0.94), edoxaban (HR, 0.77; 95% CI, 0.66-0.91), or rivaroxaban (HR, 0.72; 95% CI, 0.66-0.79).

The researchers also highlighted that their findings held true when looking at data only from those aged over 80, and those with chronic kidney disease, two groups that are “often underrepresented” in clinical trials.
 

Apixaban may be preferable

The researchers concluded that, among patients with AFib, apixaban use was associated with lower risk for gastrointestinal bleeding and similar rates of ischemic stroke or systemic embolism, intracranial hemorrhage, and all-cause mortality, compared with dabigatran, edoxaban, and rivaroxaban.

“Our results indicate that apixaban may be preferable to other blood thinners because of the lower rate of gastrointestinal bleeding and similar rates of stroke, a finding that we hope will be supported by randomized controlled trials,” said Dr. Lau.

However, he emphasized that, “as with all medications, potential risks and benefits can differ between people, so considering the full spectrum of outcomes and side effects will still be necessary for each individual patient.”

The authors all declared no conflicting interests.

A version of this article first appeared on Medscape UK.

Study uncovers two molecular subgroups of cervical cancer

Article Type
Changed
Fri, 10/28/2022 - 15:10

Scientists have discovered that cervical cancer can be divided into two distinct molecular subgroups – one far more aggressive than the other – offering hope of better understanding and treatment of the disease.

In the United Kingdom, there are over 3,000 new cases of cervical cancer and around 850 deaths each year. It is almost always caused by the human papillomavirus (HPV), and vaccination against this virus has successfully reduced the incidence of cervical cancer – by 87% among women in their 20s in England who were offered the vaccine at age 12-13 years as part of the U.K. HPV vaccination program.

“Despite major steps forward in preventing cervical cancer, many women still die from the disease,” said Tim Fenton, MD, associate professor in cancer biology, School of Cancer Sciences Centre for Cancer Immunology, University of Southampton (England), and coauthor of the new study.
 

Two distinct subgroups

In the new study, published in Nature Communications, researchers described their breakthrough findings as a “major step forward” in understanding the disease, and said they provided a “tantalizing new clue” in determining the best treatments for individual patients.

For the observational study – part of the largest ‘omics’ study of its kind – researchers led by scientists at University College London and the University of Southampton began by applying a multiomics approach to identify combinations of molecular markers and characteristics associated with the biological processes involved in cervical cancer cells. The integrated multiomic analysis of 643 cervical squamous cell carcinomas (CSCC) – the most common histological variant of cervical cancer – represented patient populations from the United States, Europe, and sub-Saharan Africa.

To begin with, they analyzed and compared DNA, RNA, proteins, and metabolites in 236 CSCC cases in a publicly available U.S. database. They found that the U.S. cancers fell into two distinct “omics” subgroups, which they named C1 and C2. After further investigation, the researchers identified that C1 tumors contained a much higher number of cytotoxic T cells. “The findings suggested that patients with C1 tumors would have a stronger immune response within the tumor microenvironment,” they said.
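
The clustering method behind the C1/C2 split is not described in this summary, so the following Python sketch is a rough illustration only and not the authors' pipeline: multiomic layers can be standardized, concatenated, and partitioned into two groups, for example with k-means. All matrices here are synthetic placeholders.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Synthetic stand-ins for per-tumor omics matrices (rows = tumors, columns = features).
rng = np.random.default_rng(2)
n_tumors = 236
rna = rng.normal(size=(n_tumors, 500))         # e.g. gene expression
meth = rng.normal(size=(n_tumors, 300))        # e.g. DNA methylation
prot = rng.normal(size=(n_tumors, 100))        # e.g. protein abundance

# Standardize each layer separately, then concatenate into one feature matrix.
layers = [StandardScaler().fit_transform(x) for x in (rna, meth, prot)]
features = np.hstack(layers)

# Partition tumors into two groups; the study's C1/C2 labels came from its own
# integrated multiomic analysis, so k-means here is only a stand-in.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))                     # tumors per putative subgroup
```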
 

Weaker antitumor immune response

To determine if the two subtypes affect patients with cervical cancer in different ways, the team – which also included researchers from the University of Kent, the University of Cambridge, Oslo University Hospital, the University of Bergen (Norway), and the University of Innsbruck (Austria) – derived molecular profiles and looked at clinical outcomes for a further 313 CSCC cases from Norway and Austria.

The researchers found that, just as in the U.S. cohort, nearly a quarter of patients fell into the C2 subtype, and that again C1 tumors contained far more killer T cells than C2 tumors. “Importantly, the data also showed C2 was far more clinically aggressive with worse outcomes for patients,” the authors said.

Patients with C2 tumors were more than twice as likely (hazard ratio, 2.32) to die from their cervical cancer at any point during the follow-up period – up to 21 years – than those with C1 tumors. In terms of 5-year disease-specific survival, the rates were 79% survival for C1 and 66% survival for C2, the authors pointed out.
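
To illustrate how such survival figures are typically derived, here is a hedged Python sketch using the lifelines package on synthetic data shaped loosely like the comparison above: Kaplan-Meier curves give the 5-year survival per subgroup, and a Cox model gives the hazard ratio for C2 versus C1. The numbers it prints are not the study's results.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Synthetic survival data (times in years; 'c2' = 1 for C2-like tumors). Illustrative only.
rng = np.random.default_rng(3)
n = 313
c2 = rng.binomial(1, 0.25, n)
time = rng.exponential(scale=np.where(c2 == 1, 12, 25), size=n).clip(max=21)
event = rng.binomial(1, np.where(time < 21, 0.6, 0.0))   # censor anyone reaching 21 years
df = pd.DataFrame({"time": time, "event": event, "c2": c2})

# Kaplan-Meier disease-specific survival at 5 years, per subgroup.
for name, grp in df.groupby("c2"):
    kmf = KaplanMeierFitter().fit(grp["time"], event_observed=grp["event"])
    s5 = float(kmf.survival_function_at_times(5.0).iloc[0])
    print(f"C{'2' if name else '1'}: 5-year survival ~ {s5:.0%}")

# Hazard ratio for C2 vs C1 from a Cox proportional hazards model.
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_["c2"])
```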

They highlighted that the difference in outcomes between patients with C1 and C2 tumors was very similar across the U.S. and European cohorts.

Kerry Chester, PhD, professor of molecular medicine at UCL Cancer Institute, and coauthor, said: “Inclusion of patient cohorts in Norway and Austria, for which highly detailed clinical information was available to complement the molecular data, were key factors in the success of the study.”

Analyzing a further cohort of 94 Ugandan CSCC cases, the team found that C2 tumors were much more common than C1 tumors in patients who were also HIV-positive, “underlining the link to a weaker antitumor immune response” in this group.
 

Molecular subtyping offers better prognostic information

Cervical cancer can be caused by at least 12 different ‘high-risk’ HPV types, and there have been conflicting reports as to whether the HPV type present in a cervical cancer influences the prognosis for the patient. CSCCs can now also be categorized into two subtypes, C1 and C2, the authors explained, among which C1 tumors have a more favorable outcome.

“Although HPV16 is more likely to cause C1 tumors and HPV18 C2 tumors, HPV type is not an independent predictor of prognosis, suggesting it is the tumor type rather than the causative HPV type that is critical for the disease outcome,” they highlighted.

“Intriguingly, the C1/C2 grouping appeared to be more informative than the type of HPV present,” they added. “While certain HPV types were found more commonly in either C1 or C2 tumors, prognosis was linked to the group to which the tumor could be assigned, rather than the HPV type it contained.”

The reason that HPV16 and other alpha-9 HPV types have been associated with more favorable outcomes was possibly that these viruses are “more likely to cause C1-type tumors,” the authors suggested. Although larger numbers are needed for robust within-stage comparisons of C1 and C2 tumors, “we observe a clear trend in the survival rates between C1 and C2 by stage,” they said.

Taking molecular (C1/C2) subtyping into account may allow for more “accurate prognostication” than current staging and potentially different clinical management of patients with C1 versus C2 tumors, the authors said. This could include the identification of patients at risk of relapse who may require further adjuvant therapy after completion of up-front therapy.
 

New therapeutic targets

Dr. Fenton highlighted that the study findings suggested that determining whether a patient has a C1 or a C2 cervical cancer could help in planning their treatment, since it appeared to provide “additional prognostic information beyond that gained from clinical staging.” Given the differences in the antitumor immune response observed in C1 and C2 tumors, this classification might also be useful in predicting which patients are likely to benefit from emerging immunotherapy drugs, he said.

The findings also showed that CSCC can develop along “two trajectories” associated with differing clinical behavior, which can be identified using defined gene expression or DNA methylation signatures, and this may guide “improved clinical management of cervical cancer patients,” the authors said.

“This collaborative multidisciplinary research is a major step forward in our understanding of cervical cancer,” said Dr. Chester. “Through careful molecular profiling and genetic analysis of cervical cancer tumors we have gained valuable new insight into the tumor microenvironment and factors potentially making the cancer less aggressive in some patients.”

The authors expressed hope that their study findings will stimulate functional studies of genes and their role in cervical cancer pathogenesis, potentially enabling identification of new therapeutic targets.

The study was funded by the Debbie Fund (a UCL postgraduate research scholarship), the Rosetrees Trust, Cancer Research UK, the Biotechnology and Biological Sciences Research Council (BBSRC), the Royal Society, the Global Challenges Doctoral Centre at the University of Kent, the MRC, PCUK, TUF, Orchid, and the UCLH BRC. The authors declared no competing interests.

A version of this article first appeared on Medscape UK.

Gene ‘cut-and-paste’ treatment could offer hope for inherited immune system diseases

Article Type
Changed
Thu, 10/27/2022 - 12:43

An “exciting” new gene-editing strategy means those born with a rare inherited disease of the immune system could be treated by repairing a fault in their cells.

Scientists have hailed new research that found faulty cells responsible for the immune system disease CTLA-4 insufficiency can be repaired with a pioneering gene editing technique.

CTLA-4 is a protein produced by T cells that helps to control the activity of the immune system. Most people carry two working copies of the gene responsible for producing CTLA-4, but those who have only one functional copy produce too little of the protein to sufficiently regulate the immune system.

For patients with the condition, CTLA-4 insufficiency causes regulatory T cells to function abnormally, leading to severe autoimmunity. The authors explained that the condition also affects effector T cells and thereby “hampers their immune system’s ‘memory,’ ” meaning patients can “struggle to fight off recurring infections by the same viruses and bacteria.” In some cases, it can also lead to lymphomas.
 

Gene editing to ‘cut’ out faulty genes and ‘paste’ in ‘corrected’ ones

The research, published in Science Translational Medicine and led by scientists from University College London, demonstrated in human cells and in mice that the cell fault can be repaired.

The scientists used “cut-and-paste” gene-editing techniques. First, they used the CRISPR/Cas9 system to target the faulty gene in human T cells taken from patients with CTLA-4 insufficiency and snip the faulty CTLA-4 gene in two. Then, to repair the error, a corrected DNA sequence – delivered to the cell using a modified virus – was pasted over the faulty part of the gene using a cellular DNA repair mechanism known as homology-directed repair.
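
As a purely conceptual illustration of the “cut-and-paste” idea, the toy Python model below simulates cutting a sequence at a guide-matched site and patching in a corrected template. It is a string-manipulation sketch only: the sequences, the guide, and the “fault” are invented, and this is not a bioinformatics tool or the authors' protocol.

```python
def simulate_cut_and_paste(genomic: str, guide: str, faulty: str, corrected: str) -> str:
    """Return the sequence after cutting at the guide site and repairing with a template."""
    cut_site = genomic.find(guide)
    if cut_site == -1:
        raise ValueError("guide does not match the target sequence")
    # Cas9 cuts within the protospacer; here we simply split at the guide for clarity.
    left, right = genomic[:cut_site], genomic[cut_site:]
    # Homology-directed repair (simulated): the faulty stretch near the cut is replaced
    # by the corrected template, leaving the flanking sequence untouched.
    return left + right.replace(faulty, corrected, 1)

# Toy example: a single-base "mutation" (G -> A) inside an exon-like stretch.
wild_type = "ATGGTACTTCTAGGCTCCGGTACCAA"
mutant    = "ATGGTACTTCTAGACTCCGGTACCAA"
guide     = "CTTCTAGACTCC"               # matches the mutant allele only
repaired  = simulate_cut_and_paste(mutant, guide, faulty="TAGACTCC", corrected="TAGGCTCC")
assert repaired == wild_type
print(repaired)
```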

The authors explained that this allowed them to “preserve” important sequences within the CTLA-4 gene – known as the intron – that allow it to be switched on and off by the cell only when needed. 

The outcome was “restored levels of CTLA-4 in the cells to those seen in healthy T cells,” the authors said.

Claire Booth, PhD, Mahboubian professor of gene therapy and pediatric immunology, UCL Great Ormond Street Institute of Child Health, and co–senior author, said that it was “really exciting” to think about taking this treatment forward to patients. “If we can improve their symptoms and reduce their risk of getting lymphoproliferative disease this will be a major step forward.”

In addition, the researchers were also able to improve symptoms of the disease in mice with CTLA-4 insufficiency by giving them injections of gene-edited T cells.
 

Technique may help tackle many conditions

The current standard treatment for CTLA-4 insufficiency is a bone marrow transplant to replace the stem cells responsible for producing T cells. However, “transplants are risky” and require high doses of chemotherapy and many weeks in hospital, the authors explained. “Older patients with CTLA-4 insufficiency are typically not well enough to tolerate the transplant procedure.”

Dr. Booth highlighted that the approach has many “positive aspects.” By correcting the patient’s T cells, “we think it can improve many of the symptoms of the disease,” she said, adding that this new approach is much less toxic than a bone marrow transplant. “Collecting the T cells is easier and correcting the T cells is easier. With this approach the amount of time in hospital the patients would need would be far less.”

Emma Morris, PhD, professor of clinical cell and gene therapy and director of UCL’s division of infection and immunity, and co–senior author, said: “Genes that play critical roles in controlling immune responses are not switched on all the time and are very tightly regulated. The technique we have used allows us to leave the natural (endogenous) mechanisms controlling gene expression intact, at the same time as correcting the mistake in the gene itself.”

The researchers explained that, although CTLA-4 insufficiency is rare, the gene-editing therapy serves as a proof of principle for an approach that could be adapted to tackle other conditions. 

“It’s a way of correcting genetic mutations that could potentially be applicable for other diseases,” suggested Dr. Morris. “The bigger picture is it allows us to correct genes that are dysregulated or overactive, but also allows us to understand much more about gene expression and gene regulation.”

The study was funded by the Wellcome Trust, the Association for Molecular Pathology, the Medical Research Council, Alzheimer’s Research UK, and the UCLH/UCL NIHR Biomedical Research Centre. Dr. Morris is a founder shareholder of Quell Therapeutics and has received honoraria from Orchard Therapeutics, GlaxoSmithKline, and AstraZeneca. Dr. Booth has performed ad hoc consulting in the past 3 years for SOBI and Novartis and educational material production for SOBI and Chiesi. A patent on the intronic gene editing approach has been filed in the UK. The other authors declared that they have no competing interests.

A version of this article first appeared on Medscape UK.

Dementia signs detected years before diagnosis

Article Type
Changed
Wed, 11/23/2022 - 19:29

U.K. scientists show it is possible to spot signs of brain impairment in patients as early as 9 years before they receive a diagnosis of dementia, offering hope for interventions to reduce the risk of the disease developing.

To date it has been unclear whether it might be possible to detect changes in brain function before the onset of symptoms, so researchers at the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust set out to determine whether people who developed a range of neurodegenerative diagnoses demonstrated reduced cognitive function at their baseline assessment.

The authors explained: “The pathophysiological processes of neurodegenerative diseases begin years before diagnosis. However, prediagnostic changes in cognition and physical function are poorly understood, especially in sporadic neurodegenerative disease.”
 

Prediagnostic cognitive and functional impairment identified

The researchers analyzed data from the UK Biobank, comparing cognitive and functional measures – including problem solving, memory, reaction times, and grip strength, as well as data on weight loss and gain and on the number of falls – between individuals who subsequently developed one of a number of dementia-related diseases (Alzheimer’s disease, Parkinson’s disease, frontotemporal dementia, progressive supranuclear palsy, dementia with Lewy bodies, and multiple system atrophy) and those who did not receive a neurodegenerative diagnosis. After adjustment for the effects of age, the same measures were regressed against time to diagnosis. The study was published in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association.
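
For readers interested in the analytic approach, a minimal sketch in Python is shown below. It assumes a simple, hypothetical table of baseline scores, ages, case status, and (for cases) years from assessment to diagnosis; the file name, column names, and use of ordinary least squares are illustrative assumptions, not the study’s actual code.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical baseline data: one row per participant.
    # 'score' = a baseline cognitive measure (e.g., reaction time),
    # 'age' = age at assessment, 'case' = 1 if later diagnosed,
    # 'years_to_dx' = years from baseline to diagnosis (cases only).
    df = pd.read_csv("biobank_baseline.csv")

    # Compare cases with controls on the baseline measure, adjusting for age.
    group_model = smf.ols("score ~ case + age", data=df).fit()
    print(group_model.summary())

    # Among cases, regress the same measure against time to diagnosis
    # (age-adjusted) to look for prediagnostic decline.
    trend_model = smf.ols("score ~ years_to_dx + age", data=df[df["case"] == 1]).fit()
    print(trend_model.summary())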

The researchers found evidence of prediagnostic cognitive impairment and decline with time, particularly in Alzheimer’s disease where those who went on to develop the disease scored more poorly compared with healthy individuals when it came to problem solving tasks, reaction times, remembering lists of numbers, prospective memory, and pair matching. This was also the case for people who developed frontotemporal dementia, the authors said.

Nol Swaddiwudhipong, MB, of the University of Cambridge, and first author, said: “When we looked back at patients’ histories, it became clear that they were showing some cognitive impairment several years before their symptoms became obvious enough to prompt a diagnosis. The impairments were often subtle, but across a number of aspects of cognition.”

Prediagnostic functional impairment and decline were also observed in multiple diseases, the authors said. People who went on to develop Alzheimer’s disease were more likely than healthy adults to have had a fall in the previous 12 months, and those who went on to develop progressive supranuclear palsy (PSP) were more than twice as likely as healthy individuals to have had a fall.

The time between baseline assessment and diagnosis ranged from 4.7 years for dementia with Lewy bodies to 8.3 years for Alzheimer’s disease.

“For every condition studied – including Parkinson’s disease and dementia with Lewy bodies – patients reported poorer overall health at baseline,” said the authors.
 

Potential for new treatments

The study’s findings that cognitive and functional decline occurs “years before symptoms become obvious” in multiple neurodegenerative diseases raise the possibility that, in the future, at-risk patients could be screened to help select those who would benefit from interventions to reduce their risk of developing one of the conditions, or to help identify patients suitable for recruitment to clinical trials of new treatments.

Dr. Swaddiwudhipong emphasized: “This is a step towards us being able to screen people who are at greatest risk – for example, people over 50 or those who have high blood pressure or do not do enough exercise – and intervene at an earlier stage to help them reduce their risk.”

There are currently very few effective treatments for dementia or other forms of neurodegeneration, the authors pointed out, in part because these conditions are often only diagnosed once symptoms appear, whereas the underlying neurodegeneration may have “begun years, even decades, earlier.” This means that by the time patients take part in clinical trials, it may already be too late in the disease process to alter its course, they explained.

Timothy Rittman, BMBS, PhD, department of clinical neurosciences, University of Cambridge, and senior author, explained that the findings could also help identify people who can participate in clinical trials for potential new treatments. “The problem with clinical trials is that by necessity they often recruit patients with a diagnosis, but we know that by this point they are already some way down the road and their condition cannot be stopped. If we can find these individuals early enough, we’ll have a better chance of seeing if the drugs are effective,” he emphasized.

Commenting on the new research, Richard Oakley, PhD, associate director of research at Alzheimer’s Society, said: “Studies like this show the importance in continued investment in dementia research to revolutionize diagnosis and drive new treatments, so one day we will beat dementia.”

The research was funded by the Medical Research Council with support from the NIHR Cambridge Biomedical Research Centre. The authors reported no conflicts of interest.

A version of this article first appeared on Medscape UK.

3-D scaffold could revolutionize diabetes treatment

Article Type
Changed
Fri, 10/07/2022 - 14:06

Researchers have developed a scaffold using 3-D bioprinting that slowly releases antibiotics, offering the hope of revolutionizing treatment of diabetic foot ulcers.

Diabetes is among the top 10 causes of death worldwide, and in the United Kingdom more than 4.9 million people have diabetes, according to Diabetes UK, which said that “if nothing changes, we predict that 5.5 million people will have diabetes in the UK by 2030.”

Diabetic foot ulcers affect approximately one in four diabetic patients. Standard therapies, such as pressure offloading and infection management, are often unsuccessful alone and require the introduction of advanced therapies, such as hydrogel wound dressings, which further increases treatment costs and requires hospitalization, highlighted the authors of the study, “3D bioprinted scaffolds for diabetic wound-healing applications.”

By the time diabetic foot ulcers are identified, “over 50% are already infected and over 70% of cases result in lower limb amputation,” they said.
 

Drug-loaded scaffold

In their study, published in the journal Drug Delivery and Translational Research, and being presented at the Controlled Release Society Workshop, Italy, this week, researchers from Queen’s University Belfast explained that the treatment strategy required for the effective healing of diabetic foot ulcers is a “complex process” requiring several combined therapeutic approaches. As a result, there is a “significant clinical and economic burden” associated with treating diabetic foot ulcers, they said, and these treatments are often unsuccessful, commonly resulting in lower-limb amputation.

Diabetes UK pointed out that diabetes leads to almost 9,600 leg, toe, or foot amputations every year – “That’s 185 a week,” the charity emphasized. 

Recent research has focused on drug-loaded scaffolds to treat diabetic foot ulcers. The scaffold structure is a novel carrier for cell and drug delivery that enhances wound healing, explained the authors.

Dimitrios Lamprou, PhD, professor of biofabrication and advanced manufacturing, Queen’s School of Pharmacy, and corresponding author, explained: “These scaffolds are like windows that enable doctors to monitor the healing constantly. This avoids needing to remove them constantly, which can provoke infection and delay the healing process.”
 

Low-cost treatment alternative

For their proof-of-concept investigation, the researchers made 3-D–bioprinted scaffolds with different designs – honeycomb, square, parallel, triangular, double-parallel – to be used for the sustained release of levofloxacin to the diabetic foot ulcer.

“The ‘frame’ has an antibiotic that helps to ‘kill’ the bacteria infection, and the ‘glass’ that can be prepared by collagen/sodium alginate can contain a growth factor to encourage cell growth. The scaffold has two molecular layers that both play an important role in healing the wound,” explained Dr. Lamprou.

The authors highlighted that the square and parallel designs were created to improve flexibility, and that the repeating-unit nature of these scaffolds would also allow them to be easily cut to the required size to reduce clinical wastage. The triangular and double-parallel designs were created to decrease the available surface area, with the double-parallel design composed of repeating units to deliver the same clinical benefits.

“This proof of concept study demonstrates the innovative potential of bioprinting technologies in fabrication of antibiotic scaffolds for the treatment of diabetic foot ulcers,” said the authors. The chosen scaffold design provided sustained release of antibiotic over 4 weeks to infected diabetic foot ulcers, demonstrated suitable mechanical properties for tissue engineering purposes, and can be easily modified to the size of the wound, they said.

Katie Glover, PhD, Queen’s School of Pharmacy, lead author, said: “Using bioprinting technology, we have developed a scaffold with suitable mechanical properties to treat the wound, which can be easily modified to the size of the wound.”

She added that this provides a “low-cost alternative” to current treatments for diabetic foot ulcers, which could “revolutionize” their treatment. Moreover, it could improve patient outcomes while reducing the economic burden on health services, she said.

A version of this article first appeared on Medscape UK.

Risk factors linked to post–COVID vaccination death identified

Article Type
Changed
Wed, 09/28/2022 - 15:47

People with risk factors associated with COVID-19–related death after coronavirus vaccination should be considered a priority for COVID-19 therapeutics and further booster doses, say U.K. researchers.

The researchers have identified factors that put a person at greater risk of COVID-related death after they have completed both doses of the primary COVID vaccination schedule and a booster dose.

For their research, published in JAMA Network Open, researchers from the Office for National Statistics (ONS); Public Health Scotland; the University of Strathclyde, Glasgow; and the University of Edinburgh used data from the ONS Public linked data set based on the 2011 Census of England, covering 80% of the population of England. The study population included 19,473,570 individuals aged 18-100 years (mean age, 60.8 years; 45.2% men; 92.0% White individuals) living in England who had completed both doses of their primary vaccination schedule and had received their mRNA booster 14 days or more before Dec. 31, 2021. The outcome of interest was time to death involving COVID-19 occurring between Jan. 1 and March 16, 2022.
 

Prioritization of booster doses and COVID-19 treatments

The authors highlighted how it had become “critical” to identify risk factors associated with COVID-19 death in those who had been vaccinated and pointed out that existing evidence was “based on people who have received one or two doses of a COVID-19 vaccine and were infected by the Alpha or Delta variant”. They emphasized that establishing which groups are at increased risk of COVID-19 death after receiving a booster is crucial for the “prioritization of further booster doses and access to COVID-19 therapeutics.”

During the study period the authors found that there were 4,781 (0.02%) deaths involving COVID-19 and 58,020 (0.3%) deaths from other causes. Among those who died of coronavirus, the mean age was 83.3 years, and the authors highlighted how “age was the most important characteristic” associated with the risk of postbooster COVID-19 death. They added that, compared with a 50-year-old, the hazard ratio (HR) for an 80-year-old individual was 31.3 (95% confidence interval, 26.1-37.6).
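
As a rough illustration of how hazard ratios of this kind are estimated, the sketch below fits a Cox proportional hazards model with the Python lifelines package. The file name, column names, and covariates are hypothetical stand-ins, and the use of a Cox model is an assumption based on the reported HRs; the published analysis used the ONS linked data set with a much fuller set of covariates.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical person-level data: follow-up time, an event flag for
    # COVID-19 death, and a handful of baseline covariates.
    # Expected columns: time, covid_death, age, female, care_home, deprived_area
    df = pd.read_csv("boosted_cohort.csv")

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="covid_death")

    # Exponentiated coefficients are the hazard ratios with 95% CIs,
    # analogous to the HRs reported in the study (e.g., ~0.52 for women).
    cph.print_summary()
    print(cph.hazard_ratios_)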

They found that women were at lower risk than men with an HR of 0.52 (95% CI, 0.49-0.55). An increased risk of COVID-19 death was also associated with living in a care home or in a socioeconomically deprived area.

Of note, they said that “there was no association between the risk of COVID-19 death and ethnicity, except for those of Indian background”, who they explained were at slightly elevated risk, compared with White individuals. However, they explained how the association with ethnicity was “unclear and differed from previous studies”, with their findings likely to be due “largely to the pronounced differences in vaccination uptake” between ethnic groups in previous studies.
 

Dementia concern

With regard to existing health conditions, the authors commented that “most of the QCovid risk groups were associated with an increased HR of postbooster breakthrough death, except for congenital heart disease, asthma, and prior fracture.”

Risk was particularly elevated, they said, for people with severe combined immunodeficiency (HR, 6.2; 95% CI, 3.3-11.5), and they also identified several conditions associated with HRs of greater than 3, including dementia.

In July, Alzheimer’s Research UK urged the Government to boost the development and deployment of new dementia treatments having found that a significant proportion of people who died of COVID-19 in 2020 and 2021 were living with the condition. At the time, data published by the ONS of deaths caused by coronavirus in England and Wales in 2021 showed dementia to be the second-most common pre-existing condition.

David Thomas, head of policy at Alzheimer’s Research UK, said: “We’ve known for some time that people with dementia have been hit disproportionately hard during the pandemic, but this new data serves as a stark reminder of the growing challenge we face in tackling the condition, and the urgent need to address it.”

The authors of the new research acknowledged the study’s limitations, notably that it included only data for people living in England who were enumerated in the 2011 Census of England and Wales.

However, subpopulations “remain at increased risk of COVID-19 fatality” after receiving a booster vaccine during the Omicron wave, they pointed out.

“The subpopulations with the highest risk should be considered a priority for COVID-19 therapeutics and further booster doses,” they urged.

A version of this article first appeared on Medscape UK.
