Control Infectious Diseases at Source, Not at Border

Countries should implement international health regulations that seek to control infectious diseases at their source rather than at national borders, and expand their cooperation on surveillance, knowledge, system building, and training, the World Health Organization said.

In its 2007 World Health Report, titled “A Safer Future: Global Public Health Security in the 21st Century,” WHO said international travel and communications mean that countries cannot suppress information about infectious disease or prevent its spread beyond their borders. This fact makes it imperative that countries embrace the revision to international health regulations published in 2005.

Since 1967, at least 39 new infectious diseases have emerged, including HIV and severe acute respiratory syndrome, while older pathogens such as influenza and tuberculosis have reemerged as threats because of health system complacency or misuse of antimicrobials. With easier international travel and communications, the threat of pathogens to international security is as great as ever, the report says.

“Given today's universal vulnerability to these threats, better security calls for global solidarity,” Dr. Margaret Chan, WHO's director-general, said in a written statement. “International public health security is both a collective aspiration and a mutual responsibility. The new watchwords are diplomacy, cooperation, transparency, and preparedness.”

The 2005 international health regulations enable public health officials to seek to control infectious disease outbreaks at their source, rather than focusing on border control at airports and seaports to avert importation of disease, and allow them to rely on information from sources other than governments to identify and monitor such disease outbreaks.

By seeking transparency from member governments, the regulations aim to reduce both the toll in human sickness and death and the economic impact that rumors can have on a country believed to have a disease outbreak.

“Instant electronic communication means that disease outbreaks can no longer be kept secret, as was often the case during the implementation of the previous international health regulations. Governments were unwilling to report outbreaks because of the potential damage to their economies through disruptions in trade, travel, and tourism,” the report says.

Gaps occur in international public health security because of inadequate investment in public health defenses; unexpected policy changes; conflicts that force people to flee to overcrowded, unhygienic, and impoverished conditions; microbial evolution; and agricultural practices, the report states.

The report also focuses on the public health effects of environmental events, foodborne diseases, and accidental or deliberate chemical, radioactive, or biologic incidents, many of which now fall under the definition of events requiring an international public health response.

Breathing, Relaxation Improve Asthma Patients' Symptoms

Breathing and relaxation training added to usual asthma treatment improved patients' respiratory symptoms, dysfunctional breathing, and mood better than did usual asthma care alone, according to a British randomized controlled trial.

The researchers studied the effect of exercises collectively known as the Papworth method on asthma patients in a general practice in the town of Welwyn in Hertfordshire, England.

The Papworth method has five components: breathing training (including development of proper breathing patterns and elimination of hyperventilation and “mouth-breathing” habits); education; general and specific relaxation training; integration of breathing and relaxation techniques into daily living activities, including speech; and home exercises with reminders of the techniques, the researchers wrote.

A total of 85 patients aged 16-70, most of whom had mild asthma or symptoms that were well controlled by medication, were randomized into a control group or an intervention group that received five sessions of treatment using the Papworth method (Thorax 2007 June 28 [Epub doi:10.1136/thx.2006.076430]). Both groups continued to receive routine asthma medication and education during the study.

At 6 months, in the per-protocol analysis, mean scores on the St. George's Respiratory Questionnaire symptoms scale dropped from 42.9 to 21.8 in the intervention group, compared with a drop from 35.1 to 32.8 in the control group. At 12 months, the scores were 24.9 for the intervention group and 33.5 for controls, the researchers found. The changes were significantly greater in the intervention group than in the control group.

The researchers also found significant improvements in intervention group versus control group scores on the Hospital Anxiety and Depression Scale anxiety and depression components, as well as in scores for hypocapnic symptoms on the Nijmegen questionnaire.

“These results support the hypothesis that the Papworth method ameliorates respiratory symptoms and improves quality of life in a general practice population of patients diagnosed with asthma,” wrote the researchers, Elizabeth Holloway and Robert West of University College London's department of epidemiology and public health. “To our knowledge, this is the first evidence from a controlled trial to demonstrate the effectiveness of the Papworth method.”

Papworth method patients significantly improved their relaxed breathing rate over 10 minutes compared with the usual care group. The researchers did not find any significant improvement in spirometric parameters for the intervention patients, however.

“The fact that no significant change was observed in objective measures of lung function suggests that the Papworth method does not improve the chronic underlying physiological causes of asthma, but rather their manifestation,” they wrote.

The researchers acknowledged that their study did not track medication use or changes in medication use, nor did it compare the Papworth method with other treatments that go beyond usual care. In addition, the researchers said they could draw no conclusions about the Papworth method's effectiveness for those with severe asthma, because such patients were not included in the study.

Parkinson's Risk Increases With Greater Pesticide Exposure

Exposure to pesticides is associated with a significantly increased risk of developing Parkinson's disease and other degenerative parkinsonian syndromes, a large European multicenter study shows.

Moreover, the risk of Parkinson's rose with the level of pesticide exposure, suggesting a cause-and-effect relationship. “Many previous studies have found such an association, but few have established an exposure-response relationship,” investigators wrote in their report, which was published online in May in Occupational and Environmental Medicine.

Having ever been knocked unconscious and having a first-degree family history of Parkinson's disease were each also significantly associated with increased Parkinson's risk, wrote Dr. Finlay Dick of the department of environmental and occupational medicine at Aberdeen (Scotland) University and associates.

The researchers enrolled 959 patients with parkinsonism, including 767 with Parkinson's disease, from centers in Italy, Malta, Romania, Scotland, and Sweden, along with 1,989 age- and gender-matched controls from clinics or the community at each site. Subjects completed a questionnaire about lifetime occupational and hobby exposure to solvents, pesticides, iron, copper, and manganese.

Adjusted logistic regression analysis showed that the strongest association was among patients with a first-degree family history of Parkinson's disease (odds ratio 4.85).

Significant dose-response associations were seen between the development of Parkinson's disease/parkinsonism and exposure to pesticides (odds ratio 1.13 for low exposure vs. no exposure; OR 1.41 for high vs. no exposure) and ever having been knocked unconscious (OR 1.35 for once vs. never; OR 2.53 for more than once vs. never). The researchers stressed that the study did not make clear whether head injuries occurred before disease onset, adding that the association might be due to recall bias or an increased risk of falls in Parkinson's disease. “Head injury has previously been linked to an increased risk of Parkinson's disease, but the results have been inconsistent,” they noted (Occup. Environ. Med. 2007 May 30 [Epub doi:10.1136/oem.2006.027003]).

Patients who took medication for depression (OR 1.92) or anxiety (OR 1.95), or sleeping pills (OR 1.33), for more than 1 year also were at significantly elevated risk for developing Parkinson's disease and parkinsonism. “One explanation for this finding is that depression has been associated with an increased risk of Parkinson's disease later in life. The use of psychotropic medication may simply reflect the well recognized psychiatric effects of Parkinson's disease,” the researchers wrote.

Gene Therapy Helps Parkinson's in Phase I Trial

Gene therapy for Parkinson's disease was safe and well tolerated by 11 patients, who also showed significant improvement in motor function at 1-year follow-up in an open-label phase I trial.

The 11 patients were treated at New York-Presbyterian Hospital, New York, with a therapy aimed at inhibiting the neurologic stimulation that causes motor dysfunction in Parkinson's disease patients. To accomplish this goal, surgeons delivered the glutamic acid decarboxylase gene to the neurons of the subthalamic nucleus using adeno-associated virus (AAV); no adverse events occurred.

At 1 year after treatment, the researchers found a statistically significant improvement in scores on the 56-point motor component of the Unified Parkinson's Disease Rating Scale (UPDRS): scores improved by 24% when patients were tested 12 hours after withdrawal of medication, and by 27% an hour after patients had taken medication. Statistically significant improvements in scores were also recorded at 3 and 6 months (Lancet 2007;369:2097–105).

“Our results show that AAV-mediated gene transfer can be done safely in the human brain, with no evidence of substantial toxic effects or adverse events in the perioperative period” and for at least 1 year after treatment, wrote the researchers, led by Dr. Michael G. Kaplitt of Cornell University, New York. This open-label, nonrandomized phase I study “was not designed to assess the effectiveness of the intervention. Nonetheless, the clinical outcomes were encouraging.”

Should further research support this treatment for Parkinson's, it would have an advantage over deep-brain stimulation, which is being used to improve motor function, the researchers wrote.

“The absence of indwelling hardware reduces the risk of infection, and some patients with Parkinson's disease simply prefer not to have an implanted device,” they wrote. “Additionally, frequent visits for deep brain stimulation adjustments are not needed” with the investigational approach.

In an accompanying commentary, Dr. A. Jon Stoessl, of the Pacific Parkinson's Research Centre at the University of British Columbia, Vancouver, questioned whether the development of a gene-therapy approach would be superior to deep-brain stimulation.

“Apart from the avoidance of stimulator adjustments and potential hardware problems, what is the real advantage of this approach?” Dr. Stoessl wrote. He cautioned that the research did not study the long-term effect of changing the neurologic pathways. But he praised the study and wrote that the approach should be subjected to further randomized, double-blind evaluation.

Because of ethical concerns, the researchers were restricted to using the treatment in only the more symptomatic hemisphere of the brain. On the UPDRS, they recorded greater improvements in motor function on the side of the body contralateral to the treated hemisphere than on the side corresponding to the untreated hemisphere.

In addition, although they did not record any improvements in the activities of daily living scores during the course of the study, at 12 months they measured a trend toward improvement in the off-medication state.

The researchers performed PET scans on the patients at 12 months, and found substantial reductions in glucose metabolism in the thalamus and overall in the operated hemisphere, a change that they did not detect on the untreated side.

Disparities in Diabetes Treatment Remain After Britain Introduces P4P

Pay for performance significantly increased the percentage of patients achieving some diabetes treatment goals in one London primary care trust, but did not improve disparities between white British and ethnic minority patients, according to an observational study.

Researchers from Imperial College London, the University of Leicester, and Wandsworth Primary Care Research Center, London, reviewed electronic general practice records for 4,284 adult type 1 and 2 diabetes patients enrolled in 32 practices in Wandsworth Primary Care Trust before and after the National Health Service implemented a pay-for-performance contract with general practitioners in 2004 (PLoS Med. 2007 June 12 [Epub doi:10.1371/journal.pmed.0040191]). National treatment targets for diabetes patients included HbA1c less than or equal to 7.0%, blood pressure less than 140/80 mm Hg, and total cholesterol less than or equal to 193 mg/dL.

In terms of cholesterol control, significantly more patients met the treatment target for total cholesterol in 2005, compared with 2003 (70.4% vs. 57.5%, respectively). Improvements were uniform across all ethnic groups except for the Bangladeshi group, which had significantly greater improvement in cholesterol control relative to the white British group after adjusting for age, gender, deprivation, and practice-level clustering. Similarly, there were also large improvements in hypertension control across all ethnic groups except for black Caribbean patients, who had significantly less improvement in blood pressure control, compared with white British patients after adjusting for age, gender, deprivation, and practice-level clustering, the researchers wrote.

They also found no significant difference in the percentage increase of white British or minority patients prescribed angiotensin-converting enzyme inhibitors and lipid-lowering drugs, although black African patients remained significantly behind white British patients in terms of the percentage prescribed lipid-lowering drugs in 2005 (63.9% vs. 48.8%).

Researchers also found that significantly more patients reached recommended levels of HbA1c in 2005, compared with 2003 (37.4% vs. 35.1%), except for black Caribbean patients, who had significantly less improvement, compared with the white British group. They also found that, compared with white British patients in 2005, black Caribbean, black African, Indian, and Pakistani patients had significantly greater percentages prescribed oral hypoglycemic agents. And, while significantly more patients overall were treated with insulin in 2005, compared with 2003, the increases in insulin prescribing were significantly lower in the black African and south Asian groups, compared with the white British group.

“Although diabetes management improved in all ethnic groups after the introduction of pay-for-performance incentives in UK primary care, disparities in prescribing and intermediate clinical outcomes persisted,” wrote the researchers, led by Christopher Millett, of the Imperial College London's department of primary care and social medicine. “Hence, the main lesson from this study for health-care systems in other countries is that pay-for-performance by itself may not be sufficient to address ethnic disparities in the quality of care.”

The authors acknowledged that they were unable to definitively link changes in diabetes management to the pay-for-performance program, in part because their study design did not allow for a control group.

Aspirin's Chemopreventive Effects Seen 10 Years After Tx Initiation

Taking 300 mg of aspirin daily for at least 5 years was shown to prevent colorectal cancer in an analysis of two large randomized trials. The effect was seen beginning 10 years after treatment was initiated.

Although this strategy might be effective in certain high-risk groups, further research is needed to elucidate the risks and benefits of aspirin chemoprevention in various clinical settings, the researchers wrote. The effectiveness of colonoscopy screening and the risk of bleeding complications with long-term aspirin use also should be considered, they noted.

Dr. Andrew Chan of the gastrointestinal unit at Massachusetts General Hospital, Boston, agreed in an accompanying commentary. “These findings are not sufficient to warrant a recommendation for the general population to use aspirin for cancer prevention,” he wrote (Lancet 2007;369:1577–8).

Previous observational studies had reported a decreased incidence of colorectal cancers in regular users of aspirin, but two large trials did not demonstrate a decreased risk over 10 years of follow-up. Longer follow-up is needed, given that 10–15 years typically elapse between the initial development of an adenoma and the occurrence of colorectal cancer, wrote Dr. Peter Rothwell, professor of clinical neurology at the University of Oxford (England), and associates (Lancet 2007;369:1603–13).

Their analysis focused on the British Doctors Aspirin Trial and the UK Transient Ischaemic Attack Aspirin Trial; there was a median follow-up of 23 years in both trials.

During that follow-up period, subjects who took at least 300 mg of aspirin a day for at least 5 years were significantly less likely to develop colorectal cancer than were controls (hazard ratio [HR] 0.63), according to a pooled analysis of the two trials. The researchers found no significant effect on any other type of cancer.

The preventive effect was strongest in years 10–19, when the HR for aspirin users was 0.60, but a significantly reduced HR of 0.74 was seen in years 20 and later for the subjects who took aspirin. No significant preventive effect was seen at 0–9 years (HR 0.92).

The British Doctors Aspirin Trial randomized doctors in 1978 and 1979 into a group of 3,429 taking a daily dose of 500 mg of aspirin and a control group of 1,710 who took nothing. Treatment continued for 5–6 years.

The UK Transient Ischaemic Attack Aspirin Trial randomized 2,449 patients over age 40 who had already had a transient ischemic attack or mild ischemic stroke to receive daily doses of either 1,200 mg or 300 mg of aspirin, or placebo. Recruitment took place between 1979 and 1985, with the trial ending in 1986. The researchers performed a subgroup analysis of only those patients who took aspirin for at least 5 years.

The researchers identified trial participants who had developed cancer through cancer registries and death certificates.

Taking 300 mg of aspirin a day for 5 years reduces the risk of colorectal cancer. PhotoDisc, Inc.

Article PDF
Author and Disclosure Information

Publications
Topics
Author and Disclosure Information

Author and Disclosure Information

Article PDF
Article PDF



Aspirin's Chemopreventive Effects Seen at 10 Years

Taking 300 mg of aspirin daily for at least 5 years was shown to prevent colorectal cancer in an analysis of two large randomized trials. The effect was seen beginning 10 years after treatment was initiated.

Although this strategy might be effective in certain high-risk groups, further research is needed to elucidate the risks and benefits of aspirin chemoprevention in various clinical settings, the researchers wrote. The effectiveness of colonoscopy screening and the risk of bleeding complications with long-term aspirin use also should be considered, they noted.

Dr. Andrew Chan of the gastrointestinal unit at Massachusetts General Hospital, Boston, concurred in an accompanying commentary. “These findings are not sufficient to warrant a recommendation for the general population to use aspirin for cancer prevention,” he wrote (Lancet 2007;369:1577–8).

Previous observational studies had reported a decreased incidence of colorectal cancer in regular aspirin users, but two large trials did not demonstrate a decreased risk over 10 years of follow-up. Longer follow-up is needed, given that 10–15 years typically elapse between the initiation of an adenoma and the development of colorectal cancer, wrote Dr. Peter Rothwell, professor of clinical neurology at the University of Oxford (England), and associates (Lancet 2007;369:1603–13).

Their analysis focused on the British Doctors Aspirin Trial and the UK Transient Ischaemic Attack Aspirin Trial; there was a median follow-up of 23 years in both trials.

During that follow-up period, subjects who took at least 300 mg of aspirin a day for at least 5 years were significantly less likely to develop colorectal cancer than were controls (hazard ratio 0.63), according to a pooled analysis of the two trials. The researchers found no significant effect on any other type of cancer.

The chemopreventive effect was strongest in years 10–19, when the hazard ratio for aspirin users was 0.60, but a significantly reduced hazard ratio of 0.74 was seen in years 20 and later for the subjects who took aspirin. No significant chemopreventive effect was seen at 0–9 years (hazard ratio 0.92).
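A hazard ratio below 1 indicates a lower instantaneous risk of the event in the aspirin group than in controls; the conversion to a relative reduction shown below is a standard illustration, not a figure reported by the trialists.

\[
\text{relative hazard reduction} = 1 - \mathrm{HR}:\qquad 1 - 0.63 = 0.37\ (37\%\ \text{overall}),\qquad 1 - 0.60 = 0.40\ (40\%\ \text{in years 10–19}).
\]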

The British Doctors Aspirin Trial randomized doctors in 1978 and 1979 into a group of 3,429 taking a daily dose of 500 mg of aspirin and a control group of 1,710 who took nothing. Treatment continued for 5–6 years.

The UK Transient Ischaemic Attack Aspirin Trial randomized 2,449 patients over age 40 who had already experienced a transient ischemic attack or mild ischemic stroke to receive daily doses of either 1,200 mg of aspirin, 300 mg of aspirin, or placebo.

Recruitment took place between 1979 and 1985, with the trial ending in 1986. The researchers performed a subgroup analysis of only those patients who took aspirin for at least 5 years.

Using cancer registries and death certificates, the researchers identified the trial participants who had developed cancer.



Hydrolyzed Formula After Breast-Feeding May Cut Atopy Risk

Babies who have been breast-fed for 4 months and then receive certain types of hydrolyzed formula have a significantly lower risk of developing atopic dermatitis, compared with those given a cow's milk-based formula after breast-feeding, according to results of a 3-year randomized German study of more than 2,000 babies.

One case of atopic dermatitis (AD) could be averted if 20–25 babies were fed one of two types of hydrolyzed formula rather than a cow's milk-based formula. Use of the hydrolyzed formulas did not affect the incidence of asthma, however.

“The preventive effect [against AD] developed in the first year and persisted into the third year, indicating real disease reduction rather than postponement of disease onset,” wrote the researchers, led by Dr. Andrea von Berg from the pediatrics department at Marien-Hospital, Wesel, Germany.

“Although it remains controversial whether breast-feeding reduces the risk for allergy in high-risk infants, breast-feeding is the gold standard for infant nutrition,” they wrote. “It was therefore not the goal of our study to question this gold standard and show that hydrolyzates are worse or better. Instead, we wanted to evaluate, in case of formula feeding (for whatever reason), which formula would be the best alternative to reduce the risk for (allergic manifestations).”

The researchers enrolled 2,252 infants who had at least one parent or sibling with an atopic syndrome. The infants were randomized to one of three hydrolyzed formulas (an extensively hydrolyzed casein formula, a partially hydrolyzed whey formula, or an extensively hydrolyzed whey formula) or to a cow's milk-based formula. An observational arm of 889 exclusively breast-fed babies was also included (J. Allergy Clin. Immunol. 2007;119:718–25).

Infants were exclusively breast-fed during the first 4 months, with the introduction of solid food postponed until after 4 months. Researchers tracked diagnoses of AD, urticaria, food allergies, and asthma.

After 3 years, 904 babies on formula and 543 babies in the breast-feeding arm remained in the study population.

Compared with those in the cow's milk-based formula group, infants fed the partially hydrolyzed whey formula (odds ratio 0.57) and those fed the extensively hydrolyzed casein formula (odds ratio 0.43) demonstrated at 1 year a significantly reduced risk of developing any of the allergic manifestations studied, after adjustment for family history of AD and asthma, sex, and maternal smoking.

By the third year, that effect was gone for allergic conditions as a whole, but the protective effect persisted to 3 years for AD. The 3-year cumulative risk of developing AD was lower in children fed the partially hydrolyzed whey formula (odds ratio 0.60) and those fed extensively hydrolyzed casein formula (odds ratio 0.53), compared with those in the cow's milk-based formula group.

Analyzing outcomes based on family history, the only significant effect identified was among those with a family history of AD who were fed extensively hydrolyzed casein formulas; such babies were at lower risk of AD than those given cow's milk-based formula (odds ratio 0.53).

“This is indeed the first study to suggest that the allergic phenotype in the family rather than a biparental family history modifies the effect of nutritional intervention and may be considered when deciding which hydrolyzate should be given,” the researchers wrote. On an intention-to-treat basis, feeding the extensively hydrolyzed casein formula to 20 infants, or the partially hydrolyzed whey formula to 25 infants, averts a single case of AD. In the smaller group with a family history of AD, the corresponding numbers were 11 and 51.
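As an illustrative aside (the arithmetic is standard, not taken from the paper), a number needed to treat is the reciprocal of the absolute risk reduction, so the reported values correspond to the following differences in 3-year AD risk between formula groups:

\[
\mathrm{NNT} = \frac{1}{\mathrm{ARR}}:\qquad \mathrm{NNT} = 20 \Rightarrow \mathrm{ARR} = 0.05\ (5\ \text{percentage points}),\qquad \mathrm{NNT} = 25 \Rightarrow \mathrm{ARR} = 0.04\ (4\ \text{percentage points}).
\]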


Alendronate Praised for Osteoporosis Prevention

Alendronate is the only proven cost-effective medication for initiation of primary or secondary prevention of osteoporosis, according to draft assessments issued March 5 by the agency that determines which drugs the National Health Service uses in England and Wales.

If affirmed later this year, the National Institute for Health and Clinical Excellence (NICE) draft document would rule out the use of strontium ranelate, etidronate, risedronate, and raloxifene for primary prevention of osteoporosis in women with at least one clinical risk factor.

For initiation of secondary prevention, a separate NICE document would rule out those four drugs and teriparatide.

NICE found that nonproprietary alendronate, the second-cheapest drug, was as effective as risedronate, etidronate, strontium ranelate, and raloxifene. As a result, NICE's drafts ruled that alendronate was the most cost-effective medication. Only etidronate, at £85.65 ($171) a year, could match alendronate on price. However, the committee questioned the evidence regarding etidronate's effectiveness.

At an annual cost of £95.03 ($190) for once-weekly treatment, generic alendronate for primary prevention in patients at high risk of osteoporosis was estimated to cost £17,632 ($35,288) or less per quality-adjusted life year (QALY), a measure in which one QALY equals a year lived in perfect health, compared with no treatment, according to the NICE committee that appraised the medications.

For initiation of secondary prevention among women with bone mineral density more than 2.5 standard deviations below normal confirmed by bone-density scanning, it was estimated to cost less than £27,422 ($54,881) per quality-adjusted life year, depending on age and treatment strategy, the committee said.
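For context, a cost-per-QALY figure is an incremental cost-effectiveness ratio; the general definition below explains the unit NICE uses and is not a reconstruction of the committee's model:

\[
\mathrm{ICER} = \frac{C_{\text{drug}} - C_{\text{no treatment}}}{\mathrm{QALY}_{\text{drug}} - \mathrm{QALY}_{\text{no treatment}}}
\]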

A NICE document on secondary prevention of osteoporosis issued in January 2005 recommended the use of all three bisphosphonates, raloxifene, and teriparatide for secondary prevention in postmenopausal women who have already had an osteoporosis-related fracture.


Study Links High Beef Consumption in Mothers to Lowered Sperm Counts in Sons

High maternal beef consumption in pregnancy was associated with significantly decreased sperm concentration in adult male offspring, investigators reported in Human Reproduction.

The study, which included 387 fertile men born between 1949 and 1983 and living in the United States, found that sons of women who ate more than seven servings of beef weekly had a mean sperm concentration 24% lower than that of sons of mothers who ate less.

Investigators raised the possibility that the presence of anabolic steroids and other xenobiotics in beef may have affected the men's testicular development in utero, resulting in lowered sperm counts.

They noted that diethylstilbestrol (DES), the first synthetic hormone, was used in cattle from 1954 to 1979 in the United States. After DES was banned, other anabolic hormones continued to be used legally (Human Reprod. 2007 [Epub DOI:10.1093/humrep/dem068]).

Of the offspring of high beef consumers, almost 18% met the World Health Organization threshold for subfertility (20 million sperm/mL of seminal fluid), compared with 5.7% of the sons of mothers who ate seven or fewer servings of beef per week. This was a statistically significant difference, wrote Shanna Swan, Ph.D., director of the Center for Reproductive Epidemiology at the University of Rochester (N.Y.), and her associates.

Between 1999 and 2005, the researchers recruited 773 men born 1949–1983 from five U.S. cities. The men provided semen samples.

Mothers of 387 of the men provided diet information by completing a questionnaire. A total of 26% reported they ate more than seven servings per week of any type of red meat. Thirteen percent said they consumed more than seven servings of beef weekly.

Sons of women who ate more than seven servings of beef per week had sperm concentrations of 43.1 million/mL, compared with 56.9 million/mL in sons whose mothers ate less beef. This 24% difference was statistically significant. Mothers' consumption of other red meat, fish, chicken, and vegetables was unrelated to their sons' sperm concentrations.
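The 24% figure can be checked directly from the reported concentrations; the arithmetic below is shown only as an illustration:

\[
\frac{56.9 - 43.1}{56.9} = \frac{13.8}{56.9} \approx 0.24\ (24\%).
\]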

In addition to the higher proportion of men meeting the WHO definition of subfertility, the sons of the high beef consumers also were significantly more likely to self-report previous subfertility (9.8% vs. 5.7%), after adjustment for age.

The researchers noted that self-reporting of beef consumption is likely to be subject to error. In addition, they noted that the steroids in animal feeds might have affected the men as children or adults, and persistent pesticides and industrial chemicals in meat also might play a role. To clarify the role of steroids, the researchers suggested a study of men born in Europe after 1988, when steroids were banned in beef sold and produced there.

In an editorial accompanying the report, Frederick S. vom Saal, Ph.D., of the University of Missouri, Columbia, noted that although DES was banned in the United States in 1979, “administration of combinations of other hormonally active drugs to beef cattle has continued to be a common practice in the [United States].”

He added that, “if xenobiotics are causally involved, the finding of reduced semen quality should be the 'tip of the iceberg,' and other reproductive pathologies should also be observed.”

Dr. vom Saal urged regulatory bodies to revisit the risks associated with exposure during development to hormonal residues in beef (Human Reprod. 2007 [Epub DOI:10.1093/humrep/dem092]).
