Never Say Never: Surgical Errors Remain a Concern

Universal Protocol is no magic wand

The frequency of surgical complications involving a wrong site or wrong patient remains high, even in the era of the Universal Protocol.

The Joint Commission introduced the Universal Protocol to ensure the correct patient, site, and procedure. Although it became effective July 1, 2004, data on the true incidence of wrong-patient and wrong-site operations, so-called “never events,” remain scarce, according to new research reported in the October issue of Archives of Surgery.

To determine the frequency, root causes, and outcomes of these never events, Dr. Philip F. Stahel of Denver Health Medical Center and the University of Colorado School of Medicine, and colleagues performed a retrospective analysis of the Colorado Physician Insurance Company’s (COPIC’s) comprehensive database (Arch. Surg. 2010;145:978-84).

Dr. Stahel and his colleagues screened 27,370 physician self-reported adverse occurrences between Jan. 1, 2002, and June 1, 2008. The researchers initially identified 119 wrong-site and 29 wrong-patient procedures but eliminated cases they could not confirm as definite wrong-site or wrong-patient events. The final analysis consisted of 107 wrong-site and 25 wrong-patient procedures.

Analysis of root causes found errors in:

  • Diagnosis, a root cause for 14 (56.0%) wrong-patient and 13 (12.1%) wrong-site procedures.
  • Communication, 25 (100%) wrong-patient and 52 (48.6%) wrong-site procedures.
  • Judgment, 2 (8.0%) wrong-patient and 91 (85.0%) wrong-site procedures.
  • Treatment, 22 (88.0%) wrong-patient and 99 (92.5%) wrong-site procedures.

In addition, system issues were a root cause in 21 (84.0%) wrong-patient procedures and 78 (72.9%) wrong-site procedures. This category included time-out not being performed in 77 (72%) wrong-site cases.

Wrong-patient cases often were due to a mix-up of patients’ medical records, radiographs, and laboratory or biopsy samples, as well as errors in communication.

Next, the researchers looked at outcomes, namely:

  • Death, which occurred in 1 patient (0.9%) secondary to a wrong-site procedure.
  • Significant harm, which occurred in 5 (20%) wrong-patient and 38 (35.5%) wrong-site cases.
  • Minimal harm or functional impairment, which occurred in 8 (32%) wrong-patient and 65 (60.7%) wrong-site cases.
  • No-harm event, which occurred in 9 (36%) wrong-patient and 3 (2.8%) wrong-site cases.

The most frequent specialties involved in wrong-patient procedures were internal medicine (24.0% of cases) as well as family or general practice, pathology, urology, obstetrics-gynecology, and pediatrics (8.0% each). The most frequent specialties involved in wrong-site occurrences were orthopedic surgery (22.4% of cases), general surgery (16.8%), and anesthesiology (12.1%).

Overall, nonsurgical specialties were involved in 14 (48.3%) wrong-patient and 29 (27.1%) wrong-site cases.

“The findings from the present study emphasize a continuing and concerning occurrence of wrong-site and wrong-patient procedures in the current era of the Universal Protocol, leading to frequent patient harm and, rarely, patient death,” the researchers said. “Shockingly, nonsurgical disciplines equally contribute to patient injuries related to wrong-site procedures.”

The researchers believe these findings warrant expansion of the Universal Protocol to nonsurgical specialties.

Limitations of the study include the COPIC database's restricted coverage (about 6,000 physicians in Colorado), the potential for subjective bias in determining root causes, and the generic nature of “inadequate planning for the procedure” as a root-cause category.

Although compliance with the Universal Protocol is important, “it is not the magic wand of Merlin,” Dr. Martin A. Makary said in an accompanying editorial commentary. Consider the fact that the Universal Protocol has been in place since 2004, yet Dr. Stahel and his colleagues found that preventable errors, or “never events,” persist at alarming rates, Dr. Makary noted.

Further, the number of wrong-site procedures that this study cites more likely reflects the number of errors reported rather than the actual rate of events. The true number of wrong-site procedures is therefore probably much higher than reported here, Dr. Makary indicated.

He suggested that a more accurate measurement comes from the complication rates and safety culture scores under the National Surgical Quality Improvement Program, or NSQIP. Safety culture scores reflect how comfortable hospital employees are with speaking up about safety concerns. To improve public reporting and benchmarking, hospitals should be required to publicly report their NSQIP outcomes and culture scores, he said (Arch. Surg. 2010;145:984).

Finally, the Universal Protocol, while important, does not relieve hospital systems from emphasizing individual responsibility in preventing surgical errors, concluded Dr. Makary, who is with the department of surgery at Johns Hopkins University, Baltimore.

Coauthors on the research analysis reported the following conflicts: Dr. Ted J. Clarke is the chief executive officer of COPIC; Dr. Jeffrey Varnell and Dr. Alan Lembitz are employed by COPIC; and Dr. Michael S. Victoroff and Dr. Dennis J. Boyle are consultants for COPIC.

Dr. Makary reported that he had no disclosures.

AMD Treatments Don't Increase Risk of Mortality


Newer treatments for neovascular age-related macular degeneration are not associated with increased risks of mortality, myocardial infarction, bleeding, or stroke when compared with older therapies, according to a new study in the October 2010 issue of Archives of Ophthalmology.

The Food and Drug Administration approved ranibizumab (Lucentis) for the treatment of neovascular AMD in June 2006. Since 2005, ophthalmologists have used bevacizumab (Avastin), a cancer treatment that has a similar mechanism to ranibizumab, as an off-label treatment for AMD. The relative safety of these treatments, however, was unknown.

With that in mind, Lesley H. Curtis, Ph.D., and fellow researchers from Duke University in Durham, N.C., conducted a retrospective cohort study of adverse events involving 146,942 Medicare beneficiaries aged 65 years or older whose records they obtained between Jan. 1, 2004, and Dec. 31, 2007 (Arch. Ophthalmol. 2010;128:1273-9).

They identified Medicare patients with a diagnosis of AMD between Jan. 1, 2005, and Dec. 31, 2006, then classified patients into four treatment groups: the control group of 52,256 patients who received photodynamic therapy (PDT), 36,942 who received pegaptanib octasodium (Macugen), 38,718 who received bevacizumab, and 19,026 patients who received ranibizumab.

Specifically, the researchers looked at these AMD therapies and the risks of the following:

All-cause mortality. Cumulative incidence of all-cause mortality was 4.8% in those who received pegaptanib, 4.1% in both the PDT (control) and the ranibizumab groups, and 4.4% in the bevacizumab group, a statistically significant difference.

Myocardial infarction. The risk of MI in the PDT and pegaptanib groups (1.3% each) was slightly higher than, but not statistically significantly different from, the risk in the bevacizumab and ranibizumab groups (1.2% and 1.1%, respectively).

Bleeding events. Rates were 5.8% for the PDT group, 5.9% for the pegaptanib group, 5.5% for the bevacizumab group, and 5.8% for the ranibizumab group – the differences were not statistically significant.

Stroke. Rates were 2% each for the photodynamic therapy and pegaptanib groups, 2.1% for the bevacizumab group, and 1.8% for the ranibizumab group – again, not statistically significant.

After adjusting for baseline characteristics and comorbid conditions, the researchers found significantly lower hazards of mortality with ranibizumab than with PDT or pegaptanib, and a significantly lower risk of MI with ranibizumab vs. PDT. They also found no significant differences in the hazard of mortality or MI between bevacizumab use and other treatments.

The overall tests for differences in bleeding and stroke across treatment groups were not statistically significant.

By the end of the study, clinicians used bevacizumab or ranibizumab as first-line therapy on all neovascular AMD patients. A secondary analysis, limited to new users of bevacizumab or ranibizumab, found significantly lower risk of mortality and stroke with ranibizumab vs. bevacizumab.

Serious adverse systemic effects have not been associated with PDT or pegaptanib in clinical trials, so why did this study find that the risk is lowest with ranibizumab? The researchers dismissed the possibility that recipients of ranibizumab may have been healthier than other subjects or that the risk of thromboembolic events declined over time. They speculated instead that the observed population was large enough to detect rarer adverse reactions, compared with the smaller samples enrolled in the clinical trials.

The study was limited in that therapies were not randomly assigned and there were no untreated patients as a control group, the researchers said. Also, the claims data lacked clinical detail about the severity of comorbid conditions and ophthalmologic outcomes.

Dr. Curtis received research support from a number of pharmaceutical and biotechnology companies, including Novartis Pharmaceuticals Ltd. (manufacturer of ranibizumab), Genentech (manufacturer of bevacizumab), and Pfizer (manufacturer of pegaptanib). Another investigator serves as an investigator/collaborator for clinical research, grantee, and paid scientific adviser/consultant for Genentech and Eyetech. This study was supported by a research agreement between OSI Eyetech and Duke University.

Vitals

Major Finding: Use of bevacizumab and ranibizumab to treat neovascular age-related macular degeneration was not associated with increased risks of mortality, MI, bleeding, or stroke, compared with photodynamic therapy and intravitreal injections of pegaptanib.

Data Source: Retrospective cohort study of 146,942 Medicare beneficiaries aged 65 years and older with a claim for age-related macular degeneration between Jan. 1, 2005, and Dec. 31, 2006.

Disclosures: Dr. Curtis received research support from a number of pharmaceutical and biotechnology companies, including Novartis Pharmaceuticals, LTD (manufacturer of ranibizumab), Genentech (manufacturer of bevacizumab), and Pfizer (manufacturer of pegaptanib). Another investigator serves as an investigator/collaborator for clinical research, grantee, and paid scientific adviser/consultant for Genentech and Eyetech. This study was supported by a research agreement between OSI Eyetech and Duke University.

Most Alcohol Abuse After Disaster Is Pre-Existing


Most alcohol use problems following a disaster represent preexisting problems rather than new disorders, according to a meta-analysis of 10 recent U.S. disasters.

Although studies have shown a high prevalence of problem drinking and alcohol use disorders following disasters, they have not determined whether a causal relationship exists.

Dr. Carol S. North, of the VA North Texas Health Care System, and her associates analyzed a large database of survivors of 10 different disasters to examine the relationship between pre- and post-disaster prevalence of alcohol use disorders. Their findings were published online Oct. 4 in Archives of General Psychiatry.

Of 811 participants in the index sample, 697 (86%) provided complete pre- and post-disaster alcohol data. Of the respondents, most were white (92%), and more than half (57%) were female. Mean age at the time of the disaster was 46 years, with 28% of patients between ages 18 and 35. More than one-third of subjects (38%) were injured during the disaster, with 20% diagnosed with a disaster-related post-traumatic stress disorder (Arch. Gen. Psychiatry 2010 Oct. 4 [doi:10.1001/archgenpsychiatry.2010.131]).

Researchers used the Diagnostic Interview Schedule for DSM-III-R to determine lifetime diagnoses of alcohol abuse and dependence, and onset and recency questions to determine whether subjects had an alcohol use disorder before the disaster, after it, or both.

The prevalence of an alcohol use disorder (alcohol abuse/dependence) was 25% before the disaster and 19% afterward, wrote Dr. North, also of the University of Texas Southwestern, Dallas, and her colleagues.

Of the 567 individuals without post-disaster alcohol abuse/dependence at the start of the study, just 3% (20) developed an alcohol use disorder during the follow-up period. Twelve of these were new cases, for a 2% incidence. The rate of onset of new alcohol use disorders over the next 2 years (0.08 new cases per month) was the same as the post-disaster rate.

Among those with a pre-disaster alcohol use disorder, 83% consumed alcohol after the disaster, and 22% coped with their emotions by drinking.

“Despite evidence from other studies that alcohol use may increase after disasters, the findings from this study suggest that this increase in use may not regularly translate into new onset of post-disaster alcohol use disorders,” according to Dr. North and her colleagues.

“The distinction between alcoholic relapse and continuing or new alcohol problems is important, because people who are in recovery from alcoholism when a disaster strikes may be especially vulnerable to relapse when exposed to highly stressful events, thus constituting a population deserving of particular attention in the post-disaster period,” they added.

Disclosures: The study was funded by grants from the National Institute of Mental Health and the Center for Substance Abuse Prevention of the Substance Abuse and Mental Health Services Administration. Dr. North also disclosed grants from other federal health agencies, the American Psychiatric Association, and her university.

Vitals

Major Finding: Most post-disaster alcohol abuse disorders are a continuation or recurrence of preexisting problems.

Data Source: Ten disaster studies, including follow-up in the first few months and at 1-3 years later.

Disclosures: The study was funded by grants from the National Institute of Mental Health and the Center for Substance Abuse Prevention of the Substance Abuse and Mental Health Services Administration. Dr. North disclosed grants from other federal health agencies, the American Psychiatric Association, and her university.

Genetic Risk for Psychosis Expressed as Cannabis Sensitivity


Genetic risk for psychotic disorder might be expressed in part as sensitivity to the psychotomimetic effect of cannabis. Moreover, cannabis use combined with this preexisting risk might cause positive and negative symptoms of psychosis, according to new research published online Oct. 4.

Previous studies have suggested that exposure to delta-9-tetrahydrocannabinol, the main psychotropic component of Cannabis sativa, induces psychotic symptoms in a substantial proportion of healthy controls. Also, prospective epidemiological studies indicate that cannabis use not only predicts onset of psychotic disorder but also is associated with subthreshold expression of psychosis either in the form of schizotypy or subclinical psychotic experiences.

Data for this study come from the Genetic Risk and Outcome in Psychosis (GROUP) study, an ongoing longitudinal investigation in selected areas of the Netherlands and Belgium (Arch. Gen. Psychiatry 2010 Oct. 4 [doi:10.1001/archgenpsychiatry.2010.132]). The GROUP sample consists of 1,120 patients with nonaffective psychotic disorder, 1,057 siblings of these patients, 919 parents of patients and their siblings, and 590 unrelated controls.

Researchers used urinalysis to measure current substance abuse, sections of the Composite International Diagnostic Interview to assess long-term substance abuse, and an interview-based measure of schizotypy for the sibling–healthy control comparison. They performed sibling-control and cross-sibling comparisons using samples of patients with a psychotic disorder and community controls.

The main outcome measures were positive and negative schizotypy using the Structured Interview for Schizotypy–Revised for siblings and controls and self-reported positive and negative psychotic experiences using the Community Assessment of Psychic Experiences (CAPE) for siblings and patients.

Patients and their siblings more often used cannabis than did control subjects, and more often were male and nonwhite, the researchers found. Additional results showed that:

• Siblings of patients displayed more than 15 times greater sensitivity to the positive schizotypy associated with cannabis use, particularly current use documented by urinalysis, and a similar difference in sensitivity to its effect on negative schizotypy.

• Siblings exposed to cannabis resembled their patient relative nearly 10 times more closely in the positive psychotic dimension of the CAPE than did nonexposed siblings.

• No significant effect was apparent for the negative domain of the CAPE, although the association was directionally similar (twice the resemblance; P for interaction = .17).

• Cross-sibling, cross-trait analyses suggested that the mechanism underlying these findings was moderation (familial risk increasing sensitivity to cannabis) rather than mediation (familial risk increasing use of cannabis).

“An important issue revealed by this study is that while the relative effect sizes of differential sensitivity were high, absolute effect sizes, for example, of cannabis on schizotypy in unaffected siblings, were small,” the researchers wrote. “It therefore follows that any study examining differential sensitivity will require a very large sample to demonstrate differences in sensitivity for an environmental risk factor between groups.”

Disclosures: The authors had no financial disclosures. The infrastructure for the GROUP study received funding from the Geestkracht program of the Netherlands Organisation for Health Research and Development, and from numerous universities and mental health care organizations in the Netherlands and Belgium. The analyses were supported by unrestricted grants from Janssen-Cilag, Eli Lilly and Co., AstraZeneca, and Lundbeck.

Author and Disclosure Information

Publications
Topics
Legacy Keywords
Genetic risk, psychotic disorder, psychotomimetic effect, cannabis, marijuana, psychosis, Genetic Risk and Outcome in Psychosis
Author and Disclosure Information

Author and Disclosure Information

Genetic risk for psychotic disorder might be expressed in part as sensitivity to the psychotomimetic effect of cannabis. And, cannabis use, combined with this preexisting risk, might cause positive and negative symptoms of psychosis, according to new research published online Oct. 4.

© Ron Hilton/iStockphoto.com
    The tendency to develop psychotic experiences after cannabis use may be tied to a genetic risk for psychosis. 

Previous studies have suggested that exposure to delta-9-tetrahydrocannabinol, the main psychotropic component of Cannabis sativa, induces psychotic symptoms in a substantial proportion of healthy controls. Also, prospective epidemiological studies indicate that cannabis use not only predicts onset of psychotic disorder but also is associated with subthreshold expression of psychosis either in the form of schizotypy or subclinical psychotic experiences.

Data for this study come from the Genetic Risk and Outcome in Psychosis (GROUP) trial, an ongoing longitudinal study in selected areas of the Netherlands and Belgium (Arch. Gen. Psychiatry 2010 Oct. 4 [doi:10.1001/archgenpsychiatry.2010.132]). The GROUP sample consists of 1,120 patients with nonaffective psychotic disorder, 1,057 siblings of these patients, 919 parents of patients and their siblings, and 590 unrelated controls.

Researchers used urinalysis to measure current substance abuse, sections of the Composite International Diagnostic Interview to assess long-term substance abuse, and an interview-based measure of schizotypy for the sibling–healthy control comparison. They performed sibling-control and cross-sibling comparisons using samples of patients with a psychotic disorder and community controls.

The main outcome measures were positive and negative schizotypy using the Structured Interview for Schizotypy–Revised for siblings and controls and self-reported positive and negative psychotic experiences using the Community Assessment of Psychic Experiences (CAPE) for siblings and patients.

Patients and their siblings more often used cannabis than did control subjects, and more often were male and nonwhite, the researchers found. Additional results showed that:

• Siblings of patients displayed more than 15 times greater sensitivity to positive schizotypy associated with particularly current cannabis use by urinalysis, and a similar difference in sensitivity to its effect on negative schizotypy.

• Siblings exposed to cannabis resembled their patient relative nearly 10 times more closely in the positive psychotic dimension of CAPE vs. nonexposed siblings.

• No significant effect was apparent for the negative domain of CAPE, although the association was directionally similar (two times more resemblance; P interaction = .17).

• Cross-sibling, cross-trait analyses suggested that the mechanism underlying these findings was moderation (familial risk increasing sensitivity to cannabis) rather than mediation (familial risk increasing use of cannabis).

“An important issue revealed by this study is that while the relative effect sizes of differential sensitivity were high, absolute effect sizes, for example, of cannabis on schizotypy in unaffected siblings, were small,” the researchers wrote. “It therefore follows that any study examining differential sensitivity will require a very large sample to demonstrate differences in sensitivity for an environmental risk factor between groups.”

Disclosures: The authors had no financial disclosures. The infrastructure for the GROUP study received funding from the Geestkracht program of the Netherlands Organisation for Health Research and Development, and from numerous universities and mental health care organizations in the Netherlands and Belgium. The analyses were supported by unrestricted grants from Janssen-Cilag, Eli Lilly and Co., AstraZeneca, and Lundbeck.

Genetic risk for psychotic disorder might be expressed in part as sensitivity to the psychotomimetic effect of cannabis. And, cannabis use, combined with this preexisting risk, might cause positive and negative symptoms of psychosis, according to new research published online Oct. 4.

© Ron Hilton/iStockphoto.com
    The tendency to develop psychotic experiences after cannabis use may be tied to a genetic risk for psychosis. 

Previous studies have suggested that exposure to delta-9-tetrahydrocannabinol, the main psychotropic component of Cannabis sativa, induces psychotic symptoms in a substantial proportion of healthy controls. Also, prospective epidemiological studies indicate that cannabis use not only predicts onset of psychotic disorder but also is associated with subthreshold expression of psychosis either in the form of schizotypy or subclinical psychotic experiences.

Data for this study come from the Genetic Risk and Outcome in Psychosis (GROUP) study, an ongoing longitudinal investigation in selected areas of the Netherlands and Belgium (Arch. Gen. Psychiatry 2010 Oct. 4 [doi:10.1001/archgenpsychiatry.2010.132]). The GROUP sample consists of 1,120 patients with nonaffective psychotic disorder, 1,057 siblings of these patients, 919 parents of patients and their siblings, and 590 unrelated controls.

Researchers used urinalysis to measure current substance abuse, sections of the Composite International Diagnostic Interview to assess long-term substance abuse, and an interview-based measure of schizotypy for the sibling–healthy control comparison. They performed sibling-control and cross-sibling comparisons using samples of patients with a psychotic disorder and community controls.

The main outcome measures were positive and negative schizotypy using the Structured Interview for Schizotypy–Revised for siblings and controls and self-reported positive and negative psychotic experiences using the Community Assessment of Psychic Experiences (CAPE) for siblings and patients.

Patients and their siblings more often used cannabis than did control subjects, and more often were male and nonwhite, the researchers found. Additional results showed that:

• Siblings of patients displayed more than 15 times greater sensitivity than controls to the effect of current cannabis use (measured by urinalysis) on positive schizotypy, and a similar difference in sensitivity to its effect on negative schizotypy.

• Siblings exposed to cannabis resembled their patient relatives nearly 10 times more closely on the positive psychotic dimension of the CAPE than did nonexposed siblings.

• No significant effect was apparent for the negative domain of the CAPE, although the association was directionally similar (twice the resemblance; P for interaction = .17).

• Cross-sibling, cross-trait analyses suggested that the mechanism underlying these findings was moderation (familial risk increasing sensitivity to cannabis) rather than mediation (familial risk increasing use of cannabis); the sketch below illustrates the distinction.
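
In regression terms, moderation shows up as an interaction between familial risk and cannabis use in predicting schizotypy, whereas mediation would instead require familial risk to predict cannabis use itself. The following is a minimal sketch of that distinction using simulated data; the variable names and the statsmodels-based models are illustrative assumptions, not the GROUP investigators' actual analysis.

```python
# Minimal sketch (not the authors' code): moderation vs. mediation with
# simulated data. The variables familial_risk, cannabis, and schizotypy are
# illustrative stand-ins, not the GROUP study's actual measures.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
familial_risk = rng.integers(0, 2, n)          # 1 = sibling of a patient, 0 = control
cannabis = rng.integers(0, 2, n)               # 1 = current cannabis use
# Moderation: cannabis has a larger effect on schizotypy when familial risk is present.
schizotypy = (0.2 * cannabis + 0.1 * familial_risk
              + 0.6 * cannabis * familial_risk + rng.normal(0, 1, n))
df = pd.DataFrame(dict(familial_risk=familial_risk, cannabis=cannabis,
                       schizotypy=schizotypy))

# Moderation is read off the interaction term of a linear model.
moderation = smf.ols("schizotypy ~ cannabis * familial_risk", data=df).fit()
print(moderation.params["cannabis:familial_risk"])

# Mediation would instead require familial risk to predict cannabis use itself.
mediation_step = smf.logit("cannabis ~ familial_risk", data=df).fit(disp=0)
print(mediation_step.params["familial_risk"])
```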

“An important issue revealed by this study is that while the relative effect sizes of differential sensitivity were high, absolute effect sizes, for example, of cannabis on schizotypy in unaffected siblings, were small,” the researchers wrote. “It therefore follows that any study examining differential sensitivity will require a very large sample to demonstrate differences in sensitivity for an environmental risk factor between groups.”

Disclosures: The authors had no financial disclosures. The infrastructure for the GROUP study received funding from the Geestkracht program of the Netherlands Organisation for Health Research and Development, and from numerous universities and mental health care organizations in the Netherlands and Belgium. The analyses were supported by unrestricted grants from Janssen-Cilag, Eli Lilly and Co., AstraZeneca, and Lundbeck.

Article Source

FROM ARCHIVES OF GENERAL PSYCHIATRY

Vitals

Major Finding: Familial liability to psychosis appears to be partly expressed as a tendency to develop psychotic experiences after cannabis use.

Data Source: The Genetic Risk and Outcome in Psychosis (GROUP) Study, an ongoing longitudinal study in the Netherlands and Belgium.

Disclosures: The authors had no financial disclosures. The infrastructure for the GROUP study received funding from the Geestkracht program of the Netherlands Organisation for Health Research and Development, and from numerous universities and mental health care organizations in the Netherlands and Belgium. The analyses were supported by unrestricted grants from Janssen-Cilag, Eli Lilly and Co., AstraZeneca, and Lundbeck.

Transcranial Magnetic Stimulation Affects Hand Choice

Article Type
Changed
Thu, 12/06/2018 - 20:11
Display Headline
Transcranial Magnetic Stimulation Affects Hand Choice

The choice of which hand to use to perform an action might seem simple enough, but a competitive decision-making process in the brain, most notably the posterior parietal cortex, is behind that choice, new research in the Proceedings of the National Academy of Sciences shows.

Several studies have shown that the posterior parietal cortex (PPC) has a critical role in planning reaching movements, but few have examined the underlying neural mechanisms behind the decision of which hand to use to perform an action.

Flavio T.P. Oliveira of the University of California, Berkeley, and colleagues investigated how the brain mediates the decision about which hand to use for a manual action (PNAS 2010 [Epub ahead of print]; doi:10.1073/pnas.1006223107).

In the first of three experiments, the researchers instructed 13 right-handed study participants to reach for targets as quickly and accurately as possible. Researchers alternated between requiring participants to use their right hand only, their left hand only, or either hand. The researchers estimated the point of subjective equality (PSE), the target location at which subjects would be equally likely to use either hand; identified targets for which there was little or much uncertainty about hand choice; and measured reaction time (RT), defined as the time from the onset of the target to the time the participant moved the cursor outside the starting circle.
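
As a rough illustration of what estimating a PSE involves, the sketch below fits a logistic curve to hypothetical hand-choice proportions across target locations and reads off the location where right-hand use crosses 50%. The target angles, proportions, and curve-fitting approach are assumptions for demonstration only, not the authors' method.

```python
# Minimal sketch (not the authors' analysis): estimating the point of subjective
# equality (PSE) by fitting a logistic curve to the proportion of right-hand
# reaches at each target location. The angles and choice proportions are made up.
import numpy as np
from scipy.optimize import curve_fit

target_angle = np.linspace(-40, 40, 9)              # degrees from midline
p_right = np.array([0.02, 0.05, 0.15, 0.35, 0.55,   # observed proportion of
                    0.75, 0.90, 0.97, 0.99])        # right-hand choices

def logistic(x, pse, slope):
    # Probability of choosing the right hand as a function of target angle.
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

(pse, slope), _ = curve_fit(logistic, target_angle, p_right, p0=(0.0, 5.0))
print(f"Estimated PSE: {pse:.1f} deg (right-hand use = 50% at this location)")
```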

Overall reaction times were faster when subjects were told which hand to use rather than allowed to choose. When allowed to choose, RTs were significantly longer for the targets around the PSE than for those targets in extreme locations. “The increase in RT suggests a cost associated with a competition between the action plans for each hand at locations where ambiguity in hand choice is maximal,” the researchers said.

In the second experiment, they used single-pulse transcranial magnetic stimulation (TMS) to temporarily disrupt brain activity in the left and right PPC in 10 right-handed subjects. Again, RTs were slower for targets around the PSE (400 ms) vs. those in extreme locations (388 ms). TMS led to marginally reliable increases in RT (392 ms for the left PPC, 394 ms for the right) vs. no TMS (386 ms).

Also, TMS to the left PPC alone led to an increase in the use of the hand ipsilateral to the stimulation site. Participants had a 4% increase in left-hand use after left-PPC stimulation, compared with the no-TMS condition, and a 5.2% increase relative to right-PPC stimulation.

In a follow-up control experiment involving 10 right-handed subjects, the researchers applied TMS to the left and right PPC and found no significant changes in hand preference.

These experiments suggest that hand choice involves a process that “resolves a competition arising from the parallel engagement of action plans for both hands,” the researchers said. “Serial models in which hand choice is made at a higher cognitive level without activation of action plans for both hands might have predicted an increase in RT with TMS but cannot account for the shift in hand use. Rather, the results indicate that motor planning is initiated before response selection is made.”

The study also shows that the PPC is proactively involved in deciding which hand an individual will use for a manual reach. “Although it is likely that a broad network of cortical and subcortical areas are involved in different aspects of decision-making, the present results highlight the critical role that the PPC has in transforming sensory information into free choices of action,” the researchers said.

Disclosures: The authors had no conflicts to disclose. The Natural Sciences and Engineering Research Council of Canada, the Canadian Institutes of Health Research, the National Institutes of Health, the National Science Foundation, and the Belgian American Educational Foundation provided support for this study.

Author and Disclosure Information

Publications
Topics
Legacy Keywords
hand, decision-making process, brain, posterior parietal cortex, Proceedings of the National Academy of Sciences, Transcranial Magnetic Stimulation
Author and Disclosure Information

Author and Disclosure Information

The choice of which hand to use to perform an action might seem simple enough, but a competitive decision-making process in the brain, most notably the posterior parietal cortex, is behind that choice, new research in the Proceedings of the National Academy of Sciences shows.

Several studies have shown that the posterior parietal cortex (PPC) has a critical role in planning reaching movements, but few have examined the underlying neural mechanisms behind the decision of which hand to use to perform an action.

Flavio T.P. Oliveira of the University of California, Berkeley, and colleagues investigated how the brain mediates the decision about which hand to use for a manual action (PNAS 2010 [Epub ahead of print] doi: 10.1073/pnas.1006223107]).

In the first of three experiments, the researchers instructed 13 right-handed study participants to reach for targets as quickly and accurately as possible. Researchers alternated between requiring participants to use their right hand only, left hand only, or either hand. The researchers estimated the point of subjective equality (PSE), the point at which subjects would be equally likely to use either hand; identified targets in which there was little or much uncertainty; and measured reaction time (RT), defined as the time from the onset of the target to the time they moved the cursor outside the starting circle.

Overall reaction times were faster when subjects were told which hand to use rather than allowed to choose. When allowed to choose, RTs were significantly longer for the targets around the PSE than for those targets in extreme locations. “The increase in RT suggests a cost associated with a competition between the action plans for each hand at locations where ambiguity in hand choice is maximal,” the researchers said.

In the second experiment, they used single-pulse transcranial magnetic stimulation (TMS) to temporarily disrupt brain activity in the left and right PPC in 10 right-handed subjects. Again, RTs were slower for targets around the PSE (400 ms) vs. those in extreme locations (388 ms). TMS led to marginally reliable increases in RT (392 ms for the left PPC, 394 ms for the right) vs. no TMS (386 ms).

Also, TMS to the left PPC alone led to an increase in the use of the hand ipsilateral to the stimulation site. Participants had a 4% increase in left-hand use after left-PPC stimulation, compared with the no-TMS condition, and a 5.2% increase relative to right-PPC stimulation.

In a follow-up control experiment involving 10 right-handed subjects, the researchers applied TMS to the left and right PPC and found no significant changes in hand preference.

These experiments suggest that hand choice involves a process that “resolves a competition arising from the parallel engagement of action plans for both hands,” the researchers said. “Serial models in which hand choice is made at a higher cognitive level without activation of action plans for both hands might have predicted an increase in RT with TMS but cannot account for the shift in hand use. Rather, the results indicate that motor planning is initiated before response selection is made.”

The study also shows that the PPC is proactively involved in deciding which hand an individual will use for a manual reach. “Although it is likely that a broad network of cortical and subcortical areas are involved in different aspects of decision-making, the present results highlight the critical role that the PPC has in transforming sensory information into free choices of action,” the researchers said.

Disclosures: The authors had no conflicts to disclose. The Natural Sciences and Engineering Research Council of Canada, the Canadian Institutes of Health Research, the National Institutes of Health, the National Science Foundation, and the Belgian American Educational Foundation provided support for this study.

The choice of which hand to use to perform an action might seem simple enough, but a competitive decision-making process in the brain, most notably the posterior parietal cortex, is behind that choice, new research in the Proceedings of the National Academy of Sciences shows.

Several studies have shown that the posterior parietal cortex (PPC) has a critical role in planning reaching movements, but few have examined the underlying neural mechanisms behind the decision of which hand to use to perform an action.

Flavio T.P. Oliveira of the University of California, Berkeley, and colleagues investigated how the brain mediates the decision about which hand to use for a manual action (PNAS 2010 [Epub ahead of print] doi: 10.1073/pnas.1006223107]).

In the first of three experiments, the researchers instructed 13 right-handed study participants to reach for targets as quickly and accurately as possible. Researchers alternated between requiring participants to use their right hand only, left hand only, or either hand. The researchers estimated the point of subjective equality (PSE), the point at which subjects would be equally likely to use either hand; identified targets in which there was little or much uncertainty; and measured reaction time (RT), defined as the time from the onset of the target to the time they moved the cursor outside the starting circle.

Overall reaction times were faster when subjects were told which hand to use rather than allowed to choose. When allowed to choose, RTs were significantly longer for the targets around the PSE than for those targets in extreme locations. “The increase in RT suggests a cost associated with a competition between the action plans for each hand at locations where ambiguity in hand choice is maximal,” the researchers said.

In the second experiment, they used single-pulse transcranial magnetic stimulation (TMS) to temporarily disrupt brain activity in the left and right PPC in 10 right-handed subjects. Again, RTs were slower for targets around the PSE (400 ms) vs. those in extreme locations (388 ms). TMS led to marginally reliable increases in RT (392 ms for the left PPC, 394 ms for the right) vs. no TMS (386 ms).

Also, TMS to the left PPC alone led to an increase in the use of the hand ipsilateral to the stimulation site. Participants had a 4% increase in left-hand use after left-PPC stimulation, compared with the no-TMS condition, and a 5.2% increase relative to right-PPC stimulation.

In a follow-up control experiment involving 10 right-handed subjects, the researchers applied TMS to the left and right PPC and found no significant changes in hand preference.

These experiments suggest that hand choice involves a process that “resolves a competition arising from the parallel engagement of action plans for both hands,” the researchers said. “Serial models in which hand choice is made at a higher cognitive level without activation of action plans for both hands might have predicted an increase in RT with TMS but cannot account for the shift in hand use. Rather, the results indicate that motor planning is initiated before response selection is made.”

The study also shows that the PPC is proactively involved in deciding which hand an individual will use for a manual reach. “Although it is likely that a broad network of cortical and subcortical areas are involved in different aspects of decision-making, the present results highlight the critical role that the PPC has in transforming sensory information into free choices of action,” the researchers said.

Disclosures: The authors had no conflicts to disclose. The Natural Sciences and Engineering Research Council of Canada, the Canadian Institutes of Health Research, the National Institutes of Health, the National Science Foundation, and the Belgian American Educational Foundation provided support for this study.

Publications
Publications
Topics
Article Type
Display Headline
Transcranial Magnetic Stimulation Affects Hand Choice
Display Headline
Transcranial Magnetic Stimulation Affects Hand Choice
Legacy Keywords
hand, decision-making process, brain, posterior parietal cortex, Proceedings of the National Academy of Sciences, Transcranial Magnetic Stimulation
Legacy Keywords
hand, decision-making process, brain, posterior parietal cortex, Proceedings of the National Academy of Sciences, Transcranial Magnetic Stimulation
Article Source

PURLs Copyright

Inside the Article

Vitals

Major Finding: Hand choice entails a competitive decision process between simultaneously activated action plans for each hand. Also, single-pulse transcranial magnetic stimulation to the left posterior parietal cortex biases this competitive process, leading to an increase in ipsilateral (left-hand) reaches.

Data Source: Two experiments of 13 and 10 participants, plus a follow-up control study involving 10 participants.

Disclosures: The authors had no conflicts to disclose. The Natural Sciences and Engineering Research Council of Canada, the Canadian Institutes of Health Research, the National Institutes of Health, the National Science Foundation, and the Belgian American Educational Foundation provided support for this study.

Anti-VEGF Drug May Improve Choroidal Neovascularization in Multifocal Choroiditis

Article Type
Changed
Thu, 12/06/2018 - 20:10
Display Headline
Anti-VEGF Drug May Improve Choroidal Neovascularization in Multifocal Choroiditis

Intravitreal bevacizumab appears to be more effective than photodynamic therapy at restoring functional vision in patients with subfoveal choroidal neovascularization secondary to multifocal choroiditis, according to new research in the September issue of Archives of Ophthalmology.

Previous studies have reported successful treatment of subfoveal choroidal neovascularization (CNV) secondary to different disorders with bevacizumab, an anti–vascular endothelial growth factor (VEGF) agent. Other studies have shown that photodynamic therapy (PDT) can help stabilize but not improve vision.

“The most frequent cause of severe visual acuity deterioration in patients affected by multifocal choroiditis is CNV, occurring in as many as one-third of patients, but currently there is no general consensus, to our knowledge, about the most appropriate treatment,” say researchers led by Dr. Maurizio Battaglia Parodi of the Scientific Institute of the University Vita-Salute San Raffaele in Milan, Italy.

For that reason, Dr. Parodi and colleagues conducted a prospective pilot randomized clinical trial among patients treated with PDT or intravitreal bevacizumab injection during a 1-year follow-up period between March 15, 2005, and April 30, 2009 (Arch. Ophthalmol. 2010;128:1100-3). They enrolled 27 patients (27 eyes) with subfoveal CNV secondary to multifocal choroiditis and randomized 13 patients to receive PDT and 14 to receive bevacizumab. Subjects consisted of 18 women and 9 men aged 22-57 years (mean age, 39 years). Patients in the bevacizumab group underwent a mean of 1.7 of 4 possible treatments, while those in the PDT group underwent a mean of 3.8 treatments.

At 12 months, 5 of 14 eyes (36%) in the bevacizumab-treated group gained more than three lines of best-corrected visual acuity (BCVA) vs. none of the PDT-treated eyes. Twelve eyes (86%) in the bevacizumab group and 6 eyes (46%) in the PDT group gained more than one line of BCVA, the researchers found. Mean BCVA (logMAR) went from 0.45 (0.2) to 0.47 (0.2) in the PDT group and from 0.48 (0.2) to 0.30 (0.2) in the bevacizumab group. Meanwhile, central macular thickness progressively declined from 299 μm at baseline to 256 μm at 12 months in the PDT group and from 289 μm to 234 μm in the bevacizumab group, a statistically significant reduction.
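
For readers less familiar with the logMAR scale, the relationship between the reported logMAR values and "lines" of BCVA can be made concrete with a little arithmetic: one chart line corresponds to roughly 0.1 logMAR, and an approximate Snellen denominator is 20 × 10^logMAR. The sketch below applies that to the mean values above; it is illustrative only and not part of the study.

```python
# Illustrative sketch (not from the paper): relating a change in logMAR BCVA to
# "lines" gained on an ETDRS-style chart (one line ~ 0.1 logMAR) and to an
# approximate Snellen equivalent.
def lines_gained(logmar_before: float, logmar_after: float) -> float:
    # Lower logMAR is better vision; each 0.1 decrease is roughly one line gained.
    return (logmar_before - logmar_after) / 0.1

def approx_snellen(logmar: float) -> str:
    # Snellen denominator for a 20-foot chart: 20 * 10**logMAR.
    return f"20/{20 * 10 ** logmar:.0f}"

# Mean change reported for the bevacizumab group: 0.48 -> 0.30 logMAR.
print(lines_gained(0.48, 0.30))                           # about 1.8 lines gained
print(approx_snellen(0.48), "->", approx_snellen(0.30))   # roughly 20/60 -> 20/40
```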

There were no reports of complications, signs of multifocal choroiditis activity, or new lesions during follow-up, the researchers said.

Although other studies suggest that PDT may help stabilize visual acuity and lead to limited functional improvement, results of this study suggest that bevacizumab “is superior to PDT in improving the mean BCVA during a 12-month follow-up,” the researchers say.

Limitations of this study include a small sample size, limited follow-up of 1 year, and no sham-treated control group for masking. “Nevertheless,” the researchers wrote, multifocal choroiditis “is a rather uncommon entity, which makes the planning of a randomized clinical trial difficult.”

Bevacizumab is approved in the United States only for the treatment of several metastatic cancers and has been used off label to treat age-related macular degeneration.

Further studies with longer follow-up are necessary, the researchers say.

The authors reported no financial conflicts of interest.

Vitals

Major Finding: Intravitreal bevacizumab was more likely than photodynamic therapy to improve visual acuity and reduce central macular thickness in patients with subfoveal choroidal neovascularization secondary to multifocal choroiditis.

Data Source: A prospective pilot randomized clinical trial of 27 patients (27 eyes) followed for 1 year.

Disclosures: The authors reported no conflicts.

Prophylactic Antivirals May Decrease Ocular Herpes Recurrence

Article Type
Changed
Fri, 01/11/2019 - 10:49
Display Headline
Prophylactic Antivirals May Decrease Ocular Herpes Recurrence

Prophylactic treatment with oral antiviral agents appeared to dramatically decrease the recurrence of herpes simplex virus eye disease even while incidence remained stable, according to a long-term study of residents of one Minnesota county.

Using the Rochester Epidemiology Project (REP), Mayo Clinic researchers retrieved 694 records that contained diagnostic codes related to herpetic eye disease in residents of Olmsted County, Minn., between 1976 and 2007, Ryan C. Young and his colleagues reported. They found 394 confirmed cases (181 men, 213 women), with a mean age of onset of 43 years. Mean follow-up was 7.7 years.

After researchers eliminated nine patients identified as not county residents and adjusted for age and sex, they determined the incidence of new cases of herpes simplex virus (HSV) keratitis to be 9.2/100,000 residents per year during the study period and the incidence of ocular HSV to be 11.8/100,000 per year (Arch. Ophthalmol. 2010;128:1178-83). The incidence increased with age, peaking at 28/100,000 people in the ninth decade of life, the researchers wrote.
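
As a back-of-the-envelope illustration of how a rate of this form is expressed, the snippet below computes a crude incidence per 100,000 person-years from a case count and person-years of residence. The numbers are hypothetical, and the study's figures are additionally adjusted for age and sex.

```python
# Hypothetical arithmetic only: crude incidence rate per 100,000 person-years.
new_keratitis_cases = 120          # hypothetical count over the study period
person_years = 1_300_000           # hypothetical person-years of county residence
rate_per_100k = new_keratitis_cases / person_years * 100_000
print(f"{rate_per_100k:.1f} new HSV keratitis cases per 100,000 per year")
```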

In all, 169 patients experienced recurrence of ocular HSV, with an estimated likelihood of recurrence of 27% at 1 year after the initial episode, 50% at 5 years, 57% at 10 years, and 63% at 20 years, the researchers found.

Of those patients, 108 had a second recurrence, for a rate of 38% at 1 year, 67% at 5 years, 78% at 10 years, and 83% at 20 years. And 65 patients experienced a third recurrence, for a rate of 29% at 1 year, 65% at 5 years, 78% at 10 years, and 82% at 20 years.
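
Cumulative recurrence probabilities like these are typically obtained from time-to-event (Kaplan-Meier) estimates, which account for patients whose follow-up ended before any recurrence. The sketch below shows the general idea with simulated recurrence times; it assumes the lifelines package and is not the study's actual analysis.

```python
# Minimal sketch (not the study's code): Kaplan-Meier estimation of cumulative
# recurrence probabilities at 1, 5, 10, and 20 years from censored, simulated data.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 394
time_to_first_recurrence = rng.exponential(9.0, n)   # years, simulated
followup = rng.uniform(0.5, 25.0, n)                 # years of observation
observed = time_to_first_recurrence <= followup
duration = np.minimum(time_to_first_recurrence, followup)

kmf = KaplanMeierFitter().fit(duration, event_observed=observed)
for year in (1, 5, 10, 20):
    # Cumulative probability of at least one recurrence by this time point.
    recurrence_prob = 1.0 - kmf.survival_function_at_times(year).iloc[0]
    print(f"{year:>2} years: {recurrence_prob:.0%}")
```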

Of 175 patients (44%) who received oral antiviral therapy for a mean period of 2.8 years, 4 experienced a first recurrence, 10 developed an epithelial recurrence, 20 had a stromal recurrence, and 3 developed blepharitis or conjunctivitis.

Oral antiviral prophylaxis decreased the relative risk of a first recurrence by a factor of 2.9, according to Cox proportional hazards models. Without prophylaxis, patients were 9.4 times more likely to have a recurrence of epithelial keratitis, 8.4 times more likely to have a recurrence of stromal keratitis, and 34.5 times more likely to have a recurrence of blepharitis or conjunctivitis.
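
To make the hazard-ratio language concrete, the sketch below fits a Cox proportional hazards model to simulated recurrence data with a single prophylaxis covariate and recovers a hazard ratio of about 1/2.9 for prophylaxis, i.e., risk reduced by a factor of roughly 2.9. The data, covariate name, and lifelines-based code are illustrative assumptions, not the Mayo Clinic analysis.

```python
# Minimal sketch (illustrative only): a Cox proportional hazards model yielding
# a relative risk of recurrence for a prophylaxis covariate (requires lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 400
prophylaxis = rng.integers(0, 2, n)
# Exponential recurrence times: hazard is lower when prophylaxis == 1.
baseline_hazard = 0.30
hazard = baseline_hazard * np.where(prophylaxis == 1, 1 / 2.9, 1.0)
time_to_recurrence = rng.exponential(1.0 / hazard)
followup = rng.uniform(1, 10, n)                      # administrative censoring
observed = (time_to_recurrence <= followup).astype(int)
duration = np.minimum(time_to_recurrence, followup)

df = pd.DataFrame(dict(duration=duration, recurrence=observed,
                       prophylaxis=prophylaxis))
cph = CoxPHFitter().fit(df, duration_col="duration", event_col="recurrence")
# exp(coefficient) is the hazard ratio; here roughly 1/2.9 for prophylaxis.
print(np.exp(cph.params_["prophylaxis"]))
```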

A total of 20 patients experienced adverse outcomes, including vision loss, corneal perforation, enucleation, glaucoma surgery, keratoplasty, and conjunctival flap. Of these, 17 were not receiving antiviral treatment at the last recurrence before the adverse event, and 14 had never been treated prophylactically.

"The results of this study suggest that oral antiviral prophylaxis should be considered for patients with frequent recurrences of corneal disease," the researchers wrote. "We recommend an evaluation of the possible barriers preventing compliance with antiviral prophylaxis and a reassessment of the cost effectiveness of long-term oral antiviral therapy."

Strengths of the study included the large number of subjects, long-term follow-up, and use of a community-based cohort, according to the researchers. Potential weaknesses include lack of comparison before and after the introduction of prophylactic oral therapy, reliance on the accuracy and completeness of patient records, and lack of laboratory confirmation in most HSV cases.

Article Source

From Archives of Ophthalmology

Vitals

Major Finding: Oral antiviral prophylaxis was associated with a decreased risk of recurrence of epithelial keratitis, stromal keratitis, conjunctivitis, and blepharitis due to herpes simplex virus.

Data Source: Community-based retrospective study of 644 patients with diagnostic codes consistent with ocular herpes simplex virus between 1976 and 2007.

Disclosures: The authors reported no financial disclosures. Research to Prevent Blindness Inc. and the Mayo Foundation provided unrestricted grants to support the study.

Prophylactic Antiviral Treatment May Decrease Ocular Herpes Recurrences

Article Type
Changed
Thu, 12/06/2018 - 20:08
Display Headline
Prophylactic Antiviral Treatment May Decrease Ocular Herpes Recurrences

Prophylactic treatment with oral antiviral agents appeared to dramatically decrease the recurrence of herpes simplex virus eye disease even while incidence remained stable, according to a long-term study of residents of one Minnesota county.

Using the Rochester Epidemiology Project (REP), Mayo Clinic researchers retrieved 694 records that contained diagnostic codes related to herpetic eye disease in residents of Olmsted County, Minn., between 1976 and 2007, Ryan C. Young and his colleagues reported. They found 394 confirmed cases (181 men, 213 women), with a mean age of onset of 43 years. Mean follow-up was 7.7 years.

After researchers eliminated nine patients identified as not county residents and adjusted for age and sex, they determined the incidence of new cases of herpes simplex virus (HSV) keratitis to be 9.2/100,000 resident per year during the study period and of ocular HSV to be 11.8/100,000 per year (Arch. Ophthalmol. 2010;128:1178-83). The incidence increased with age, with a peak at 28/100,000 people in the ninth decade of life, the researchers wrote.

In all, 169 patients experienced recurrence of ocular HSV, with an estimated likelihood of recurrence of 27% at 1 year after the initial episode, 50% at 5 years, 57% at 10 years, and 63% at 20 years, the researchers found.

Of those patients, 108 had a second recurrence, for a rate of 38% at 1 year, 67% at 5 years, 78% at 10 years, and 83% at 20 years. And 65 patients experienced a third recurrence, for a rate of 29% at 1 year, 65% at 5 years, 78% at 10 years, and 82% at 20 years.

Of 175 patients (44%) who received oral antiviral therapy for a mean period of 2.8 years, 4 experienced a first recurrence, 10 developed an epithelial recurrence, 20 had a stromal recurrence, and 3 developed blepharitis or conjunctivitis.

Oral antiviral prophylaxis decreased the relative risk of a first recurrence by a factor of 2.9, according to Cox proportional hazards models. Without prophylaxis, patients were 9.4 times more likely to have a recurrence of epithelial keratitis, 8.4 times more likely to have a recurrence of stromal keratitis, and 34.5 times more likely to have a recurrence of blepharitis or conjunctivitis.

A total of 20 patients experienced adverse outcomes, including vision loss, corneal perforation, enucleation, glaucoma surgery, keratoplasty, and conjunctival flap. Of these, 17 were not receiving antiviral treatment at the last recurrence before the adverse event, and 14 had never been treated prophylactically.

“The results of this study suggest that oral antiviral prophylaxis should be considered for patients with frequent recurrences of corneal disease,” the researchers wrote. “We recommend an evaluation of the possible barriers preventing compliance with antiviral prophylaxis and a reassessment of the cost effectiveness of long-term oral antiviral therapy.”

Strengths of the study included the large number of subjects, long-term follow-up, and use of a community-based cohort, according to the researchers. Potential weaknesses include lack of comparison before and after the introduction of prophylactic oral therapy, reliance on the accuracy and completeness of patient records, and lack of laboratory confirmation in most HSV cases.

Vitals

Major Finding: Oral antiviral prophylaxis was associated with a decreased risk of recurrence of epithelial keratitis, stromal keratitis, conjunctivitis, and blepharitis due to herpes simplex virus.

Data Source: Community-based retrospective study of 644 patients with diagnostic codes consistent with ocular herpes simplex virus between 1976 and 2007.

Disclosures: The authors reported no financial disclosures. Research to Prevent Blindness Inc. and the Mayo Foundation provided unrestricted grants to support the study.

Malay Study Finds Poor Diabetes Control

Article Type
Changed
Wed, 12/14/2016 - 10:29
Display Headline
Malay Study Finds Poor Diabetes Control

Only one in four diabetic patients participating in the Malay Eye Study achieved optimal glycemic control, and only one in eight achieved optimal blood pressure control; both proportions were even lower among individuals with diabetic retinopathy, according to a report in the September 2010 issue of Archives of Ophthalmology.

Olivia S. Huang, B.Sc., of the University of New South Wales in Kensington, Australia, and her colleagues measured glycemic and blood pressure control in a population-based sample of patients with diabetes and, specifically, in those patients with diabetic retinopathy.

Previous studies have predicted that the number of people with diabetes in Asia will increase from 240 million in 2007 to 380 million in 2025.

The cross-sectional study is based on a population of 3,280 Malay adults aged 40-80 years in Singapore, studied from 2004 to 2006. The researchers identified 768 patients with diabetes, defined as a non-fasting glucose level of 200 mg/dL or greater, use of diabetic medication, or a physician diagnosis. Nearly 1 in 10 had severe diabetic retinopathy.
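
To make that case definition concrete, here is a small sketch that applies the study's stated criteria to hypothetical records; the parameter names and example values are assumptions for illustration, not the study's data schema.

```python
# Hypothetical record fields; the Malay Eye Study's actual data schema is not described here.
def meets_diabetes_definition(glucose_mg_dl, on_diabetes_medication, physician_diagnosed):
    """Study definition: non-fasting glucose >= 200 mg/dL, use of diabetic
    medication, or physician diagnosis (any one criterion suffices)."""
    return (
        (glucose_mg_dl is not None and glucose_mg_dl >= 200)
        or on_diabetes_medication
        or physician_diagnosed
    )

# Illustrative examples only
print(meets_diabetes_definition(215, False, False))   # True: glucose criterion
print(meets_diabetes_definition(150, True,  False))   # True: medication criterion
print(meets_diabetes_definition(140, False, False))   # False: no criterion met
```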

Overall, the mean glycated hemoglobin (HbA1c) level was 8.0% (range of 4.5%-15.1%) with only 26.9% having an optimal HbA1c level, defined as less than 7%. This number dropped to 17.4% among participants with diabetic retinopathy.

HbA1c of greater than 8% was present in 49.1% of the overall sample, but rose to 61.9% of patients with diabetic retinopathy. Patients who had HbA1c levels greater than 8% were more likely to have diabetic retinopathy vs. those with an HbA1c lower than 8%.
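
The comparison of retinopathy by HbA1c stratum is essentially an association in a 2x2 table, and a crude odds ratio is one way to make the "more likely" statement quantitative. The counts in the sketch below are invented purely to show the calculation; they are not the study's cross-tabulation, and the published analysis also adjusted for covariates.

```python
# Crude odds ratio from a 2x2 table (counts below are invented for illustration).
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """OR = (a/b) / (c/d), here with exposure = HbA1c > 8% and outcome = retinopathy."""
    odds_exposed = exposed_cases / exposed_noncases
    odds_unexposed = unexposed_cases / unexposed_noncases
    return odds_exposed / odds_unexposed

# Hypothetical counts: retinopathy yes/no among patients with HbA1c > 8% vs. <= 8%
print(round(odds_ratio(130, 247, 80, 311), 2))  # > 1 suggests higher odds with HbA1c > 8%
```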

Factors associated with higher odds of suboptimal glycemic control included higher serum cholesterol levels, being previously undiagnosed with diabetes, being treated with oral hypoglycemic agents, and having diabetic retinopathy, the researchers found. Older age, however, was associated with decreased odds of suboptimal control.

Mean systolic and diastolic blood pressure levels were 154.6 mm Hg and 79.2 mm Hg, respectively. Optimal blood pressure, defined as 130/80 mm Hg or lower, was achieved by only 13.4% of the overall sample; among patients with diabetic retinopathy, that figure dropped to 10.3% (Arch. Ophthalmol. 2010;128:1185-90 [doi:10.1001/archophthalmol.2010.168]).
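
For reference, the study's two control targets reduce to simple threshold checks: optimal glycemic control is an HbA1c below 7%, and optimal blood pressure is 130/80 mm Hg or lower. The helper below encodes those definitions as an illustrative utility; the function names are not from the study.

```python
# Encoding the study's control targets as threshold checks (illustrative helper only).
def optimal_glycemic_control(hba1c_percent):
    """Optimal glycemic control per the study: HbA1c < 7%."""
    return hba1c_percent < 7.0

def optimal_blood_pressure(systolic_mm_hg, diastolic_mm_hg):
    """Optimal blood pressure per the study: 130/80 mm Hg or lower."""
    return systolic_mm_hg <= 130 and diastolic_mm_hg <= 80

# Examples using the reported cohort means (HbA1c 8.0%, BP 154.6/79.2 mm Hg)
print(optimal_glycemic_control(8.0))        # False
print(optimal_blood_pressure(154.6, 79.2))  # False
```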

Patients with systolic pressure greater than 150 mm Hg were more likely to have diabetic retinopathy than were those with 150 mm Hg or less. A total of 57.3% of the overall sample had optimal diastolic blood pressure levels.

Factors associated with higher odds of suboptimal blood pressure included older age, higher total serum cholesterol levels, higher body mass index, diabetic retinopathy, and posterior subcapsular cataract. Patients who had a previous acute myocardial infarction were less likely to have suboptimal blood pressure control.

Other socioeconomic factors, namely education and income, as well as other ocular and systemic factors, were not significantly associated with suboptimal blood pressure or glycemic control, the researchers said.

“Our findings present a challenge to health-care policy makers and professionals regarding effective implementation of diabetes care in Asia,” the researchers wrote. Specifically, strategies are needed to improve awareness and to implement evidence-based guidelines in order to reduce diabetic complications, they added.

Strengths of the study include the population-based sample, masked external grading of retinal photographs with higher proportions of gradable photographs, and standardized measurement of HbA1c and blood pressure, the investigators said. A limitation, they added, was a definition of diabetes based on random blood glucose levels instead of oral glucose tolerance tests.

The authors have no financial disclosures. The study was supported by grants from the National Medical Research Council and Biomedical Research Council and by the Ministry of Health, all in Singapore.
