Dexamethasone Implant Improved Uveitis: Close to half of eyes treated with the 0.7-mg implant had a vitreous haze score of 0.

Major Finding: A single dexamethasone intravitreal implant significantly improved intraocular inflammation and visual acuity in patients with noninfectious intermediate or posterior uveitis.

Data Source: A 26-week, prospective, multicenter, masked, randomized, parallel-group, sham-controlled clinical trial of 229 patients with noninfectious intermediate or posterior uveitis.

Disclosures: This study was sponsored by Allergan, which participated in the design of the study, analysis of the data, and interpretation. Allergan also supervised the preparation of the manuscript and approved the final version. Five authors of the study are employees of Allergan.

A single dose of an intravitreal dexamethasone implant produced significant improvements in intraocular inflammation and visual acuity that lasted 6 months in patients with noninfectious intermediate or posterior uveitis, according to research published online.

The dexamethasone (DEX) implant, Ozurdex (Allergan Inc.), is currently approved for treatment of macular edema associated with retinal vein occlusions. Dr. Careen Lowder of the Cole Eye Institute, Cleveland, and her colleagues in the Ozurdex HURON Study Group sought to evaluate the safety and efficacy of the DEX implant in the treatment of noninfectious intermediate and posterior uveitis.

She and her colleagues conducted a 26-week, prospective, multicenter, masked, parallel-group, sham-controlled clinical trial, in which they randomized 229 patients with a diagnosis of noninfectious intermediate or posterior uveitis to receive a sham procedure or treatment with a 0.7-mg or 0.35-mg DEX implant (Arch. Ophthalmol. 2011 Jan. 10 [doi:10.1001/archophthalmol.2010.339]).

The mean age of the patients was 45 years, more than 60% were women, and more than 60% were white. Of the 229 patients, 217 (95%) completed the 26-week study; 2 in the 0.7-mg group dropped out because of adverse events, and 1 in the 0.35-mg group discontinued because of lack of efficacy.

The primary outcome measure was the vitreous haze score at 8 weeks, as measured by a standardized photographic scale ranging from 0 (no inflammation) to 4 (optic nerve head not visible).

Patients in all groups had a mean vitreous haze score of +2 (moderate blurring of the optic nerve head) at baseline.

At the 8-week follow-up, a vitreous haze score of 0 was observed in 47% of eyes with the 0.7-mg implant, 36% of those with the 0.35-mg implant, and 12% of those that had the sham procedure, the investigators said.

There was no significant difference between the two treatment doses, and the benefit associated with the implant persisted through the 26-week study.

In addition, one or more anterior chamber cells were present in 14.5% of the 0.7-mg group and 20.3% of the 0.35-mg group, compared with 38.7% of the sham group. Two to six times as many eyes in the DEX implant groups as in the sham group gained 15 or more letters of best-corrected visual acuity, a significant difference.

“The results of the present study demonstrate that the DEX implant has a favorable safety profile and can effectively reduce inflammation and substantially improve vision in eyes with noninfectious intermediate or posterior uveitis,” the researchers stated.

The findings suggest that the DEX implant may be used to safely and effectively treat intermediate and posterior uveitis. “Typically, the most common adverse events associated with intravitreal corticosteroids, which may have impacted use in the past, include increases in intraocular pressure and cataract,” the researchers wrote. “On any given follow-up visit in the present study, substantial increases in intraocular pressure (to 25 mm Hg or greater) occurred in less than 10% of treated eyes.”

One limitation in the study, however, was that patients were treated with a single DEX implant and followed for only 6 months. This limits the ability to assess the risk of cataract. “Future studies will be needed to explore the long-term effects of repeated treatment with the DEX implant in patients with uveitis and to evaluate the potential of this therapy in other retinal disorders beyond retinal vein occlusion,” the researchers wrote.

Vessel Size May Predict Retinopathy Progression

Major Finding: Larger central retinal venular diameter is an independent and early indicator of progression to proliferative diabetic retinopathy with or without high-risk characteristics in black patients with type 1 diabetes mellitus.

Data Source: A 6-year follow-up evaluation of a cohort of 468 black patients from the New Jersey 725 study.

Disclosures: The authors had no financial disclosures. The research was supported by grants from the National Eye Institute and by a Lew Wasserman Merit Award from Research to Prevent Blindness Inc.

Central retinal venular diameter may help predict the progression of retinal disease in black patients with type 1 diabetes, especially those with less-severe baseline diabetic retinopathy, according to Dr. Monique S. Roy of the New Jersey Medical School, Newark, and her colleagues.

These results may provide another early clinical indicator of progression to more severe forms of the disease.

Dilation of the retinal venules has been associated with diabetic retinopathy (DR), but cross-sectional studies have not indicated whether these changes simply reflect disease severity or may help predict progression of DR.

The investigators examined 725 black patients with type 1 diabetes who participated in the New Jersey 725 study in 1993-1998, and reexamined a cohort of 468 of these patients 6 years later (Arch. Ophthalmol. 2011;129:8-15). The exams included seven-field retinal photographs that were evaluated in a masked fashion.

In the 468 follow-up patients, the mean central retinal arteriolar equivalent (CRAE) and central retinal venular equivalent (CRVE) were 168.8 mcm and 254.2 mcm, respectively. The study found no association between CRAE and incident DR outcomes, even after adjustment for other risk factors. However, a larger CRVE at baseline was associated with triple the risk of progression to proliferative diabetic retinopathy (PDR), or to PDR with high-risk characteristics, even when researchers adjusted for other risk factors.

“The relative dilation of the retinal veins seen in DR and other retinopathies associated with ischemia has been variously interpreted,” the researchers pointed out. “Wider retinal venules may reflect metabolic changes associated with [diabetes mellitus], such as increased lactic acidosis.”

Strengths of the study include its prospective design with high rates of follow-up for a large cohort of well-characterized black patients with type 1 diabetes, the use of standardized protocols to document potential confounding variables, the masked grading of DR using stereoscopic seven-field retinal photographs, and measurements of the retinal vascular diameters with a validated computerized program, Dr. Roy and her associates said.

However, they cautioned that measurement of retinal vessel diameter from color retinal photographs may underestimate the true vascular width because only the red blood cell column is being measured. Also, the increased retinal pigmentation present in black patients may lead to an overestimation of retinal vessel diameters.

“It remains to be seen whether such a measure may be used in the future to monitor treatments for [diabetes] and other vascular diseases that specifically target the microvasculature,” the researchers noted.

In an accompanying editorial, Dr. Tien Yin Wong of the Singapore Eye Research Institute stated that larger retinal venular diameter or a generalized venular dilatation may help predict the progression of diabetic retinopathy, even after accounting for other known risk factors (Arch. Ophthalmol. 2011;129:95-6).

Although the underlying mechanisms are unclear, retinal venous dilatation may indicate ocular ischemia, systemic inflammation, and/or endothelial dysfunction, added Dr. Wong, who had no financial disclosures.

Pulmonary Embolism Treatment Often Lags

Major Finding: Only 6 of 20 patients whose deaths were due to pulmonary embolism (PE) received anticoagulant or fibrinolytic therapy in the emergency department.

Data Source: A total of 1,880 patients with confirmed acute PE from 22 U.S. emergency departments included in the EMPEROR registry.

Disclosures: Several coauthors disclosed consulting relationships with and/or funding from a wide range of pharmaceutical companies that make anticoagulant agents.

Contrary to previous studies, only 1 in 100 patients with pulmonary embolism dies in emergency departments, but more standardized treatment, including earlier use of anticoagulants, is needed, according to the initial report of the EMPEROR study in the Feb. 8, 2011, issue of the Journal of the American College of Cardiology.

Of the 20 patients in this study whose deaths were due to PE, only 3 received anticoagulant treatment before a diagnosis was confirmed, even though previous studies have shown that delays in starting anticoagulants can be fatal. Only another 3 of the 20 patients received fibrinolytic therapy in the emergency department (ED), the Emergency Medicine Pulmonary Embolism in the Real World Registry (EMPEROR) researchers found.

Dr. Charles V. Pollack of Pennsylvania Hospital, Philadelphia, and colleagues created the national, prospective, multicenter, observational registry to establish more definitive signs and symptoms of PE, compare treatment strategies, and measure use of risk stratification methods, frequency of empiric anticoagulation use, and rates of major hemorrhage and death. Earlier registries of patients with PE have not provided specific data about race or ethnicity, risk stratification methods, or the frequency of anticoagulant use in the emergency department.

Between Jan. 1, 2005, and Dec. 29, 2008, the researchers enrolled 2,408 patients from 22 emergency departments, 1,880 of whom had confirmed PE (J. Am. Coll. Cardiol. 2011;57:700-6).

Systemic non–vitamin K–dependent anticoagulation was initiated in the ED in 1,593 (84%) patients, but only 173 (9%) received heparin of any type before the results of diagnostic imaging were available and 33 (1.7%) received a fibrinolytic agent.

While other registries show a mortality of 10%-15%, only 20 of the 1,880 patients (1%) in this study died as a direct result of the embolism, with 0.2% mortality from hemorrhage. At 30 days, patients whose acute PE was diagnosed in emergency departments had an all-cause mortality of 5.4%. The lower mortality in this report may reflect the fact that the registry included only outpatients who experienced PE and that this population was younger and less ill than those in other registries, the researchers said.

Of the 1,880 patients with confirmed PE, 1,654 (88%) were diagnosed based on CT pulmonary angiogram, 91 (5%) on formal pulmonary angiography, 82 (4%) on ventilation-perfusion scan, 51 (3%) on deep vein thrombosis with appropriate PE symptoms, and 2 (0.1%) on pulmonary MRI.

The mean age was 57 years, with one-third past age 65; 53% of the patients were women, and 68% were white. The racial and ethnic distribution of patients with PE closely parallels that of all patients who present to emergency departments.

The most common presenting signs and symptoms of PE were dyspnea at rest (50%), pleuritic chest pain (39%), dyspnea with exertion (27%), extremity swelling suggestive of deep vein thrombosis (24%), and syncope (5%).

The most common comorbidities that could represent potential risk factors for PE were hypertension (46%), obesity (27%), recent hospitalization (24%), and active malignancy (22%). Based on the absence of any predefined risk factors for PE, 312 (16.6%) of the 1,880 patients were considered to have idiopathic PE.

Most patients (79%) diagnosed with PE in the ED lived independently, while 11.6% reported generalized immobility.

The low rate of early treatment found by the study "suggested that empiric anticoagulation in patients with suspected PE should be instituted more often in the ED and that timely, therapeutic anticoagulation should be administered after the diagnosis is confirmed," the researchers wrote. "Future treatment studies of PE conducted in U.S. EDs should focus on accelerating the time frame of administration of systemic anticoagulation and fibrinolysis to patients with evidence of severe PE."

Study Confirms Parental History as Heart Attack Risk Factor

Having a parent with a history of myocardial infarction (MI) nearly doubled a person’s own risk of future MI, even after accounting for other risk factors, according to an analysis of a large case-control study.

The findings were consistent across all regions of the world, added the authors, who reported the analysis in the Feb. 1 issue of the Journal of the American College of Cardiology.

INTERHEART, a multinational case-control study, involved 15,152 patients who presented with a first MI and 14,820 age- and sex-matched control subjects between February 1999 and March 2003. Previous analysis of the study identified nine variables for determining MI risk: abnormal lipids, smoking, hypertension, diabetes, abdominal obesity, psychosocial factors, physical activity, fruit and vegetable consumption, and alcohol consumption.

The current analysis included 12,149 patients presenting with a first MI and 14,467 control subjects. Dr. Clara K. Chow of McMaster University and Hamilton Health Sciences, Hamilton, Ontario, and her colleagues obtained data on demographic factors, socioeconomic status, and risk factors for all participants, and performed genetic analysis in 8,795 participants using a panel of 1,536 single nucleotide polymorphisms (SNPs) from 103 genes believed to be associated with MI or risk factors for MI (J. Am. Coll. Cardiol. 2011;57:619-27).

At least one parent had a history of MI in 18.1% of cases and 12% of controls, while both parents had a history of MI in 2.1% of cases and 0.9% of controls. A maternal history of MI was present in 7.5% of cases and 4.9% of controls, while 12.7% of cases and 8.1% of controls had a paternal history of MI. The risks associated with a maternal and a paternal history of MI did not differ significantly.

The relationship between parental history and MI risk remained after adjusting for the nine INTERHEART risk factors.

Genetic risk scores also didn’t alter the relationship. The researchers calculated genotype scores for 3,372 cases and 4,043 controls, and found that the genetic risk scores were not greater in those with a parental history of MI than in those without. The mean genotype score was 11.90 in controls with no parental history of MI, 11.84 in controls with a parental history of MI, 12.26 in cases with no parental history of MI, and 12.14 in cases with a parental history of MI.

Parental history approximately doubled an individual’s risk of MI, and the risk increased if both parents had a history of MI, especially if the MIs occurred at a younger age.

Specifically, the odds ratio of an individual having MI was 1.81 if either parent had a history of MI (1.74 after adjusting for other risk factors). When either parent had an MI at age 50 years or older, the odds ratio fell to 1.67 – but it rose to 2.90 if both parents had an MI at age 50 or older.

The odds ratio was 2.36 for those patients with one parent who had an MI before age 50, but increased to 3.26 if both parents had an MI and one parent was younger than age 50. If both parents had an MI before age 50, the odds ratio reached 6.56.

The study results suggest that parental history of MI is an independent predictor of an individual’s risk of future MI – even after adjusting for age, sex, region, and other risk factors. The association between parental history of MI and MI is consistent across geographic regions, age, sex, and socioeconomic subgroups.

"While other studies have shown the relationship between parental history and risk, they have not established its independence from the extensive list of other potential explanatory factors, as measured by the INTERHEART Study, and not established it in other world regions or ethnic groups," the authors wrote.

The results raise another question: Are there factors other than shared risk factors and genetics that are involved in heart disease, such as early life exposures or home environmental factors? The genotype score obtained in the study analyses represents only a small percentage of possible genotype variants, the researchers noted.

The strengths of the study are its large international coverage, the large number of MI cases, and the comprehensive measurement of risk factors, including genetic factors, the researchers said. Limitations included the lack of data on siblings or other relatives, measurement of only a limited panel of SNPs in a subset of participants, and the possibility of recall bias inherent in the case-control design.

Since the 1950s and early 1960s, epidemiology studies have cited familial history as a risk factor for coronary artery disease, explained Dr. Themistocles L. Assimes. The same finding appeared in larger case-control studies in the 1970s and 1980s, yet family history was not included in the Framingham Risk Score, the risk profile established by the Framingham Heart Study. That was primarily because researchers considered the predictive value of family history limited when compared with other major risk factors, and because it was hard to measure reliably, Dr. Assimes noted in an editorial accompanying the analysis (J. Am. Coll. Cardiol. 2011;57:628-29).

Since then, research has been less likely to focus on the role of a carefully documented family history in the prevention of coronary artery disease and more likely to focus on estimating what proportion of unidentified factors responsible for familial aggregation were genetic in nature.

The latest findings from the INTERHEART Study are important because they show how measurement of family (specifically parental) history of cardiovascular disease can help predict an individual’s risk for MI, Dr. Assimes explained. And they can do so independently from other known risk factors.

Although researchers are unlikely to identify the numerous genetic and nongenetic factors responsible for family aggregation of heart disease for years, the latest findings, "combined with recent efforts to develop new risk scores around the world, have undoubtedly reignited interest in understanding the role of this traditional risk factor in the primary prevention of [coronary artery disease]," he said.

Dr. Assimes is an assistant professor at the Stanford (Calif.) University School of Medicine. He had no relationships to disclose.

FROM THE JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY

Major Finding: Parental history of myocardial infarction is an independent predictor of future MI – a finding consistent across geographic regions, age, sex, socioeconomic subgroups, and different risk groups.

Data Source: The INTERHEART study, a multinational, case-control study that enrolled 15,152 cases presenting with a first MI and 14,820 controls matched for age and sex between February 1999 and March 2003.

Disclosures: The INTERHEART study was funded by the Canadian Institutes of Health Research, the Heart and Stroke Foundation of Ontario, and the International Clinical Epidemiology Network (INCLEN). Several pharmaceutical companies – particularly AstraZeneca, Novartis, Aventis, Abbott, Bristol-Myers Squibb, King Pharma, and Sanofi-Synthelabo – provided unrestricted research grants. Various national bodies in different countries also provided funding.

Inappropriate Defibrillator Shocks Common, Increase Mortality Risk

Article Type
Changed
Wed, 12/14/2016 - 10:29
Display Headline
Inappropriate Defibrillator Shocks Common, Increase Mortality Risk

Despite proven survival benefits of implantable cardioverter defibrillators in patients with ventricular tachycardia and ventricular fibrillation, device-related shocks remain a problem and are associated with an increased risk of mortality, according to study findings in the Feb. 1 Journal of the American College of Cardiology.

Dr. Johannes B. van Rees and his colleagues at Leiden (the Netherlands) University Medical Center sought to evaluate the occurrence of inappropriate shocks, identify parameters that may predict their occurrence, and assess their long-term impact. The investigators analyzed medical records from 1,658 patients at the center who received an implantable cardioverter defibrillator (ICD) system equipped with intracardiac electrogram storage between 1996 and 2006.

A total of 114 patients were lost to follow-up; 56% of the remaining 1,544 patients received an ICD for primary prevention and 64% had ischemic heart disease.

During a mean follow-up of 41 ± 18 months, 204 patients (13%) received a total of 665 inappropriate shocks. The cumulative incidence of inappropriate shocks "steadily increased" over the follow-up period, reaching 7% at 1 year, 13% at 3 years, and 18% at 5 years (J. Am. Coll. Cardiol. 2011;57:556-62).
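
Cumulative incidence figures like these are usually read off a Kaplan-Meier-type curve for time to the first inappropriate shock. The sketch below shows that calculation in Python on hypothetical follow-up data; it is not the Leiden dataset or the authors' exact method.

# Kaplan-Meier estimate of the cumulative incidence of a first inappropriate shock.
# Follow-up times (months) and event flags below are hypothetical, for illustration.
def cumulative_incidence(times, events, horizon):
    """Return 1 - S(horizon), where S is the product-limit survival estimate.
    times  : follow-up time in months for each patient
    events : 1 if a first inappropriate shock occurred at that time, 0 if censored
    """
    # process events before censorings at tied times
    data = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    at_risk = len(data)
    survival = 1.0
    for t, e in data:
        if t > horizon:
            break
        if e:
            survival *= 1 - 1 / at_risk
        at_risk -= 1
    return 1 - survival

times  = [8, 14, 20, 26, 30, 36, 41, 44, 50, 55, 58, 60]
events = [1,  0,  0,  0,  1,  0,  0,  0,  0,  0,  0,  0]
for months in (12, 36, 60):
    print(f"{months} mo: {cumulative_incidence(times, events, months):.0%}")

Censored patients remain in the risk set until they drop out, so the estimate rises only at times when a first shock actually occurs.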

Several factors independently predicted the occurrence of inappropriate shocks, including age younger than 70 years, a history of atrial fibrillation, nonischemic heart disease, and nonuse of statins. The main cause of inappropriate shocks was misdiagnosis of supraventricular tachycardia, which occurred in 155 (76%) of the 204 patients.

There was no significant difference in the occurrence of inappropriate shocks among different types of devices. However, patients with a single-chamber ICD received significantly more shocks because of misdiagnosis of sinus tachycardia than did patients with a dual-chamber ICD (24% vs. 8%). Also, patients who had a defibrillator for cardiac resynchronization therapy tended to experience more inappropriate shocks as a result of abnormal sensing than did ICD recipients with a single-chamber ICD (15% vs. 8%), the investigators found.

Despite improvements in ICD technology, patients who underwent implantation between May 2004 and 2006 were at greater risk of inappropriate shocks than were those who received their ICD between 1996 and May 2004. The authors suggested that evolving guidelines on ICD eligibility may have meant that sicker patients received the device in later years, ultimately increasing the number of inappropriate shocks.

In what the authors considered "the most important finding," patients who experienced inappropriate shocks had a higher risk of all-cause mortality. Specifically, 298 (19%) patients died during follow-up, and after adjusting for potential confounders, the researchers found a 60% increased risk of death after a first inappropriate shock. The risk of mortality increased with each subsequent inappropriate shock, from a hazard ratio of 1.6 after the first shock to 3.7 after five inappropriate shocks.

Dr. Martin J. Schalij, one of the study’s authors, described the findings in a statement as a "serious issue" necessitating "that greater efforts be made to lower the number of these shocks."

Although two analyses from ICD clinical trials also found an association between inappropriate shocks and increased mortality, Dr. Schalij noted that the current study is the first to do so in a large, general patient population.

The researchers could not determine the cause of the increased mortality associated with the shocks. However, Dr. Schalij said that more should be done to reverse this trend. "It is not acceptable that so many patients suffer from inappropriate shocks. ICD therapy must be improved, through both patient-tailored programming of the devices and the development of superior algorithms to allow ICDs to better determine false alarms, such as supraventricular arrhythmias."

The authors reported receiving grants from Biotronik, Boston Scientific, Bristol-Myers Squibb Medical Imaging, Edwards Lifesciences, GE Healthcare, Medtronic, and St. Jude.

FROM JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY

Major Finding: Inappropriate shocks occur in 13% of implantable cardioverter defibrillator recipients, mainly because of misdiagnosis of supraventricular tachycardia, and are associated with a higher risk of all-cause mortality.

Data Source: Analysis of medical records of 1,544 patients who received an ICD system between 1996 and 2006 at the Leiden University Medical Center.

Disclosures: The authors reported receiving grants from Biotronik, Boston Scientific, Bristol-Myers Squibb Medical Imaging, Edwards Lifesciences, GE Healthcare, Medtronic, and St. Jude.

Refractive Error Similar Between Diabetics and Nondiabetics

Article Type
Changed
Wed, 12/14/2016 - 10:29
Display Headline
Refractive Error Similar Between Diabetics and Nondiabetics

MADISON, Wis. – Individuals with type 1 diabetes are slightly more likely to be nearsighted than those with type 2 diabetes, but overall rates of refractive error among adults with diabetes are similar to those reported in the general population, according to new research reported in the January issue of Archives of Ophthalmology.

Dr. Barbara E.K. Klein and her colleagues from the University of Wisconsin School of Medicine and Public Health, Madison, reported on the distribution of and change in refraction in the Wisconsin Epidemiologic Study of Diabetic Retinopathy, a large community-based study of persons with types 1 and 2 diabetes who were first identified in 1979-1980. They also examined risk factors for change in refraction.

Dr. Klein and her colleagues initially identified 10,135 diabetic persons, and then selected a sample of 2,990 individuals diagnosed with diabetes before age 30 for baseline examination (Arch. Ophthalmol. 2011;129:56-62).

For this latest analysis, the researchers used baseline examination and 10-year follow-up data of individuals aged 20 years or older at baseline. This group included 724 patients with type 1 diabetes, 1,370 patients with type 2 diabetes, and 269 patients in a nondiabetic comparison group.

The mean spherical equivalent at baseline was –1.24D in the patients with type 1 diabetes, +0.69D in those with type 2 diabetes, and –0.15D among the nondiabetic patients. The mean change in spherical equivalent was –0.28D among the type 1 patients and +0.48D among the type 2 patients. After adjusting for age and education, the researchers found a borderline significant difference in change in refraction between the groups.
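
For readers less familiar with refraction data, spherical equivalent is conventionally the sphere power plus half the cylinder power, expressed in diopters (D). The sketch below shows that arithmetic on invented refractions; the values are not study measurements.

# Spherical equivalent (SE) = sphere + cylinder / 2, in diopters (D).
# The refractions below are invented examples, not study data.
def spherical_equivalent(sphere_d, cylinder_d):
    return sphere_d + cylinder_d / 2.0

baseline = spherical_equivalent(-1.00, -0.50)    # -1.25 D (mildly myopic)
ten_year = spherical_equivalent(-1.25, -0.50)    # -1.50 D
print(f"baseline {baseline:+.2f} D, 10-year {ten_year:+.2f} D, "
      f"change {ten_year - baseline:+.2f} D")    # a -0.25 D myopic shift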

The findings among the type 2 patients were similar to those seen during a 10-year interval in another study. "Most important, we found that refraction and its correlates in adults with diabetes, regardless of type, are similar to those reported in adults without diabetes," the researchers wrote.

Although the researchers anticipated that glycemia would be an important determinant of refraction, this was not the case in either diabetic group. Nor was there a relationship between the severity of retinopathy and refractive error in either diabetic group.

Limitations of the study included the exclusion of patients who were significantly older, had more hyperopic refraction, were more likely to have nuclear cataract, and had more severe retinopathy. "Thus, these exclusions might be expected to bias the baseline estimates of refraction toward myopia," the researchers wrote. Age adjustment in subsequent analyses would likely reduce some of these effects, they noted.

Dr. Klein and her colleagues reported no financial disclosures. The research was supported by grants from the National Institutes of Health and, in part, by senior scientific investigator awards from Research to Prevent Blindness.

FROM ARCHIVES OF OPHTHALMOLOGY

PURLs Copyright

Inside the Article

Refractive Error Similar Between Diabetics and Nondiabetics

Article Type
Changed
Thu, 12/06/2018 - 20:33
Display Headline
Refractive Error Similar Between Diabetics and Nondiabetics

MADISON, Wis. – Individuals with type 1 diabetes are slightly more likely to be nearsighted than those with type 2 diabetes, but overall rates of refractive error among adults with diabetes are similar to those reported in the general population, according to new research reported in the January issue of Archives of Ophthalmology.

Dr. Barbara E.K. Klein and her colleagues from the University of Wisconsin School of Medicine and Public Health, Madison, reported on the distribution of and change in refraction in the Wisconsin Epidemiologic Study of Diabetic Retinopathy, a large community-based study of persons with types 1 and 2 diabetes who were first identified in 1979-1980. They also examined risk factors for change in refraction.

Dr. Klein and her colleagues initially identified 10,135 diabetic persons, and then selected a sample of 2,990 individuals diagnosed with diabetes before age 30 for baseline examination (Arch. Ophthalmol. 2011;129:56-62).

For this latest analysis, the researchers used baseline examination and 10-year follow-up data of individuals aged 20 years or older at baseline. This group included 724 patients with type 1 diabetes, 1,370 patients with type 2 diabetes, and 269 patients in a nondiabetic comparison group.

The mean spherical equivalent at baseline was –1.24D in the patients with type 1 diabetes, +0.69D in those with type 2 diabetes, and –0.15D among the nondiabetic patients. The mean change in spherical equivalent was –0.28D among the type 1 patients and +0.48D among the type 2 patients. After adjusting for age and education, the researchers found a borderline significant difference in change in refraction between the groups.

The findings among the type 2 patients were similar to those seen during a 10-year interval in another study. "Most important, we found that refraction and its correlates in adults with diabetes, regardless of type, are similar to those reported in adults without diabetes," the researchers wrote.

Although they anticipated that glycemia would be an important determinant in refraction, this was not the case in either diabetic group. Also, there was no relationship between the severity of retinopathy and refractive error of patients in either diabetic group.

Limitations of the study included the exclusion of patients who were significantly older, had more hyperopic refraction, were more likely to have nuclear cataract, and had more severe retinopathy. "Thus, these exclusions might be expected to bias the baseline estimates of refraction toward myopia," the researchers wrote. Age adjustment in subsequent analyses would likely reduce some of these effects, they noted.

Dr. Klein and her colleagues reported no financial disclosures. The research was supported by grants from the National Institutes of Health and, in part, by senior scientific investigator awards from Research to Prevent Blindness.



Major Finding: Individuals with type 1 diabetes are slightly more likely to be nearsighted than those with type 2 diabetes, but overall rates of refractive error among adults with diabetes are similar to those reported in the general population.

Data Source: The Wisconsin Epidemiologic Study of Diabetic Retinopathy, a population-based survey of diabetic persons residing and receiving their health care in southern Wisconsin.

Disclosures: The researchers reported no financial disclosures. The research was supported by grants from the National Institutes of Health and, in part, by senior scientific investigator awards from Research to Prevent Blindness.

Blood Vessel Diameter May Predict Progression of Diabetic Retinopathy

Article Type
Changed
Wed, 12/14/2016 - 10:29
Display Headline
Blood Vessel Diameter May Predict Progression of Diabetic Retinopathy

Central retinal venular diameter may help predict the progression of retinal disease in black patients with type 1 diabetes, especially those with less-severe baseline diabetic retinopathy, according to research published in the January 2011 issue of the Archives of Ophthalmology.

These results may provide clinicians with another early indicator of progression to more severe forms of diabetic retinopathy.

Dilation of the retinal venules has been associated with diabetic retinopathy (DR), but cross-sectional studies have not indicated whether these changes simply reflect disease severity or may help predict progression of DR.

Dr. Monique S. Roy of the New Jersey Medical School, Newark, and her colleagues examined 725 black patients with type 1 diabetes who participated in the New Jersey 725 study in 1993-1998, and reexamined a cohort of 468 of these patients 6 years later (Arch. Ophthalmol. 2011;129:8-15). The exams included seven-field retinal photographs that were evaluated in a masked fashion. Also, graders used a computer-assisted technique to measure retinal vessel diameters.

In the 468 follow-up patients, the mean central retinal arteriolar equivalent (CRAE) and central retinal venular equivalent (CRVE) were 168.8 μm and 254.2 μm, respectively. The study found no association between CRAE and incident DR outcomes, even after adjustment for other risk factors. However, a larger CRVE at baseline was associated with triple the risk of progression to proliferative diabetic retinopathy (PDR), or to PDR with high-risk characteristics, even when the researchers adjusted for other risk factors.
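
CRAE and CRVE are single summary calibers condensed from the widths of the individual arterioles and venules measured around the optic disc. The report states only that a validated computerized program was used, so the sketch below is not the study's method; it shows the general iterative-pairing approach of the widely used revised Knudtson formulas (largest vessel paired with smallest, combined with a constant of about 0.88 for arterioles and 0.95 for venules), and the vessel widths are hypothetical.

    # Minimal sketch of forming a summary caliber (such as CRVE) from the six
    # largest vessels, following the general iterative-pairing scheme of the
    # revised Knudtson formulas. The constants (0.88 arterioles, 0.95 venules)
    # and the widths are assumptions for illustration only; the New Jersey 725
    # analysis used its own validated computerized program.
    import math

    def summary_caliber(widths_um, constant):
        """Repeatedly pair the largest remaining width with the smallest,
        combining each pair as constant * sqrt(big**2 + small**2); an odd
        leftover width is carried into the next round."""
        widths = sorted(widths_um)
        while len(widths) > 1:
            next_round = []
            while len(widths) > 1:
                small = widths.pop(0)
                big = widths.pop(-1)
                next_round.append(constant * math.sqrt(big ** 2 + small ** 2))
            next_round.extend(widths)   # carry any unpaired middle width
            widths = sorted(next_round)
        return widths[0]

    # Hypothetical widths (micrometers) of the six largest venules in one eye.
    venule_widths = [120.0, 115.0, 108.0, 95.0, 88.0, 80.0]
    print(f"Illustrative CRVE: {summary_caliber(venule_widths, 0.95):.1f} um")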

"The relative dilation of the retinal veins seen in DR and other retinopathies associated with ischemia has been variously interpreted," the researchers said. "Wider retinal venules may reflect metabolic changes associated with [diabetes mellitus], such as increased lactic acidosis."

Strengths of the study include its prospective design with high rates of follow-up for a large cohort of well-characterized black patients with type 1 diabetes, the use of standardized protocols to document potential confounding variables, the masked grading of DR using stereoscopic seven-field retinal photographs, and measurements of the retinal vascular diameters with a validated computerized program, the researchers said.

However, they cautioned that measuring retinal vessel diameter from color retinal photographs may underestimate the true vascular width, because only the red blood cell column is measured. Also, the greater retinal pigmentation present in black patients may lead to overestimation of vessel diameters.

"It remains to be seen whether such a measure may be used in the future to monitor treatments for [diabetes] and other vascular diseases that specifically target the microvasculature," the researchers noted.

The researchers had no financial disclosures. The research was supported by grants from the National Eye Institute and by a Lew Wasserman Merit Award from Research to Prevent Blindness Inc.

A New Tool to Predict Progression of Diabetic Retinopathy?

Larger retinal venular diameter, or a generalized venular dilatation, may help predict the progression of diabetic retinopathy, even after accounting for other known risk factors. Although the underlying mechanisms are unclear, research suggests that retinal venous dilatation may indicate ocular ischemia, systemic inflammation, and/or endothelial dysfunction.

Before applying these findings to clinical practice, however, further studies are needed to investigate this link between venular diameter and progression of diabetic retinopathy. Also, newer, automated software programs for improving precision and reliability are still under development, and more evidence is needed to determine whether measurement of retinal venular diameter incrementally improves prediction over conventional methods, reclassifies a patient’s risk, changes treatment options, improves outcomes, and is cost effective.
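
One of these criteria, risk reclassification, is often quantified with the categorical net reclassification improvement; the sketch below shows how that statistic is assembled, using hypothetical counts that are not drawn from the New Jersey 725 study or the commentary.

    # Minimal sketch of the categorical net reclassification improvement (NRI),
    # one common way to ask whether adding a new marker (for example, venular
    # caliber) moves patients into more accurate risk categories. All counts
    # below are hypothetical.
    def net_reclassification_improvement(events, nonevents):
        """events/nonevents map 'up', 'down', 'same' to counts of patients whose
        predicted risk category moved after the new marker was added."""
        nri_events = (events["up"] - events["down"]) / sum(events.values())
        nri_nonevents = (nonevents["down"] - nonevents["up"]) / sum(nonevents.values())
        return nri_events + nri_nonevents

    # Hypothetical reclassification counts after adding a venular-caliber term.
    progressed = {"up": 30, "down": 10, "same": 60}
    did_not_progress = {"up": 40, "down": 70, "same": 290}
    print(f"NRI: {net_reclassification_improvement(progressed, did_not_progress):+.3f}")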

Still, these findings are important, and further development of retinal vascular imaging technology may improve prediction of progression of DR.

Tien Yin Wong, M.D., is with the Singapore Eye Research Institute at the National University of Singapore. Dr. Wong had no financial disclosures. The remarks were part of an accompanying editorial commentary (Arch. Ophthalmol. 2011;129:95-6).

