Study eyed endoscopic submucosal dissection for early-stage esophageal cancer


For patients with early-stage esophageal squamous cell carcinoma, minimally invasive endoscopic submucosal dissection (ESD) led to significantly fewer severe adverse perioperative events than esophagectomy and was associated with similar rates of all-cause mortality and cancer recurrence or metastasis, according to the findings of a single-center retrospective cohort study.

After a median of 21 months of follow-up (range, 6-73 months), rates of all-cause mortality were 7% with ESD and 11% with esophagectomy, said Yiqun Zhang of Zhongshan Hospital, Shanghai, China, and his associates. Rates of cancer recurrence or metastasis were 9.1% and 8.9%, respectively, while disease-specific mortality was lower with ESD (3.4% vs. 7.4% with esophagectomy; P = .049). Severe nonfatal adverse perioperative events occurred in 15% of ESD cases versus 28% of esophagectomy cases (P less than .001). The findings justify more studies of ESD in carefully selected patients with early-stage (T1a-m2/m3 or T1b) esophageal squamous cell carcinoma, the researchers wrote in Clinical Gastroenterology and Hepatology.

Esophagectomy is standard for managing early-stage esophageal squamous cell carcinoma but is associated with high rates of morbidity and mortality. While ESD is minimally invasive, it is considered risky because esophageal squamous cell carcinoma so frequently metastasizes to the lymph nodes, the investigators noted. For the study, they retrospectively compared 322 ESDs and 274 esophagectomies performed during 2011-2016 in patients with T1a-m2/m3 or T1b esophageal squamous cell carcinoma. All cases were pathologically confirmed, and none were premalignant (that is, high-grade intraepithelial neoplasias).

Endoscopic submucosal dissection was associated with significantly lower rates of esophageal fistula (0.3% with ESD vs. 16% with esophagectomy; P less than .001) and pulmonary complications (0.3% vs. 3.6%, respectively; P less than .001), which explained its overall superiority in terms of severe adverse perioperative events, the researchers wrote. Perioperative deaths were rare but occurred more often with esophagectomy (four patients) than with ESD (one patient). Depth of tumor invasion was the only significant correlate of all-cause mortality (hazard ratio for T1a–m3 or deeper tumors versus T1a–m2 tumors, 3.54; 95% confidence interval, 1.08-11.62; P = .04) in a Cox regression analysis that accounted for many potential confounders, such as demographic and tumor characteristics, hypertension, chronic obstructive pulmonary disease (COPD), nodal metastasis, chemotherapy, and radiotherapy.

Perhaps esophagectomy did not improve survival in this retrospective study because follow-up time was too short, because adjuvant therapy compensated for the increased risk of tumor relapse with ESD, or because of the confounding effects of unmeasured variables, such as submucosal stages of T1b cancer, lymphovascular invasion, or tumor morphology, the researchers wrote. “Since a randomized study comparing esophagectomy and ESD alone would not be practical, a potential strategy for future research may include serial treatments – that is, ESD first, followed by esophagectomy, radiotherapy, or chemotherapy, depending on the ESD pathology findings,” they added. “A quality-of-life analysis of ESD would also be helpful because this might be one of the biggest advantages of ESD over esophagectomy and was beyond the scope of this study.”

The study was supported by the National Natural Science Foundation of China, the Shanghai Committee of Science and Technology, and Zhongshan Hospital. The investigators reported having no relevant conflicts of interest.

 

SOURCE: Zhang Y et al. Clin Gastroenterol Hepatol. 2018 Apr 25. doi: 10.1016/j.cgh.2018.04.038.

Use ESD for early-stage esophageal cancer

This study adds more evidence supporting the use of endoscopic submucosal dissection (ESD) in early esophageal cancer. Unlike esophageal adenocarcinoma, esophageal squamous cell carcinoma (ESCC) has a higher risk of lymph node metastasis and tends to be multifocal. ESCC lesions invading the submucosa (T1b) have the highest risk of lymph node metastasis (up to 60% in lesions with deep submucosal invasion). 

Historically, endoscopic resection was reserved for mucosal tumors while submucosal tumors were managed surgically. Several trials have investigated the role of ESD in ESCC limited to the mucosa with excellent results. However, data for ESCC invading the submucosa (T1b lesions) are lacking. This study included 596 patients; almost half (282) had T1b lesions. Although most T1b lesions were treated surgically (200 patients), a sizable cohort of 82 T1b ESCC lesions was treated with ESD.

Interestingly, there was no difference in tumor recurrence or overall mortality in patients treated with ESD, compared with surgery for both mucosal and submucosal lesions. 
Another interesting finding in this study was the use of adjuvant treatment such as radiotherapy and chemotherapy for patients treated with ESD who were found to have evidence of lymphovascular invasion. The outcome of this subset of patients was not different from patients who underwent esophagectomy. Recent evidence from this study and other published data suggest that there is a subset of submucosal ESCC lesions that can be managed endoscopically, especially submucosal lesions limited to the upper third of the submucosa. Further studies investigating the role of adjuvant treatment after ESD for deep submucosal lesions or lesions with lymphovascular invasion are needed.

Mohamed O. Othman, MD, is an associate professor of medicine, director of advanced endoscopy, and chief of the section of gastroenterology, Baylor College of Medicine, Houston. He is a consultant for Olympus and Boston Scientific.
 




FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Vitals

Key clinical point: Compared with esophagectomy, endoscopic submucosal dissection (ESD) was associated with significantly fewer severe adverse perioperative events and a similar rate of all-cause mortality in patients with early-stage esophageal squamous cell carcinoma.


Major finding: After a median of 21 months of follow-up, rates of all-cause mortality were 7% with ESD and 11% with esophagectomy (P = .21). Severe adverse perioperative events occurred in 15% of ESDs and 28% of esophagectomies.

Study details: Retrospective study of 596 patients with T1a-m2/m3 or T1b esophageal squamous cell carcinoma.

Disclosures: The study was supported by the National Natural Science Foundation of China, the Shanghai Committee of Science and Technology, and Zhongshan Hospital. The investigators reported having no relevant conflicts of interest.

Source: Zhang Y et al. Clin Gastroenterol Hepatol. 2018 Apr 25. doi: 10.1016/j.cgh.2018.04.038.


Topical retinoid found as effective as microneedling for acne scars


 

The topical retinoid tazarotene could be an efficacious and practical alternative to microneedling for treating atrophic postacne scarring, according to a new study.

In a prospective, randomized, split-face study of adults with postacne scarring, both treatments resulted in similar efficacy after 6 months, reported T.P. Afra, MD, and associates from the department of dermatology, venereology, and leprology at the Postgraduate Institute of Medical Education and Research in Chandigarh, India. While the clinical usefulness of microneedling as a procedure for postacne scarring is well established, research exploring the effectiveness of topical therapies for acne scarring that could be used at home is lacking. “A home-based topical treatment with a comparable efficacy to microneedling and that is well tolerated would be a useful addition in the armamentarium of acne scar management,” they wrote in the study, published in JAMA Facial Plastic Surgery.

The study included 34 patients, aged 18-30 years, with grade 2-4 facial atrophic acne scars at their initial visit to the research team’s skin clinic. One side of each participant’s face was randomized to receive microneedling for four sessions over 3 months (using a dermaroller with 1.5-mm needles). Topical tazarotene gel 0.1%, a retinoid approved by the Food and Drug Administration for mild to moderate facial acne, was applied to the other side of the face once nightly during the same period. Almost 81% of patients had skin phototype IV; the rest had type III or V. Patients followed up every month for 3 months, then at 6 months.

Changes in acne scar severity from baseline, the primary outcome, were assessed using Goodman and Baron quantitative and qualitative scores and a subjective dermatologist score. Patient satisfaction measured with a Patient Global Assessment (PGA) score and adverse events were secondary outcomes.

In 31 patients (91.2%), quantitative acne scar severity scores improved significantly from baseline to the 6-month visit with both treatments: a median improvement of 3 on the sides of the face treated with microneedling and 2.5 on the sides treated with tazarotene (between-group comparison, P = .42). The qualitative acne scar severity score did not significantly improve with either treatment, the investigators noted.

The median improvement in the independent dermatologist score was also comparable for both methods at 3 and 6 months.

At 6 months, improvement in the mean PGA score was “slightly but significantly superior” for the microneedling treatment, compared with that for tazarotene (mean of 5.86 vs. 5.76, respectively; P less than .001), with both falling into the “satisfactory” range for the PGA, the investigators wrote. They also noted a positive correlation between previous exposure to oral isotretinoin and patient satisfaction.

“Although collagen accumulation has been considered a drawback of isotretinoin therapy owing to the development of hypertrophic scars, the better atrophic acne scar outcomes observed for both the present treatment groups in patients with a history of isotretinoin treatment indicates that the collagen accumulation in this case may actually be beneficial,” they wrote.

The topical retinoid was well tolerated by participants, with less than a third reporting dryness and scaling, and adverse effects associated with microneedling were described as “minimal.”

“The use of a modality such as tazarotene that prevents acne flares while addressing acne scarring is a practical addition to clinical practice,” the investigators concluded. “Tazarotene gel 0.1% would be a useful alternative to microneedling in the management of atrophic acne scars. Such a home-based medical management option for acne scarring may decrease physician dependence and health care expenditures for patients with postacne scarring.”

The study authors noted that, as collagen remodeling is a continuous process lasting more than 1 year, a limitation of their study was its short follow-up of 6 months. However, a strength of the study was its use of validated acne scar severity scoring tools as well as patient and physician assessment of scar improvement in the outcome assessments.

The authors had no disclosures to report.

SOURCE: Afra TP et al. JAMA Facial Plast Surg. 2018 Nov 15. doi: 10.1001/jamafacial.2018.1404.


 


 

The topical retinoid tazarotene could be an efficacious and practical alternative to microneedling for treating atrophic postacne scarring, according to a new study.

In a prospective, randomized, split-face study of adults with postacne scarring, both treatments resulted in similar efficacy after 6 months, reported T.P. Afra, MD, and associates from the department of dermatology, venereology, and leprology at the Postgraduate Institute of Medical Education and Research in Chandigarh, India. While the clinical usefulness of microneedling as a procedure for postacne scarring is well established, research exploring the effectiveness of topical therapies for acne scarring that could be used at home is lacking. “A home-based topical treatment with a comparable efficacy to microneedling and that is well tolerated would be a useful addition in the armamentarium of acne scar management,” they wrote in the study, published in JAMA Facial Plastic Surgery.

The study included 34 patients, aged 18-30 years, with grade 2-4 facial atrophic acne scars at their initial visit to the research team’s skin clinic. One side of each participants face was randomized to receive microneedling treatment for four sessions over 3 months (using a dermaroller with 1.5-mm needles). Topical tazarotene gel 0.1%, a retinoid approved by the Food and Drug Administration as a treatment for mild to moderate facial acne, was applied to the other side of their face once a night during the same time. Almost 81% were skin phototypes IV, the rest were type III or V. Patients followed up every month for 3 months, then at 6 months.

Changes in acne scar severity from baseline, the primary outcome, were assessed using Goodman and Baron quantitative and qualitative scores and a subjective dermatologist score. Patient satisfaction measured with a Patient Global Assessment (PGA) score and adverse events were secondary outcomes.

In 31 patients (91.2%), overall improvements from baseline to the 6-month visit in quantitative acne scar severity scores for both treatments were seen, with significant improvements from baseline to 6 months: A median improvement of 3 on the sides of the face treated with microneedling and a median improvement of 2.5 on the sides of the face treated with tazarotene (between-group comparison, P = .42). The qualitative acne scar severity score did not significantly improve with either treatment, the investigators noted.

The median improvement in the independent dermatologist score was also comparable for both methods at 3 and 6 months.

At 6 months, improvement in the mean PGA score was “slightly but significantly superior” for the microneedling treatment, compared with that for tazarotene (mean of 5.86 vs. 5.76, respectively; P less than .001), with both falling into the “satisfactory” range for the PGA, the investigators wrote. They also noted a positive correlation between previous exposure to oral isotretinoin and patient satisfaction.

“Although collagen accumulation has been considered a drawback of isotretinoin therapy owing to the development of hypertrophic scars, the better atrophic acne scar outcomes observed for both the present treatment groups in patients with a history of isotretinoin treatment indicates that the collagen accumulation in this case may actually be beneficial,” they wrote.

The topical retinoid was well tolerated by participants, with less than a third reporting dryness and scaling, and adverse effects associated with microneedling were described as “minimal.”

“The use of a modality such as tazarotene that prevents acne flares while addressing acne scarring is a practical addition to clinical practice,” the investigators concluded. “Tazarotene gel 0.1% would be a useful alternative to microneedling in the management of atrophic acne scars. Such a home-based medical management option for acne scarring may decrease physician dependence and health care expenditures for patients with postacne scarring.”

The study authors noted that, because collagen remodeling is a continuous process lasting more than 1 year, a limitation of their study was its short follow-up of 6 months. However, a strength of the study was its use of validated acne scar severity scoring tools, as well as patient and physician assessments of scar improvement, in the outcome assessments.

The authors had no disclosures to report.

SOURCE: Afra TP et al. JAMA Facial Plast Surg. 2018 Nov 15. doi: 10.1001/jamafacial.2018.1404.

Vitals

Key clinical point: The topical retinoid tazarotene could be a home-based option for treating atrophic acne scarring.

Major finding: Improvements in acne scarring were similar with microneedling and nightly applications of tazarotene gel 0.1% after 6 months.

Study details: A prospective, observer-blinded, split-face, randomized, clinical trial involving 34 patients with grade 2-4 facial atrophic postacne scars.

Disclosures: The authors had no disclosures to report.

Source: Afra TP et al. JAMA Facial Plast Surg. 2018 Nov 15. doi: 10.1001/jamafacial.2018.1404.


NIH director expresses concern over CRISPR-cas9 baby claim

Article Type
Changed
Fri, 01/18/2019 - 18:08

 

The National Institutes of Health is deeply concerned about the work just presented at the Second International Summit on Human Genome Editing in Hong Kong by Dr. He Jiankui, who described his effort using CRISPR-Cas9 on human embryos to disable the CCR5 gene. He claims that the two embryos were subsequently implanted, and infant twins have been born.

Dr. Francis S. Collins

This work represents a deeply disturbing willingness by Dr. He and his team to flout international ethical norms. The project was largely carried out in secret, the medical necessity for inactivation of CCR5 in these infants is utterly unconvincing, the informed consent process appears highly questionable, and the possibility of damaging off-target effects has not been satisfactorily explored. It is profoundly unfortunate that the first apparent application of this powerful technique to the human germline has been carried out so irresponsibly.

The need for development of binding international consensus on setting limits for this kind of research, now being debated in Hong Kong, has never been more apparent. Without such limits, the world will face the serious risk of a deluge of similarly ill-considered and unethical projects.

Should such epic scientific misadventures proceed, a technology with enormous promise for prevention and treatment of disease will be overshadowed by justifiable public outrage, fear, and disgust.

Lest there be any doubt, and as we have stated previously, NIH does not support the use of gene-editing technologies in human embryos.

Francis S. Collins, M.D., Ph.D., is director of the National Institutes of Health. His comments were made in a statement Nov. 28.


Less-distressed patients driving increase in outpatient services

Article Type
Changed
Fri, 01/18/2019 - 18:08

 

Adults with less-severe psychological distress accounted for most of the recent increase in the use of outpatient mental health services, based on survey data from nearly 140,000 adults.

“Rising national rates of suicide, opioid misuse, and opioid-related deaths further suggest increasing psychological distress,” wrote Mark Olfson, MD, MPH, of Columbia University, New York, and his colleagues. “However, it is not known whether or to what extent an increase in mental health treatment has occurred in response to rising rates of psychological distress.”

Dr. Olfson and his colleagues reviewed data from the Medical Expenditure Panel Surveys for the years 2004-2005, 2009-2010, and 2014-2015. Overall, 19% of adults received outpatient mental health services in 2004-2005; the percentage increased to 23% in 2014-2015. About half of the study subjects were women, 67% were white, and the average age was 46 years.

The total percentage of adults with serious psychological distress decreased from 5% in 2004-2005 to 4% in 2014-2015, the researchers noted, although those with serious psychological distress had a greater proportionate increase in the use of outpatient services during the study period, from 54% to 68%.

Serious psychological distress was more likely in women, compared with men, and in older and middle-aged adults, compared with younger adults. The number of adults with less-serious distress or no distress who were treated with outpatient mental health services increased from 35 million in 2004-2005 to 48 million in 2014-2015, the researchers wrote in JAMA Psychiatry.

The study results were limited by several factors, including the partial reliance on self-reports of mental health care use and on the limitations of the Kessler 6 scale as an assessment of psychological distress. Other limitations included an absence of data on the specific services used and on the effectiveness of treatments. However, the results suggest that, despite increases in outpatient mental health treatment, many adults with serious psychological distress received no mental health care, they wrote. Individuals with more-severe distress might view mental health care less favorably. In addition, the investigators emphasized the need for continued improvement in general medical settings for detecting and treating or referring adults for mental health service.

Dr. Olfson reported no disclosures. One of the coauthors, Steven C. Marcus, PhD, reported receiving consulting fees from several pharmaceutical companies. The study was supported in part by the National Institutes of Health and the New York State Psychiatric Institute. The Medical Expenditure Panel Surveys are sponsored by the Agency for Healthcare Research and Quality.

SOURCE: Olfson M et al. JAMA Psychiatry. 2018 Nov 28. doi: 10.1001/jamapsychiatry.2018.3550.

Vitals

Key clinical point: Overall use of outpatient mental health services is increasing, but most patients report less-severe or no psychological distress.

Major finding: The percentage of U.S. adults receiving outpatient mental health services increased from 19% in 2004-2005 to 23% in 2014-2015.

Study details: The data come from a review of nationally representative surveys taken in 2004-2005, 2009-2010, and 2014-2015 for a total of 139,862 adults aged 18 years and older.

Disclosures: Dr. Olfson reported no disclosures. One of the coauthors, Steven C. Marcus, PhD, reported receiving consulting fees from several pharmaceutical companies. The study was supported in part by the National Institutes of Health and the New York State Psychiatric Institute. The Medical Expenditure Panel Surveys are sponsored by the Agency for Healthcare Research and Quality.

Source: Olfson M et al. JAMA Psychiatry. 2018 Nov 28. doi: 10.1001/jamapsychiatry.2018.3550.


Tested: U.S. News & World Report hospital rankings

Article Type
Changed
Thu, 03/28/2019 - 14:32

Do the U.S. News & World Report “Best Hospitals” rankings stand up to scrutiny? When it comes to cardiovascular care, the answer is yes … and no.

Do better rankings mean better cardiovascular care?

The hospitals that were ranked as the Top 50 for “cardiology and heart surgery” had lower 30-day mortality for acute MI, heart failure, and coronary artery bypass grafting (CABG), compared with 3,502 nonranked hospitals, when David E. Wang, MD, and his associates at Brigham and Women’s Hospital in Boston looked at the Centers for Medicare & Medicaid Services Hospital Compare website.

The Top 50 hospitals also had higher patient satisfaction scores (3.9 vs. 3.3 on a scale of 1-5), based on the CMS Hospital Consumer Assessment of Healthcare Providers and Systems star ratings, the investigators said Nov. 28 in JAMA Cardiology.

A clear endorsement for the rankings, it would seem, but another run through the Hospital Compare data – this time for 30-day readmission rates – managed to muddy things up. The nonranked hospitals equaled the ranked hospitals in readmission rates for acute MI and CABG and were actually lower for heart failure, Dr. Wang and his associates said.

“In recent years, financial incentives for hospitals to reduce readmissions … have been 10- to 15-fold greater than incentives to improve mortality rates and have resulted in significant declines in cardiovascular readmissions. Our finding that top-ranked hospitals have lower mortality rates than nonranked hospitals but have generally similar readmission rates might reflect these incentives,” they wrote.

SOURCE: Wang DE et al. JAMA Cardiol. 2018 Nov 28. doi: 10.1001/jamacardio.2018.3951.


Omega-3 fatty acid supplementation reduces risk of preterm birth

Article Type
Changed
Fri, 01/18/2019 - 18:08

 

Taking omega-3 long-chain polyunsaturated fatty acids during pregnancy was associated with a reduced risk of preterm birth and may also reduce the risks of low birth weight and of neonatal intensive care admission, according to a Cochrane review of 70 randomized controlled trials.

A pregnant woman taking pills
Creatas Images

“There are not many options for preventing premature birth, so these new findings are very important for pregnant women, babies, and the health professionals who care for them,” Philippa Middleton, MPH, PhD, of Cochrane Pregnancy and Childbirth Group and the South Australian Health and Medical Research Institute, in Adelaide, stated in a press release. “We don’t yet fully understand the causes of premature labor, so predicting and preventing early birth has always been a challenge. This is one of the reasons omega-3 supplementation in pregnancy is of such great interest to researchers around the world.”

Dr. Middleton and her colleagues performed a search of the Cochrane Pregnancy and Childbirth’s Trials Register, ClinicalTrials.gov, and the WHO International Clinical Trials Registry Platform and identified 70 randomized controlled trials (RCTs) where 19,927 women at varying levels of risk for preterm birth received omega-3 long-chain polyunsaturated fatty acids (LCPUFA), placebo, or no omega-3.

“Many pregnant women in the UK are already taking omega-3 supplements by personal choice rather than as a result of advice from health professionals,” Dr. Middleton said in the release. “It’s worth noting though that many supplements currently on the market don’t contain the optimal dose or type of omega-3 for preventing premature birth. Our review found the optimum dose was a daily supplement containing between 500 and 1,000 milligrams of long-chain omega-3 fats (containing at least 500 mg of DHA [docosahexaenoic acid]) starting at 12 weeks of pregnancy.”

In 26 RCTs (10,304 women), the risk of preterm birth before 37 weeks was 11% lower for women who took omega-3 LCPUFA, compared with women who did not take omega-3 (relative risk, 0.89; 95% confidence interval, 0.81-0.97), while in 9 RCTs (5,204 women) the risk of preterm birth before 34 weeks was 42% lower for women who took omega-3 (RR, 0.58; 95% CI, 0.44-0.77).

With regard to infant health, use of omega-3 LCPUFA during pregnancy was associated in 10 RCTs (7,416 women) with a potentially reduced risk of perinatal mortality (RR, 0.75; 95% CI, 0.54-1.03) and, in 9 RCTs (6,920 women), a possibly reduced risk of neonatal intensive care admission (RR, 0.92; 95% CI, 0.83-1.03). The researchers noted that omega-3 use in 15 trials (8,449 women) was associated with fewer babies with low birth weight (RR, 0.90; 95% CI, 0.82-0.99) but a possible increase in babies who were large for their gestational age in 3,722 women from 6 RCTs (RR, 1.15; 95% CI, 0.97-1.36). There was no significant difference among groups with regard to babies born small for their gestational age or in uterine growth restriction, they said.

Among maternal outcomes, Dr. Middleton and her colleagues found no significant differences between groups in factors such as postterm induction, serious adverse events, admission to intensive care, and postnatal depression.

“Ultimately, we hope this review will make a real contribution to the evidence base we need to reduce premature births, which continue to be one of the most pressing and intractable maternal and child health problems in every country around the world,” Dr. Middleton said.

The National Institutes of Health funded the review. The authors reported no conflicts of interest.

SOURCE: Middleton P et al. Cochrane Database Syst Rev. 2018; doi: 10.1002/14651858.CD003402.pub3.

Publications
Topics
Sections

 

Taking omega-3 long-chain polyunsaturated fatty acids during pregnancy was associated with reduced risk of preterm birth, and also may reduce the risk of babies born at a low birth weight and risk of requiring neonatal intensive care, according to a Cochrane review of 70 randomized controlled trials.

A pregnant woman taking pills
Creatas Images

“There are not many options for preventing premature birth, so these new findings are very important for pregnant women, babies, and the health professionals who care for them,” Philippa Middleton, MPH, PhD, of Cochrane Pregnancy and Childbirth Group and the South Australian Health and Medical Research Institute, in Adelaide, stated in a press release. “We don’t yet fully understand the causes of premature labor, so predicting and preventing early birth has always been a challenge. This is one of the reasons omega-3 supplementation in pregnancy is of such great interest to researchers around the world.”

Dr. Middleton and her colleagues performed a search of the Cochrane Pregnancy and Childbirth’s Trials Register, ClinicalTrials.gov, and the WHO International Clinical Trials Registry Platform and identified 70 randomized controlled trials (RCTs) where 19,927 women at varying levels of risk for preterm birth received omega-3 long-chain polyunsaturated fatty acids (LCPUFA), placebo, or no omega-3.

“Many pregnant women in the UK are already taking omega-3 supplements by personal choice rather than as a result of advice from health professionals,” Dr. Middleton said in the release. “It’s worth noting though that many supplements currently on the market don’t contain the optimal dose or type of omega-3 for preventing premature birth. Our review found the optimum dose was a daily supplement containing between 500 and 1,000 milligrams of long-chain omega-3 fats (containing at least 500 mg of DHA [docosahexaenoic acid]) starting at 12 weeks of pregnancy.”

In 26 RCTs (10,304 women), the risk of preterm birth under 37 weeks was 11% lower for women who took omega-3 LCPUFA compared with women who did not take omega-3 (relative risk, 0.89; 95% confidence interval, 0.81-0.97), while the risk for preterm birth under 34 weeks in 9 RCTs (5,204 women) was 42% lower for women compared with women who did not take omega-3 (RR, 0.58; 95% CI, 0.44-0.77).

With regard to infant health, use of omega-3 LCPUFA during pregnancy was associated in 10 RCTs (7,416 women) with a potential reduced risk of perinatal mortality (RR, 0.75; 95% CI, 0.54-1.03) and, in 9 RCTs (6,920 women), a reduced risk of neonatal intensive care admission (RR, 0.92; 95% CI, 0.83-1.03). The researchers noted that omega-3 use in 15 trials (8,449 women) was potentially associated with a reduced number of babies with low birth weight (RR, 0.90; 95% CI, 0.82-0.99), but an increase in babies who were large for their gestational age in 3,722 women from 6 RCTs (RR, 1.15; 95% CI, 0.97-1.36). There was no significant difference among groups with regard to babies who were born small for their gestational age or in uterine growth restriction, they said.

While maternal outcomes were examined, Dr. Middleton and her colleagues found no significant differences between groups in factors such as postterm induction, serious adverse events, admission to intensive care, and postnatal depression.

“Ultimately, we hope this review will make a real contribution to the evidence base we need to reduce premature births, which continue to be one of the most pressing and intractable maternal and child health problems in every country around the world,” Dr. Middleton said.

The National Institutes of Health funded the review. The authors reported no conflicts of interest.

SOURCE: Middleton P et al. Cochrane Database Syst Rev. 2018; doi: 10.1002/14651858.CD003402.pub3.

 

Taking omega-3 long-chain polyunsaturated fatty acids during pregnancy was associated with reduced risk of preterm birth, and also may reduce the risk of babies born at a low birth weight and risk of requiring neonatal intensive care, according to a Cochrane review of 70 randomized controlled trials.

A pregnant woman taking pills
Creatas Images

“There are not many options for preventing premature birth, so these new findings are very important for pregnant women, babies, and the health professionals who care for them,” Philippa Middleton, MPH, PhD, of Cochrane Pregnancy and Childbirth Group and the South Australian Health and Medical Research Institute, in Adelaide, stated in a press release. “We don’t yet fully understand the causes of premature labor, so predicting and preventing early birth has always been a challenge. This is one of the reasons omega-3 supplementation in pregnancy is of such great interest to researchers around the world.”

Dr. Middleton and her colleagues performed a search of the Cochrane Pregnancy and Childbirth’s Trials Register, ClinicalTrials.gov, and the WHO International Clinical Trials Registry Platform and identified 70 randomized controlled trials (RCTs) where 19,927 women at varying levels of risk for preterm birth received omega-3 long-chain polyunsaturated fatty acids (LCPUFA), placebo, or no omega-3.

“Many pregnant women in the UK are already taking omega-3 supplements by personal choice rather than as a result of advice from health professionals,” Dr. Middleton said in the release. “It’s worth noting though that many supplements currently on the market don’t contain the optimal dose or type of omega-3 for preventing premature birth. Our review found the optimum dose was a daily supplement containing between 500 and 1,000 milligrams of long-chain omega-3 fats (containing at least 500 mg of DHA [docosahexaenoic acid]) starting at 12 weeks of pregnancy.”

In 26 RCTs (10,304 women), the risk of preterm birth under 37 weeks was 11% lower for women who took omega-3 LCPUFA compared with women who did not take omega-3 (relative risk, 0.89; 95% confidence interval, 0.81-0.97), while the risk for preterm birth under 34 weeks in 9 RCTs (5,204 women) was 42% lower for women compared with women who did not take omega-3 (RR, 0.58; 95% CI, 0.44-0.77).

With regard to infant health, use of omega-3 LCPUFA during pregnancy was associated in 10 RCTs (7,416 women) with a possible reduction in perinatal mortality (RR, 0.75; 95% CI, 0.54-1.03) and, in 9 RCTs (6,920 women), a possible reduction in neonatal intensive care admission (RR, 0.92; 95% CI, 0.83-1.03), although neither confidence interval excluded 1. The researchers noted that omega-3 use in 15 trials (8,449 women) was associated with fewer babies with low birth weight (RR, 0.90; 95% CI, 0.82-0.99) but a possible increase in babies who were large for their gestational age in 3,722 women from 6 RCTs (RR, 1.15; 95% CI, 0.97-1.36). There was no significant difference between groups with regard to babies born small for their gestational age or in intrauterine growth restriction, they said.

While maternal outcomes were examined, Dr. Middleton and her colleagues found no significant differences between groups in factors such as postterm induction, serious adverse events, admission to intensive care, and postnatal depression.

“Ultimately, we hope this review will make a real contribution to the evidence base we need to reduce premature births, which continue to be one of the most pressing and intractable maternal and child health problems in every country around the world,” Dr. Middleton said.

The National Institutes of Health funded the review. The authors reported no conflicts of interest.

SOURCE: Middleton P et al. Cochrane Database Syst Rev. 2018; doi: 10.1002/14651858.CD003402.pub3.


FROM COCHRANE DATABASE OF SYSTEMATIC REVIEWS

Vitals

 

Key clinical point: Omega-3 fatty acid supplementation was associated with a lower risk of birth before 37 weeks and before 34 weeks, compared with no omega-3 supplementation.

Major finding: In 26 randomized controlled trials (10,304 women), the risk of preterm birth before 37 weeks was 11% lower, and in 9 RCTs (5,204 women) the risk of preterm birth before 34 weeks was 42% lower, for women taking omega-3 compared with women not taking it.

Study details: A Cochrane review of 70 RCTs with a total of 19,927 women at varying levels of risk for preterm birth who received omega-3 long-chain polyunsaturated fatty acids, placebo, or no omega-3.

Disclosures: The National Institutes of Health funded the review. The authors reported no conflicts of interest.

Source: Middleton P et al. Cochrane Database Syst Rev. 2018. doi: 10.1002/14651858.CD003402.pub3.


Sofa and bed injuries very common among young children


Injuries related to beds and sofas in children younger than 5 years occur more than twice as frequently as injuries related to stairs, according to new research.

“Findings from our analysis reveal that it is an important source of injury to young children and a leading cause of trauma to infants,” concluded David S. Liu, of Baylor College of Medicine, Houston, who presented the findings at the annual meeting of the American Academy of Pediatrics.

“The rate of bed- and sofa-related injuries is increasing, which underscores the need for increased prevention efforts, including parental education and improved safety design, to decrease soft furniture injuries among young children,” Mr. Liu and his colleagues wrote.

The researchers used the National Electronic Injury Surveillance System of the U.S. Consumer Product Safety Commission to conduct a retrospective analysis of injuries related to sofas and beds from 2007 to 2016.

They found that an estimated 2.3 million children aged under 5 years were treated for injuries related to soft furniture during those years, an average of 230,026 injuries a year, or 115 injuries per 10,000 children. To the surprise of the researchers, injuries related to beds and sofas were the most common types of accidental injury in that age group, occurring 2.5 times more often than stair-related injuries, which occurred at a rate of 47 per 10,000 population.
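The headline figures are internally consistent: the total divided over the 2007-2016 window matches the reported annual average, and the per-10,000 rate implies an under-5 population of roughly 20 million (the population figure is inferred here, not reported in the study). A quick back-of-the-envelope check:

```python
annual_injuries = 230_026   # reported annual average
rate_per_10k = 115          # reported injuries per 10,000 children

# Total over 2007-2016 (10 years) matches the reported "estimated 2.3 million"
total = annual_injuries * 10
print(f"{total:,}")  # 2,300,260

# Implied population of children under 5 (not stated in the article)
implied_population = annual_injuries / rate_per_10k * 10_000
print(round(implied_population / 1e6, 1))  # 20.0 (million)

# Ratio of the bed/sofa rate to the stair-related rate (47 per 10,000)
print(round(rate_per_10k / 47, 1))  # 2.4, in line with the reported 2.5-fold figure
```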

Boys were slightly more likely to be injured, making up 56% of all the cases. Soft tissue/internal organ injuries were most common, comprising 28% of all injuries, followed by lacerations in 24% of cases, abrasions in 15%, and fractures in 14%.

More than half the children (61%) sustained injuries to the head or face, and 3% were hospitalized for their injuries. Although infants (younger than 1 year) accounted for only 28% of injured children, they were twice as likely to be hospitalized as older children.

The researchers also identified increases in injuries over the time period studied. Bed-related injuries increased 17% from 2007 to 2016, and sofa/couch-related injuries increased 17% during that period.

Although the vast majority of children were treated and released, approximately 4% of children were admitted or treated and transferred to another facility. Overall, an estimated 3,361 children died during the 9-year period, translating to a little over 370 children a year.

In a video interview, Mr. Liu discussed the implications of these findings.

“We know how dangerous car accidents and staircases are, and we often recommend car seats and stair gates for those,” Mr. Liu said. “Obviously we can’t put a gate or a barrier on every single sofa, couch, and bed in America, so as clinicians and parents, the best we can do is keep aware of how dangerous these items are. Just because of their soft nature doesn’t mean they’re inherently safer.”

The researchers reported no disclosures and the research received no external funding.



REPORTING FROM AAP 2018

Vitals

 

Key clinical point: Injuries from beds and sofas/couches are common in children younger than 5 years, occurring 2.5 times more frequently than stair-related injuries.

Major finding: An estimated 115 bed/sofa-related injuries per 10,000 children occur every year.

Study details: The findings are based on a retrospective analysis of injuries related to sofas and beds from 2007 to 2016.

Disclosures: The researchers reported no disclosures and the research received no external funding.


Allergy Testing in Dermatology and Beyond


Allergy testing typically refers to evaluation of a patient for suspected type I or type IV hypersensitivity.1,2 The possibility of type I hypersensitivity is raised in patients presenting with food allergies, allergic rhinitis, asthma, and immediate adverse reactions to medications, whereas type IV hypersensitivity is suspected in patients with eczematous eruptions, delayed adverse cutaneous reactions to medications, and failure of metallic implants (eg, metal joint replacements, cardiac stents) in conjunction with overlying skin rashes (Table 1).1-5 Type II (eg, pemphigus vulgaris) and type III (eg, IgA vasculitis) hypersensitivities are not evaluated with screening allergy tests.

Type I Sensitization

Type I hypersensitivity is an immediate hypersensitivity mediated predominantly by IgE activation of mast cells in the skin as well as the respiratory and gastric mucosa.1 Sensitization of an individual patient occurs when antigen-presenting cells induce a helper T cell (TH2) cytokine response leading to B-cell class switching and allergen-specific IgE production. Upon repeat exposure to the allergen, circulating antibodies then bind to high-affinity receptors on mast cells and basophils and initiate an allergic inflammatory response, leading to a clinical presentation of allergic rhinitis, urticaria, or immediate drug reactions. Confirming type I sensitization may be performed via serologic (in vitro) or skin testing (in vivo).5,6

Serologic Testing (In Vitro)
Serologic testing is a blood test that detects circulating IgE levels against specific allergens.5 The first such test, the radioallergosorbent test, was introduced in the 1970s but is not quantitative and is no longer used. Although common, it is inaccurate to describe current serum IgE (s-IgE) testing as radioallergosorbent testing. There are several US Food and Drug Administration-approved s-IgE assays in common use, and these tests may be helpful in elucidating relevant allergens and for tailoring therapy appropriately, which may consist of avoidance of certain foods or environmental agents and/or allergen immunotherapy.

Skin Testing (In Vivo)
Skin testing can be performed percutaneously or intradermally.6 Percutaneous skin testing is performed by placing a drop of allergen extract on the skin, after which a lancet is used to lightly scratch the skin; intradermal testing is performed by injecting a small amount of allergen extract into the dermis. In both cases, the skin is evaluated after 15 to 20 minutes for the presence and size of a cutaneous wheal. Medications with antihistaminergic activity must be discontinued prior to testing. Both s-IgE and skin testing assess for type I hypersensitivity, and factors such as extensive rash, concern for anaphylaxis, or inability to discontinue antihistamines may favor s-IgE testing over skin testing. False-positive results can occur with both tests, and for this reason, test results should always be interpreted in conjunction with clinical examination and patient history to determine relevant allergies.

Type IV Sensitization

Type IV hypersensitivity is a delayed hypersensitivity mediated primarily by lymphocytes.2 Sensitization occurs when haptens bind to host proteins and are presented by epidermal and dermal dendritic cells to T lymphocytes in the skin. These lymphocytes then migrate to regional lymph nodes where antigen-specific T lymphocytes are produced and home back to the skin. Upon reexposure to the allergen, these memory T lymphocytes become activated and incite a delayed allergic response. Confirming type IV hypersensitivity primarily is accomplished via patch testing, though other testing modalities exist.

Skin Biopsy
Biopsy is sometimes performed in the workup of an individual presenting with allergic contact dermatitis (ACD) and typically will show spongiosis with normal stratum corneum and epidermal thickness in the setting of acute ACD and mild to marked acanthosis and parakeratosis in chronic ACD.7 The findings, however, are nonspecific and the differential of these histopathologic findings encompasses nummular dermatitis, atopic dermatitis, irritant contact dermatitis, and dyshidrotic eczema, among others. The presence of eosinophils and Langerhans cell microabscesses may provide supportive evidence for ACD over the other spongiotic dermatitides.7,8

Patch Testing
Patch testing is the gold standard in diagnosing type IV hypersensitivities resulting in a clinical presentation of ACD. Hundreds of allergens are commercially available for patch testing, and more commonly tested allergens fall into one of several categories, such as cosmetic preservatives, rubbers, metals, textiles, fragrances, adhesives, antibiotics, plants, and even corticosteroids. Of note, a common misconception is that ACD must result from new exposures; however, patients may develop ACD secondary to an exposure or product they have been using for many years without a problem.

Three commonly used screening series are the thin-layer rapid use epicutaneous (T.R.U.E.) test (SmartPractice), North American Contact Dermatitis Group screening series, and American Contact Dermatitis Society Core 80 allergen series, which have some variation in the type and number of allergens included (Table 2). The T.R.U.E. test will miss a notable number of clinically relevant allergens in comparison to the North American Contact Dermatitis Group and American Contact Dermatitis Society Core series, and it may be of particularly low utility in identifying fragrance or preservative ACD.9

Allergens are placed on the back in chambers in a petrolatum or aqueous medium. The patches remain affixed for 48 hours, during which time the patient is asked to refrain from showering or exercising to prevent loss of patches. The skin is then evaluated for reactions on 2 separate occasions: first at patch removal 48 hours after initial placement, when the patch sites are marked, and again at a delayed reading 4 to 7 days after initial placement. Results are scored based on the degree of the inflammatory reaction (Table 3). Delayed readings beyond day 7 may be necessary for metals, specific preservatives (eg, dodecyl gallate, propolis), and neomycin.10

There is a wide spectrum of cutaneous disease that should prompt consideration of patch testing, including well-circumscribed eczematous dermatitis (eg, recurrent lip, hand, and foot dermatitis); patchy or diffuse eczema, especially if recently worsened and/or unresponsive to topical steroids; lichenoid eruptions, particularly of mucosal surfaces; mucous membrane eruptions (eg, stomatitis, vulvitis); and eczematous presentations that raise concern for airborne (photodistributed) or systemic contact dermatitis.11-13 Although further studies of efficacy and safety are ongoing, patch testing also may be useful in the diagnosis of nonimmediate cutaneous adverse drug reactions, especially fixed drug eruptions, acute generalized exanthematous pustulosis, systemic contact dermatitis from medications, and drug-induced hypersensitivity syndrome.3 Lastly, patients with type IV hypersensitivity to metals, adhesives, or antibiotics used in metallic orthopedic or cardiac implants may experience implant failure, regional contact dermatitis, or both, and benefit from patch testing prior to implant replacement to assess for potential allergens. Of the joints that fail, it is estimated that up to 5% are due to metal hypersensitivity.4

Throughout patch testing, patients may continue to manage their skin condition with oral antihistamines and topical steroids, though application to the site at which the patches are applied should be avoided throughout patch testing and during the week prior. According to expert consensus, immunosuppressive medications that are less likely to impact patch testing and therefore may be continued include low-dose methotrexate, oral prednisone less than 10 mg daily, biologic therapy, and low-dose cyclosporine (<2 mg/kg daily). Therapeutic interventions that are more likely to impact patch testing and should be avoided include phototherapy or extensive sun exposure within a week prior to testing, oral prednisone more than 10 mg daily, intramuscular triamcinolone within the preceding month, and high-dose cyclosporine (>2 mg/kg daily).14

An important component of successful patch testing is posttest patient counseling. Providers can create a safe list of products for patients by logging onto the American Contact Dermatitis Society website and accessing the Contact Allergen Management Program (CAMP).15 All relevant allergens found on patch testing may be selected and patient-specific identification codes generated. Once these codes are entered into the CAMP app on the patient's cellular device, a personalized, regularly updated list of safe products appears for many categories of products, including shampoos, sunscreens, moisturizers, cosmetic products, and laundry or dish detergents, among others. Of note, this app is not helpful for avoidance in patients with textile allergies. Patients should be counseled that improvement with avoidance usually occurs within weeks but may take longer in some cases.

Lymphocyte Transformation Test (In Vitro)
The lymphocyte transformation test is an experimental in vitro test for type IV hypersensitivity. This serologic test utilizes allergens to stimulate memory T lymphocytes in vitro and measures the degree of response to the allergen. Although this test has generated excitement, particularly for the potential to safely evaluate for severe adverse cutaneous drug reactions, it currently is not the standard of care and is not utilized in the United States.16

Conclusion

Dermatologists play a vital role in the workup of suspected type IV hypersensitivities. Patch testing is an important but underutilized tool in the arsenal of allergy testing and may be indicated in a wide variety of cutaneous presentations, adverse reactions to medications, and implanted device failures. Identification and avoidance of a culprit allergen has the potential to lead to complete resolution of disease and notable improvement in quality of life for patients.

Acknowledgments
The author thanks Nina Botto, MD (San Francisco, California), for her mentorship in the arena of ACD as well as the Women's Dermatologic Society for the support they provided through the mentorship program.

References
  1. Oettgen H, Broide DH. Introduction to the mechanisms of allergic disease. In: Holgate ST, Church MK, Broide DH, et al, eds. Allergy. 4th ed. Philadelphia, PA: Elsevier Saunders; 2012:1-32.
  2. Werfel T, Kapp A. Atopic dermatitis and allergic contact dermatitis. In: Holgate ST, Church MK, Broide DH, et al, eds. Allergy. 4th ed. Philadelphia, PA: Elsevier Saunders; 2012:263-286.
  3. Zinn A, Gayam S, Chelliah MP, et al. Patch testing for nonimmediate cutaneous adverse drug reactions. J Am Acad Dermatol. 2018;78:421-423.
  4. Thyssen JP, Menne T, Schalock PC, et al. Pragmatic approach to the clinical work-up of patients with putative allergic disease to metallic orthopaedic implants before and after surgery. Br J Dermatol. 2011;164:473-478.
  5. Cox L. Overview of serological-specific IgE antibody testing in children. Curr Allergy Asthma Rep. 2011;11:447-453.
  6. Dolen WK. Skin testing and immunoassays for allergen-specific IgE. Clin Rev Allergy Immunol. 2001;21:229-239.
  7. Keeling BH, Gavino AC, Gavino AC. Skin biopsy, the allergists' tool: how to interpret a report. Curr Allergy Asthma Rep. 2015;15:62.
  8. Rosa G, Fernandez AP, Vij A, et al. Langerhans cell collections, but not eosinophils, are clues to a diagnosis of allergic contact dermatitis in appropriate skin biopsies. J Cutan Pathol. 2016;43:498-504.
  9. DeKoven JG, Warshaw EM, Belsito DV. North American Contact Dermatitis Group patch test results 2013-2014. Dermatitis. 2017;28:33-46.
  10. Davis MD, Bhate K, Rohlinger AL, et al. Delayed patch test reading after 5 days: the Mayo Clinic experience. J Am Acad Dermatol. 2008;59:225-233.
  11. Rajagopalan R, Anderson RT. The profile of a patient with contact dermatitis and a suspicion of contact allergy (history, physical characteristics, and dermatology-specific quality of life). Am J Contact Dermat. 1997;8:26-31.
  12. Huygens S, Goossens A. An update on airborne contact dermatitis. Contact Dermatitis. 2001;44:1-6.
  13. Salam TN, Fowler JF. Balsam-related systemic contact dermatitis. J Am Acad Dermatol. 2001;45:377-381.
  14. Fowler JF, Maibach HI, Zirwas M, et al. Effects of immunomodulatory agents on patch testing: expert opinion 2012. Dermatitis. 2012;23:301-303.
  15. ACDS CAMP. American Contact Dermatitis Society website. https://www.contactderm.org/i4a/pages/index.cfm?pageid=3489. Accessed November 14, 2018.
  16. Popple A, Williams J, Maxwell G, et al. The lymphocyte transformation test in allergic contact dermatitis: new opportunities. J Immunotoxicol. 2016;13:84-91.
Author and Disclosure Information

From the Division of Dermatology, University of Texas Dell Medical School, Austin.

The author reports no conflict of interest.

Correspondence: Ashley D. Lundgren, MD, 313 E 12th St, Ste 103, Austin, TX 78701 (ashley.diana@gmail.com).

Issue
Cutis - 102(5)
Page Number
E16-E19

An important component to successful patch testing is posttest patient counseling. Providers can create a safe list of products for patients by logging onto the American Contact Dermatitis Society website and accessing the Contact Allergen Management Program (CAMP).15 All relevant allergens found on patch testing may be selected and patient-specific identification codes generated. Once these codes are entered into the CAMP app on the patient's cellular device, a personalized, regularly updated list of safe products appears for many categories of products, including shampoos, sunscreens, moisturizers, cosmetic products, and laundry or dish detergents, among others. Of note, this app is not helpful for avoidance in patients with textile allergies. Patients should be counseled that improvement occurs with avoidance, which usually occurs within weeks but may slowly occur over time in some cases.

Lymphocyte Transformation Test (In Vitro)
The lymphocyte transformation test is an experimental in vitro test for type IV hypersensitivity. This serologic test utilizes allergens to stimulate memory T lymphocytes in vitro and measures the degree of response to the allergen. Although this test has generated excitement, particularly for the potential to safely evaluate for severe adverse cutaneous drug reactions, it currently is not the standard of care and is not utilized in the United States.16

Conclusion

Dermatologists play a vital role in the workup of suspected type IV hypersensitivities. Patch testing is an important but underutilized tool in the arsenal of allergy testing and may be indicated in a wide variety of cutaneous presentations, adverse reactions to medications, and implanted device failures. Identification and avoidance of a culprit allergen has the potential to lead to complete resolution of disease and notable improvement in quality of life for patients.

Acknowledgments
The author thanks Nina Botto, MD (San Francisco, California), for her mentorship in the arena of ACD as well as the Women's Dermatologic Society for the support they provided through the mentorship program.

Allergy testing typically refers to evaluation of a patient for suspected type I or type IV hypersensitivity.1,2 The possibility of type I hypersensitivity is raised in patients presenting with food allergies, allergic rhinitis, asthma, and immediate adverse reactions to medications, whereas type IV hypersensitivity is suspected in patients with eczematous eruptions, delayed adverse cutaneous reactions to medications, and failure of metallic implants (eg, metal joint replacements, cardiac stents) in conjunction with overlying skin rashes (Table 1).1-5 Type II (eg, pemphigus vulgaris) and type III (eg, IgA vasculitis) hypersensitivities are not evaluated with screening allergy tests.

Type I Sensitization

Type I hypersensitivity is an immediate hypersensitivity mediated predominantly by IgE activation of mast cells in the skin as well as the respiratory and gastric mucosa.1 Sensitization of an individual patient occurs when antigen-presenting cells induce a helper T cell (TH2) cytokine response leading to B-cell class switching and allergen-specific IgE production. Upon repeat exposure to the allergen, circulating antibodies then bind to high-affinity receptors on mast cells and basophils and initiate an allergic inflammatory response, leading to a clinical presentation of allergic rhinitis, urticaria, or immediate drug reactions. Type I sensitization may be confirmed via serologic (in vitro) or skin (in vivo) testing.5,6

Serologic Testing (In Vitro)
Serologic testing is a blood test that detects circulating IgE levels against specific allergens.5 The first such test, the radioallergosorbent test, was introduced in the 1970s but is not quantitative and is no longer used; although the term persists, it is inaccurate to describe current serum IgE (s-IgE) testing as radioallergosorbent testing. Several US Food and Drug Administration-approved s-IgE assays are in common use, and these tests may be helpful in elucidating relevant allergens and tailoring therapy appropriately, which may consist of avoidance of certain foods or environmental agents and/or allergen immunotherapy.

Skin Testing (In Vivo)
Skin testing can be performed percutaneously or intradermally.6 Percutaneous skin testing is performed by placing a drop of allergen extract on the skin, after which a lancet is used to lightly scratch the skin; intradermal testing is performed by injecting a small amount of allergen extract into the dermis. In both cases, the skin is evaluated after 15 to 20 minutes for the presence and size of a cutaneous wheal. Medications with antihistaminergic activity must be discontinued prior to testing. Both s-IgE and skin testing assess for type I hypersensitivity, and factors such as extensive rash, concern for anaphylaxis, or inability to discontinue antihistamines may favor s-IgE testing over skin testing. False-positive results can occur with both tests; for this reason, results should always be interpreted in conjunction with the clinical examination and patient history to determine relevant allergies.
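As a rough illustration of how a prick test reading might be encoded, the sketch below applies a commonly used convention that is not stated in the article (a wheal at least 3 mm larger than the negative control, read at 15 to 20 minutes, is considered positive); the function name and threshold are hypothetical and for illustration only, not a clinical decision rule:

```python
def prick_test_positive(wheal_mm: float, neg_control_mm: float = 0.0,
                        threshold_mm: float = 3.0) -> bool:
    """Positive if the allergen wheal exceeds the negative-control wheal
    by at least the threshold (assumed 3 mm; not from the article)."""
    return (wheal_mm - neg_control_mm) >= threshold_mm

print(prick_test_positive(5.0))       # True
print(prick_test_positive(4.0, 2.5))  # False (only 1.5 mm larger than control)
```

As with the tests themselves, any such rule would need to be interpreted alongside the clinical examination and history.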

Type IV Sensitization

Type IV hypersensitivity is a delayed hypersensitivity mediated primarily by lymphocytes.2 Sensitization occurs when haptens bind to host proteins and are presented by epidermal and dermal dendritic cells to T lymphocytes in the skin. These lymphocytes then migrate to regional lymph nodes where antigen-specific T lymphocytes are produced and home back to the skin. Upon reexposure to the allergen, these memory T lymphocytes become activated and incite a delayed allergic response. Confirming type IV hypersensitivity is accomplished primarily via patch testing, though other testing modalities exist.

Skin Biopsy
Biopsy is sometimes performed in the workup of an individual presenting with allergic contact dermatitis (ACD) and typically will show spongiosis with normal stratum corneum and epidermal thickness in acute ACD, and mild to marked acanthosis and parakeratosis in chronic ACD.7 The findings, however, are nonspecific, and the differential of these histopathologic findings encompasses nummular dermatitis, atopic dermatitis, irritant contact dermatitis, and dyshidrotic eczema, among others. The presence of eosinophils and Langerhans cell microabscesses may provide supportive evidence for ACD over the other spongiotic dermatitides.7,8

Patch Testing
Patch testing is the gold standard in diagnosing type IV hypersensitivities resulting in a clinical presentation of ACD. Hundreds of allergens are commercially available for patch testing, and more commonly tested allergens fall into one of several categories, such as cosmetic preservatives, rubbers, metals, textiles, fragrances, adhesives, antibiotics, plants, and even corticosteroids. Of note, a common misconception is that ACD must result from new exposures; however, patients may develop ACD secondary to an exposure or product they have been using for many years without a problem.

Three commonly used screening series are the thin-layer rapid use epicutaneous (T.R.U.E.) test (SmartPractice), North American Contact Dermatitis Group screening series, and American Contact Dermatitis Society Core 80 allergen series, which have some variation in the type and number of allergens included (Table 2). The T.R.U.E. test will miss a notable number of clinically relevant allergens in comparison to the North American Contact Dermatitis Group and American Contact Dermatitis Society Core series, and it may be of particularly low utility in identifying fragrance or preservative ACD.9

Allergens are placed on the back in chambers in a petrolatum or aqueous medium. The patches remain affixed for 48 hours, during which time the patient is asked to refrain from showering or exercising to prevent loss of patches. The skin is then evaluated for reactions on 2 separate occasions: first at patch removal 48 hours after initial placement, when the patch sites are marked, and again at a delayed reading on day 4 to day 7 after initial placement. Results are scored based on the degree of the inflammatory reaction (Table 3). Delayed readings beyond day 7 may be necessary for metals, specific preservatives (eg, dodecyl gallate, propolis), and neomycin.10
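The visit timing described above can be sketched as a simple scheduling helper (the function name and dictionary layout are hypothetical; Python is used purely for illustration):

```python
from datetime import date, timedelta

def patch_test_schedule(placement: date) -> dict:
    """Visit dates per the timing described above: patch removal and
    first reading at 48 hours, and a delayed reading between day 4
    and day 7 after initial placement."""
    return {
        "placement": placement,
        "removal_and_first_reading": placement + timedelta(days=2),
        "delayed_reading_window": (placement + timedelta(days=4),
                                   placement + timedelta(days=7)),
    }

visits = patch_test_schedule(date(2018, 11, 5))
print(visits["removal_and_first_reading"])  # 2018-11-07
```

A real scheduling tool would also need to accommodate the later readings (beyond day 7) sometimes required for metals, certain preservatives, and neomycin.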

There is a wide spectrum of cutaneous disease that should prompt consideration of patch testing, including well-circumscribed eczematous dermatitis (eg, recurrent lip, hand, and foot dermatitis); patchy or diffuse eczema, especially if recently worsened and/or unresponsive to topical steroids; lichenoid eruptions, particularly of mucosal surfaces; mucous membrane eruptions (eg, stomatitis, vulvitis); and eczematous presentations that raise concern for airborne (photodistributed) or systemic contact dermatitis.11-13 Although further studies of efficacy and safety are ongoing, patch testing also may be useful in the diagnosis of nonimmediate cutaneous adverse drug reactions, especially fixed drug eruptions, acute generalized exanthematous pustulosis, systemic contact dermatitis from medications, and drug-induced hypersensitivity syndrome.3 Lastly, patients with type IV hypersensitivity to metals, adhesives, or antibiotics used in metallic orthopedic or cardiac implants may experience implant failure, regional contact dermatitis, or both, and benefit from patch testing prior to implant replacement to assess for potential allergens. Of the joints that fail, it is estimated that up to 5% are due to metal hypersensitivity.4

Throughout patch testing, patients may continue to manage their skin condition with oral antihistamines and topical steroids, though topical steroids should not be applied to the patch test site during testing or in the week prior. According to expert consensus, immunosuppressive medications that are less likely to impact patch testing and therefore may be continued include low-dose methotrexate, oral prednisone less than 10 mg daily, biologic therapy, and low-dose cyclosporine (<2 mg/kg daily). Therapeutic interventions that are more likely to impact patch testing and should be avoided include phototherapy or extensive sun exposure within a week prior to testing, oral prednisone more than 10 mg daily, intramuscular triamcinolone within the preceding month, and high-dose cyclosporine (>2 mg/kg daily).14
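The expert-consensus thresholds above amount to a simple checklist, sketched below as a hypothetical screening function (not clinical software; the handling of doses exactly at a threshold is an assumption, since the consensus statement phrases the categories as "less than" and "more than"):

```python
def likely_to_impact_patch_testing(
    prednisone_mg_daily: float = 0.0,
    cyclosporine_mg_per_kg_daily: float = 0.0,
    phototherapy_or_sun_within_week: bool = False,
    im_triamcinolone_within_month: bool = False,
) -> bool:
    """True if any therapy falls in the 'avoid before testing' category
    per the thresholds quoted above. A dose of exactly 10 mg prednisone
    or 2 mg/kg cyclosporine is treated here as allowable (assumption)."""
    return (
        prednisone_mg_daily > 10
        or cyclosporine_mg_per_kg_daily > 2
        or phototherapy_or_sun_within_week
        or im_triamcinolone_within_month
    )

print(likely_to_impact_patch_testing(prednisone_mg_daily=20))  # True
```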

An important component of successful patch testing is posttest patient counseling. Providers can create a safe list of products for patients by logging onto the American Contact Dermatitis Society website and accessing the Contact Allergen Management Program (CAMP).15 All relevant allergens found on patch testing may be selected and patient-specific identification codes generated. Once these codes are entered into the CAMP app on the patient's cellular device, a personalized, regularly updated list of safe products appears for many categories, including shampoos, sunscreens, moisturizers, cosmetic products, and laundry or dish detergents, among others. Of note, this app is not helpful for avoidance in patients with textile allergies. Patients should be counseled that improvement follows allergen avoidance, usually within weeks, though in some cases it occurs more slowly.

Lymphocyte Transformation Test (In Vitro)
The lymphocyte transformation test is an experimental in vitro test for type IV hypersensitivity. This blood-based assay exposes the patient's memory T lymphocytes to allergens in vitro and measures the degree of response to each allergen. Although this test has generated excitement, particularly for the potential to safely evaluate for severe adverse cutaneous drug reactions, it currently is not the standard of care and is not utilized in the United States.16

Conclusion

Dermatologists play a vital role in the workup of suspected type IV hypersensitivities. Patch testing is an important but underutilized tool in the arsenal of allergy testing and may be indicated in a wide variety of cutaneous presentations, adverse reactions to medications, and implanted device failures. Identification and avoidance of a culprit allergen has the potential to lead to complete resolution of disease and notable improvement in quality of life for patients.

Acknowledgments
The author thanks Nina Botto, MD (San Francisco, California), for her mentorship in the arena of ACD as well as the Women's Dermatologic Society for the support they provided through the mentorship program.

References
  1. Oettgen H, Broide DH. Introduction to the mechanisms of allergic disease. In: Holgate ST, Church MK, Broide DH, et al, eds. Allergy. 4th ed. Philadelphia, PA: Elsevier Saunders; 2012:1-32.
  2. Werfel T, Kapp A. Atopic dermatitis and allergic contact dermatitis. In: Holgate ST, Church MK, Broide DH, et al, eds. Allergy. 4th ed. Philadelphia, PA: Elsevier Saunders; 2012:263-286.
  3. Zinn A, Gayam S, Chelliah MP, et al. Patch testing for nonimmediate cutaneous adverse drug reactions. J Am Acad Dermatol. 2018;78:421-423.
  4. Thyssen JP, Menne T, Schalock PC, et al. Pragmatic approach to the clinical work-up of patients with putative allergic disease to metallic orthopaedic implants before and after surgery. Br J Dermatol. 2011;164:473-478.
  5. Cox L. Overview of serological-specific IgE antibody testing in children. Curr Allergy Asthma Rep. 2011;11:447-453.
  6. Dolen WK. Skin testing and immunoassays for allergen-specific IgE. Clin Rev Allergy Immunol. 2001;21:229-239.
  7. Keeling BH, Gavino AC, Gavino AC. Skin biopsy, the allergists' tool: how to interpret a report. Curr Allergy Asthma Rep. 2015;15:62.
  8. Rosa G, Fernandez AP, Vij A, et al. Langerhans cell collections, but not eosinophils, are clues to a diagnosis of allergic contact dermatitis in appropriate skin biopsies. J Cutan Pathol. 2016;43:498-504.
  9. DeKoven JG, Warshaw EM, Belsito DV. North American Contact Dermatitis Group patch test results 2013-2014. Dermatitis. 2017;28:33-46.
  10. Davis MD, Bhate K, Rohlinger AL, et al. Delayed patch test reading after 5 days: the Mayo Clinic experience. J Am Acad Dermatol. 2008;59:225-233.
  11. Rajagopalan R, Anderson RT. The profile of a patient with contact dermatitis and a suspicion of contact allergy (history, physical characteristics, and dermatology-specific quality of life). Am J Contact Dermat. 1997;8:26-31.
  12. Huygens S, Goossens A. An update on airborne contact dermatitis. Contact Dermatitis. 2001;44:1-6.
  13. Salam TN, Fowler JF. Balsam-related systemic contact dermatitis. J Am Acad Dermatol. 2001;45:377-381.
  14. Fowler JF, Maibach HI, Zirwas M, et al. Effects of immunomodulatory agents on patch testing: expert opinion 2012. Dermatitis. 2012;23:301-303.
  15. ACDS CAMP. American Contact Dermatitis Society website. https://www.contactderm.org/i4a/pages/index.cfm?pageid=3489. Accessed November 14, 2018.
  16. Popple A, Williams J, Maxwell G, et al. The lymphocyte transformation test in allergic contact dermatitis: new opportunities. J Immunotoxicol. 2016;13:84-91.
Issue
Cutis - 102(5)
Page Number
E16-E19
Display Headline
Allergy Testing in Dermatology and Beyond

Help parents manage screen time thoughtfully

Article Type
Changed
Fri, 01/18/2019 - 18:08

 

It has been 2 years since we last wrote about the potential risks to children and adolescents of spending too much time on screens. While there have been studies in the interval that offer us more information about the effects of heavy screen use and the developing brain, there is little certainty about what is optimal for children and adolescents, and less still on how parents might effectively equip their children to make good use of screens without suffering ill effects.

[Image: A teen looks at her smartphone while leaning against a school locker. Credit: monkeybusinessimages/iStock/Getty Images Plus]

You might recall that back in October of 2016, the American Academy of Pediatrics published screen time guidelines: recommending no screen time for infants and children up to 18 months old, limiting all screen time to 1 hour per day for children up to 5 years old, and 2 hours daily for older children (up to 11 years old), so that it would not interfere with homework, social time, exercise, and sleep. At the time, data suggested that children from 2 to 11 years old were spending an average of 4.5 hours per day on screens (TV, computer, tablets, or smartphones, not counting homework).
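The 2016 AAP age bands can be summarized as a small lookup (a hypothetical sketch for illustration only; how to treat ages exactly at 18 months or 5 years is an assumption, and the guidance itself includes nuances, such as video-chat exceptions, not captured here):

```python
def aap_daily_screen_limit_hours(age_years: float) -> float:
    """Daily screen-time ceiling per the 2016 AAP guidance summarized
    above; boundary handling is an assumption of this sketch."""
    if age_years < 1.5:   # under 18 months: no screen time
        return 0.0
    if age_years <= 5:    # up to 5 years old: 1 hour per day
        return 1.0
    return 2.0            # older children (up to 11): 2 hours daily

print(aap_daily_screen_limit_hours(9))  # 2.0
```

Against that 2-hour ceiling, the reported average of 4.5 hours daily for 2- to 11-year-olds is more than double the recommendation.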

The Adolescent Brain Cognitive Development Study began in September 2016 to evaluate the effects of Canadian recommendations for 8- to 11-year-olds (9-11 hours of sleep nightly, 1 hour of exercise daily, and 2 hours or less of screen time daily; the study subjects are in the United States). This fall the investigators published their initial results, demonstrating that only 51% of children got the recommended amount of sleep, only 37% kept their daily screen time under 2 hours, and only 18% got the recommended amount of exercise. Only 5% of children consistently met all three recommendations, while 29% met none of them.

The researchers assessed the children’s cognitive development and found that after 1 year, children who met the screen time recommendation alone, the screen time and sleep recommendations together, or all three recommendations demonstrated “superior global cognition.” Children were spending an average of 3.7 hours daily on screens, and those spending 2 hours or less performed 4% better on tests of cognitive function than did children spending the average amount of time. Sleep and exercise differences alone did not contribute to significant differences in cognitive function. This study will continue for another 10 years.1

In a much smaller study out of Cincinnati Children’s Hospital, researchers asked parents to describe the amount of time a child spent on reading and in screen-based media activities, then completed MRI scans of the children’s brains.2 They found a strong association between reading time and higher functional connectivity between the parts of the brain responsible for visual word formation and those responsible for language and cognitive control, with a negative correlation between functional connectivity and time spent in screen-based media activities.

While these studies are important pieces of data as we build a deeper understanding about the effects of screen-based media use on children’s cognitive and behavioral development, they do not offer certainty about causality. These studies do not yet clarify whether certain children are especially vulnerable to the untoward effects of heavy screen-based media use. Perhaps the research will someday offer guidelines with certainty, but families need guidance now. Without doubt, digital devices are here to stay, are important to homework, and can facilitate independence, long-distance connections, important technical work-skills, and even senseless fun and relaxation. So we will focus on offering some principles to help you guide young people (or their parents) in approaching screen time thoughtfully.

While recommending no more than 2 hours of daily screen time seems reasonable, it may be more useful to focus on what young people are doing with the rest of their time. Are they getting adequate, restful sleep? Are they able to exercise most days? Do they have enough time for homework? Do they have time for friends (time actually together, not just texting)? What about time for hobbies? When parents focus on the precious resource of time and all of the activities their children both need and want to do, it sets the frame for them to say that their children are allowed to have time to relax with screen-based media as long as it does not take away from these other priorities. Ensuring that the child has at least 8 hours of sleep, after homework and sports, also will set natural limits on screen time.

Parents also can use the frame of development to guide their rules about screen time. If use of an electronic device serves a developmental task, then it is reasonable. If it interferes with a developmental task, then it should be limited. Adolescents (ages 12-20) should be exploring their own identities, establishing independence, deepening social relationships, and learning to manage their impulses. Some interests can be most easily explored with the aid of a computer (such as with programming, art history, or astronomy). Use of cellphones can facilitate teenagers’ being more independent with plans or transportation. Social connections can be supported by texting or FaceTime. Some close friends may be in a different sport or live far away, and it is possible to stay connected only virtually. However, when use of electronic devices keeps the child from engaging with new friends and new interests or from getting into the world to establish real independence (i.e., a job), then there should be limits. In all of these cases, it is critical that adults explain to teenagers what is guiding their thinking about limits on screen time. Open discussions about the great utility and fun that screens can provide, as well as the challenge of keeping those activities in balance with other important activities, helps adolescents set the frame for that rapidly approaching time when they will be making those choices without adult supervision.

Younger children (ages 8-11) should be sampling a wide array of activities and interests and experiencing challenges and eventual mastery across domains. Video games can be very compelling for this age group because they appeal to exactly this drive to master a challenge. Parents want to ensure that their children can have senseless fun, and still have enough time to explore actual activities: social, athletic, creative, and academic. They can be ready to explain the why of rules, but consistent rules, enforced for everyone at home, are most helpful for this age group.

You also can help parents to consider the child’s temperament when thinking about which rules will be appropriate. Anxious children and teenagers may be especially prone to immersive virtual activities that allow them to avoid the stress of real undertakings or interactions. But anxious children may be able to prepare for something anxiety provoking by exploring it virtually first. Youth with ADHD are going to struggle with shifting away from video games or other electronic activities they enjoy that don’t have a natural ending, and will need strict rules and patient support around balanced screen time use. Screen time may play to a child’s strengths, enabling creative children to take in a wide range of art or music and even create their own when other resources are limited.

Finally, all parents should consider what their own screen use is teaching their children. Adolescents are unlikely to listen to their parents’ recommendations if the parents spend hours online after work. Younger children need their parents’ engaged attention: being coaches and cheerleaders for all of their efforts at mastery. You can help parents to imagine rules that the whole family can follow. They can consider how screen time helps them connect with their children, such as watching a favorite program or sport together. They can explore shared interests online together. They can even relax with ridiculous cat videos together! Screen time together is valuable if it supports parents’ connections with their children, while their rules ensure adequate time for sleep, physical activity, and developmental priorities.

Dr. Swick is physician in chief at Ohana, Center for Child and Adolescent Behavioral Health, Community Hospital of the Monterey (Calif.) Peninsula. Dr. Jellinek is professor emeritus of psychiatry and pediatrics, Harvard Medical School, Boston. Email them at pdnews@mdedge.com.
 

References

1. Lancet Child Adolesc Health. 2018 Nov 1;2(11):783-91.

2. Acta Paediatr. 2018 Apr;107(4):685-93.


Parents also can use the frame of development to guide their rules about screen time. If use of an electronic device serves a developmental task, then it is reasonable. If it interferes with a developmental task, then it should be limited. Adolescents (ages 12-20) should be exploring their own identities, establishing independence, deepening social relationships, and learning to manage their impulses. Some interests can be most easily explored with the aid of a computer (such as with programming, art history, or astronomy). Use of cellphones can facilitate teenagers’ being more independent with plans or transportation. Social connections can be supported by texting or FaceTime. Some close friends may be in a different sport or live far away, and it is possible to stay connected only virtually. However, when use of electronic devices keeps the child from engaging with new friends and new interests or from getting into the world to establish real independence (i.e., a job), then there should be limits. In all of these cases, it is critical that adults explain to teenagers what is guiding their thinking about limits on screen time. Open discussions about the great utility and fun that screens can provide, as well as the challenge of keeping those activities in balance with other important activities, helps adolescents set the frame for that rapidly approaching time when they will be making those choices without adult supervision.

Younger children (ages 8-11) should be sampling a wide array of activities and interests and experiencing challenges and eventual mastery across domains. Video games can be very compelling for this age group because they appeal to exactly this drive to master a challenge. Parents want to ensure that their children can have senseless fun, and still have enough time to explore actual activities: social, athletic, creative, and academic. They can be ready to explain the why of rules, but consistent rules, enforced for everyone at home, are most helpful for this age group.

Dr. Susan D. Swick, physician in chief at Ohana, Center for Child and Adolescent Behavioral Health, Community Hospital of the Monterey (Calif.) Peninsula.
Dr. Susan D. Swick


You also can help parents to consider the child’s temperament when thinking about which rules will be appropriate. Anxious children and teenagers may be especially prone to immersive virtual activities that allow them to avoid the stress of real undertakings or interactions. But anxious children may be able to prepare for something anxiety provoking by exploring it virtually first. Youth with ADHD are going to struggle with shifting away from video games or other electronic activities they enjoy that don’t have a natural ending, and will need strict rules and patient support around balanced screen time use. Screen time may play to a child’s strengths, enabling creative children to take in a wide range of art or music and even create their own when other resources are limited.

Dr. Michael S. Jellinek, professor emeritus of psychiatry and pediatrics, Harvard Medical School, Boston
Dr. Michael S. Jellinek


Finally, all parents should consider what their own screen use is teaching their children. Adolescents are unlikely to listen to their parents’ recommendations if the parents spend hours online after work. Younger children need their parents’ engaged attention: being coaches and cheerleaders for all of their efforts at mastery. You can help parents to imagine rules that the whole family can follow. They can consider how screen time helps them connect with their children, such as watching a favorite program or sport together. They can explore shared interests online together. They can even relax with ridiculous cat videos together! Screen time together is valuable if it supports parents’ connections with their children, while their rules ensure adequate time for sleep, physical activity, and developmental priorities.

Dr. Swick is physician in chief at Ohana, Center for Child and Adolescent Behavioral Health, Community Hospital of the Monterey (Calif.) Peninsula. Dr. Jellinek is professor emeritus of psychiatry and pediatrics, Harvard Medical School, Boston. Email them at pdnews@mdedge.com.
 

References

1. Lancet Child Adolesc Health. 2018 Nov 1;2(11):783-91.

2. Acta Paediatr. 2018 Apr;107(4):685-93.

 


Healthier lifestyle in midlife women reduces subclinical carotid atherosclerosis

Article Type
Changed
Fri, 01/18/2019 - 18:08

 

Women who have a healthier lifestyle during the menopausal transition could significantly reduce their risk of cardiovascular disease (CVD), new research suggests.

Women in an exercise class
kali9/E+

Because women experience a steeper increase in CVD risk during and after the menopausal transition, researchers analyzed data from the Study of Women’s Health Across the Nation (SWAN), a prospective longitudinal cohort study of 1,143 women aged 42-52 years. The report is in JAHA: Journal of the American Heart Association.

The analysis revealed that women with the highest average Healthy Lifestyle Score – a composite score of dietary quality, levels of physical activity, and smoking – over 10 years of follow-up had a 0.024-mm smaller common carotid artery intima-media thickness and 0.16-mm smaller adventitial diameter, compared with those with the lowest average score. This was after adjustment for confounders and physiological risk factors such as ethnicity, age, menopausal status, body mass index (BMI), and cholesterol levels.

“Smoking, unhealthy diet, and lack of physical activity are three well-known modifiable behavioral risk factors for CVD,” wrote Dongqing Wang of the University of Michigan, Ann Arbor, and his coauthors. “Even after adjusting for the lifestyle-related physiological risk factors, the adherence to a healthy lifestyle composed of abstinence from smoking, healthy diet, and regular engagement in physical activity is inversely associated with atherosclerosis in midlife women.”

Women with a higher average Healthy Lifestyle Score also had lower levels of carotid plaque after adjustment for confounding factors, but this association was no longer significant after adjustment for physiological risk factors.

The authors analyzed the three components of the Healthy Lifestyle Score separately and found that not smoking was strongly and significantly associated with lower scores on all three measures of subclinical atherosclerosis. Women who never smoked across the duration of the study had 49% lower odds of having a high carotid plaque index, compared with women who smoked at some point during the follow-up period.



The analysis showed an inverse association between average Alternate Healthy Eating Index score – a measure of diet quality – and smaller common carotid artery adventitial diameter, although after adjustment for BMI this association was no longer statistically significant. Likewise, the association between dietary quality and intima-media thickness was only marginally significant and lost that significance after adjustment for BMI.

Long-term physical activity was only marginally significantly associated with common carotid artery intima-media thickness, but this was not significant after adjustment for physiological risk factors. No association was found between physical activity and common carotid artery adventitial diameter or carotid plaque.

The authors noted that only 1.7% of the study population stayed in the top category for all three components of the healthy lifestyle at all three follow-up time points in the study.

“The low prevalence of a healthy lifestyle in midlife women highlights the potential for lifestyle interventions aimed at this vulnerable population,” they wrote.

In particular, they highlighted abstinence from smoking as having the strongest impact on all three measures of subclinical atherosclerosis; smoking is known to affect women more than men. The associations for diet and physical activity were weaker: The authors suggested that BMI could partly mediate the effects of a healthier diet and greater levels of physical activity.

One strength of the study was its ethnically diverse population, which included African American, Chinese, and Hispanic women in addition to non-Hispanic white women. However, the study was not powered to examine the impacts ethnicity may have had on outcomes, the researchers wrote.

The Study of Women’s Health Across the Nation is supported by the National Institutes of Health. No conflicts of interest were declared.

SOURCE: Wang D et al. JAHA 2018 Nov. 28.


Vitals

Key clinical point: A healthier lifestyle in midlife women is associated with less subclinical carotid atherosclerosis.

Major finding: Following a healthier diet and not smoking were significantly linked with lower subclinical carotid atherosclerosis in menopausal women.

Study details: A prospective, longitudinal cohort study of 1,143 women.

Disclosures: The Study of Women’s Health Across the Nation is supported by the National Institutes of Health. No conflicts of interest were declared.

Source: Wang D et al. JAHA. 2018 Nov 28.
