GAITHERSBURG, MD. – Testing to ensure the reliability of information from clinical decision support tools might become part of the health information technology oversight provided by the Food and Drug Administration, based on discussions at an FDA workshop.
"We will have to have some sort of follow-up system in place to make sure that systems do what they say they are going to do," American Cancer Society Deputy Chief Medical Officer Dr. J. Leonard Lichtenfeld said during the workshop, which was held to solicit commentary on the recent FDASIA (Safety and Innovation Act) Health IT Report, a proposed framework for the regulation of health information technology.
The report was issued in April by the FDA, the U.S. Department of Health & Human Services Office of the National Coordinator for Health Information Technology, and the Federal Communications Commission.
The report defines clinical decision support tools as providing "knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health care." Examples of such tools include "computerized alerts and reminders for providers and patients; clinical guidelines; condition-specific order sets; focused patient data reports and summaries; documentation templates; diagnostic support; and contextually relevant information."
The report adds that the "FDA does not intend to focus its regulatory oversight on these products/functionalities, even if they meet the statutory definition of a medical device," citing as examples drug-drug interaction alerts and drug allergy contraindication alerts; drug dosing calculators; reminders for preventive care; and suggestions for diagnosis based on patient-specific information in EHRs.
To ensure the reliability of these tools, however, some oversight of the algorithms used to generate their recommendations will be needed.
Using oncology as an example, Dr. Lichtenfeld observed that certain tests important to cancer care can be inaccurate and that not everyone follows the same protocols when administering tests. "We’re going to have to have – and I say this with trepidation – an oversight to make sure the systems do what they say [they are going to do]." That way, there is an assurance that "when we do press the button, something really does work the way it’s supposed to work."
Dr. David S. Hirschorn, director of radiology informatics at Staten Island (N.Y.) University Hospital, noted that there needs to be a distinction between information coming from "a vendor that might have other interests" and information drawn from the best medical evidence.
The best clinical decision support tools will need to allow protocols to be updated to reflect the most current information, he said. For example, a feedback mechanism could let physicians correct mistakes in the protocols underlying the code and incorporate changes based on advances in the knowledge and delivery of health care.
Panelists also noted that any potential oversight needs to be balanced with the ability to innovate and that strict reliance on guidelines could keep physicians from using clinical decision support tools to their fullest capabilities.