What if an app could diagnose melanoma from a photo? That was my idea. In December 2009, Google introduced Google Goggles, an application that recognized images. At the time, I thought, “Wouldn’t it be neat if we could use this with telederm?” I even pitched it to a friend at the search giant. “Great idea!” he wrote back, placating me. For those uninitiated in innovation, “Great idea!” is a euphemism for “Yeah, we thought of that.”
Yes, the idea isn’t only mine; no doubt, many of you had it too: Let’s use the amazing image-interpretation capabilities of companies like Google or Apple to help us make diagnoses. It sounds simple. It isn’t. That’s why most melanoma-finding apps are labeled for entertainment purposes only – they don’t work.
To reliably get this right takes immense experience and intuition, things we do better than computers. Or do we? Since 2009, processors have sped up and machine learning has become exponentially better. Now cars drive themselves, and software can identify someone even in a grainy video. The two are related: Both require tremendous processing power and sophisticated algorithms to achieve artificial intelligence (AI). You’ve likely heard about AI or machine learning lately. If you’re unsure what all the fuss is about, read my previous column (Dermatology News, March 2017, p. 30).

So can melanoma be diagnosed from an app? A Stanford University team believes so. They trained a machine learning system to make dermatologic diagnoses from photos of skin lesions. To overcome previous barriers, they used open-source software from Google and powerful processors. For a start, they pretrained the program on over 1.28 million images. Then they fed it 128,450 images of known diagnoses.
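That two-step recipe, pretraining on a huge general image collection and then continuing training on labeled skin-lesion photos, is what machine learning researchers call transfer learning. A minimal sketch in plain Python follows; to keep it self-contained, the "pretrained" feature extractor and the toy lesion data are hypothetical stand-ins (the actual Stanford system used a deep convolutional neural network), but the structure, a frozen feature extractor plus a small classifier trained on top, is the real idea.

```python
import math
import random

# Transfer learning in two steps:
# 1) "Pretrain": obtain a fixed feature extractor. Here it is a stand-in
#    function; the real system learned one from ~1.28 million general images.
# 2) "Fine-tune": train only a small classifier head on the domain-specific
#    labeled data (128,450 skin-lesion photos in the study).

def pretrained_features(image):
    """Stand-in for a frozen pretrained network: maps an 'image'
    (here just a flat list of pixel values) to a short feature vector."""
    mean = sum(image) / len(image)
    var = sum((p - mean) ** 2 for p in image) / len(image)
    return [mean, var, max(image) - min(image)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_head(samples, labels, lr=0.5, epochs=500):
    """Train a logistic-regression head on the frozen features only."""
    feats = [pretrained_features(s) for s in samples]
    w = [0.0] * len(feats[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(feats, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss for this sample
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, image):
    """Probability that this 'lesion' is malignant, per the trained head."""
    x = pretrained_features(image)
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Toy data: darker, higher-contrast "lesions" are labeled malignant (1).
random.seed(0)
benign = [[random.uniform(0.6, 0.9) for _ in range(16)] for _ in range(20)]
malignant = [[random.uniform(0.0, 0.8) for _ in range(16)] for _ in range(20)]
w, b = train_head(benign + malignant, [0] * 20 + [1] * 20)
```

The key point is what is *not* retrained: the feature extractor stays fixed, so the enormous pretraining investment is reused, and only the small final classifier has to be learned from the medical images.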
Then, just as when Google’s AlphaGo algorithm challenged Lee Sedol, the world Go champion, the Stanford research team challenged 21 dermatologists. The dermatologists had to decide whether to biopsy/treat or to reassure, based on photos of benign lesions, keratinocyte carcinomas, clinical melanomas, and dermoscopic melanomas. Guess who won?
In a stunning victory (or defeat, if you’re rooting for our team), the trained algorithm matched or outperformed all the dermatologists when scored on sensitivity-specificity curves. While we dermatologists, of course, use more than just a photo to diagnose skin cancer, many around the globe don’t have access to us. Based on these findings, they might need access only to a smartphone to get potentially life-saving advice.
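For readers curious how a sensitivity-specificity curve actually scores a classifier: you sweep a decision threshold across the algorithm’s suspicion scores and record, at each threshold, the fraction of true melanomas flagged (sensitivity) and the fraction of benign lesions correctly cleared (specificity). A short Python sketch, with made-up scores purely for illustration:

```python
def sensitivity_specificity_curve(scores, labels, thresholds):
    """For each threshold t, classify score >= t as 'malignant' and
    compute (sensitivity, specificity) against the true labels."""
    curve = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        sensitivity = tp / (tp + fn)   # true melanomas caught
        specificity = tn / (tn + fp)   # benign lesions correctly cleared
        curve.append((t, sensitivity, specificity))
    return curve

# Made-up classifier scores: higher means more suspicious.
scores = [0.1, 0.2, 0.35, 0.4, 0.55, 0.6, 0.8, 0.9]
labels = [0,   0,   0,    1,   0,    1,   1,   1]   # 1 = melanoma
for t, se, sp in sensitivity_specificity_curve(scores, labels, [0.3, 0.5, 0.7]):
    print(f"threshold {t:.1f}: sensitivity {se:.2f}, specificity {sp:.2f}")
# threshold 0.3: sensitivity 1.00, specificity 0.50
# threshold 0.5: sensitivity 0.75, specificity 0.75
# threshold 0.7: sensitivity 0.50, specificity 1.00
```

Sweeping the threshold traces out the curve: raising it trades sensitivity for specificity. Each dermatologist’s biopsy-or-reassure choices yield a single sensitivity-specificity point, and the algorithm matched or beat them by lying on or above those points.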
But what does this mean? Will we someday be outsourced to AI? Will a future POTUS promise to “bring back the doctor industry”? Not if we adapt. The future is bright – if we learn to apply machine learning in ways that can have an impact. (Brain + Computer > Brain.) Consider the following: An optimized ophthalmologist who reads retinal scans prediagnosed by a computer. A teledermatologist who uses AI to perform near-perfectly in diagnosing melanoma.
Patients have always wanted high quality and high touch care. In the history of medicine, we’ve never been better at both than we are today. Until tomorrow, when we’ll be better still.
Jeff Benabio, MD, MBA, is director of Healthcare Transformation and chief of dermatology at Kaiser Permanente San Diego. Dr. Benabio is @Dermdoc on Twitter. Write to him at dermnews@frontlinemedcom.com. He has no disclosures related to this column.