Current and Future AI Uses in Dermatology: An Expert’s View

Roxana Daneshjou, MD, PhD, is one of the leading experts on artificial intelligence (AI) in American dermatology. Daneshjou, assistant professor of biomedical data science and dermatology at Stanford University, Stanford, California, leads landmark AI studies, is an associate editor of the journal NEJM AI, and gives presentations about the topic, including one at the recent Society for Investigative Dermatology (SID) 2025 annual meeting where she starkly warned colleagues that “dermatologists who use AI will replace dermatologists who don’t.”

So one could assume that Daneshjou embraces AI in her clinical practice. But she doesn't — not quite yet. While AI is helpful with office tasks that involve writing, she said, it's not yet good enough at clinical tasks such as evaluating skin lesions or helping solve diagnostic riddles.

“You should only use it for tasks where you can easily catch errors and rectify them. You shouldn’t use it when you’re not sure of the answer or the next step because you could be badly misled,” she said in an interview with Medscape Medical News.

But just wait. “Eventually, once we have valid, well-validated AI tools that can help with diagnosis and triage, they’re going to become essentially standard of care,” Daneshjou said.

The following are excerpts from the interview with Daneshjou about the present and future of AI in dermatology.

What do you mean when you say, “Dermatologists who use AI will replace dermatologists who don’t”?

Daneshjou: That’s actually a rehashed phrase originally coined by Curt Langlotz, a radiologist who made the same claim about radiologists. The point is that dermatologists aren’t going anywhere. AI is not replacing dermatologists. It’s that dermatologists who use AI will replace dermatologists who don’t.

Will some dermatologists be left behind?

Daneshjou: Medicine always evolves. There was a time when we didn’t have advanced imaging technologies like CT scans and MRIs. And think about how many dermatologists now use electronic health records (EHRs) vs writing everything down by hand. There are still some people writing things by hand, but physicians who can use EHRs have largely replaced those who can’t.

This isn’t a new phenomenon. Whenever new technology comes along, it becomes incorporated into medical practice, and those who learn to adapt and adopt it eventually replace those who don’t.

Is there fear and denial in the dermatology community about AI?

Daneshjou: There’s fear, but there’s also enthusiasm — sometimes enthusiasm to the point of using things that aren’t ready for prime time. In my SID talk, I discussed how it’s not safe to use large language models (LLMs) for any clinical task where you don’t know the answer or can’t validate it quickly. These models can have errors that are difficult to catch because the outputs look so convincing.

Can you give an example of how using LLMs clinically might get a dermatologist in trouble?

Daneshjou: In my presentation, I showed AI being asked to calculate a RegiSCAR score for a patient. It gives an output that looks really convincing but has some of the scores wrong. If you didn’t know the RegiSCAR score yourself, you might not catch that mistake. Similarly, if you ask about medication dosing, sometimes AI gets it right. But research papers show it can get dosing wrong. If you’re not certain of the answer, you shouldn’t use an LLM for that task.

That’s different from giving it bullet points and saying, “Follow these bullet points to draft a prior authorization letter” or “Write an after-visit summary for my patient” about a disease you’re well-versed in, where you can verify [the text] for accuracy.

Are there reliable clinical uses for AI now?

Daneshjou: First, I should note that public-facing models aren’t Health Insurance Portability and Accountability Act (HIPAA)–compliant, so you have to be careful about putting patient information in them. Some institutions like Stanford have HIPAA-compliant versions internally.

I’d be very wary of using these models for diagnosis and treatment because they can say things that are wrong. I’ve heard dermatologists say they’ve put patient images into these models to get a differential diagnosis, which I would strongly advise against — both for HIPAA concerns and because the outputs aren’t reliable.

What about “vision language” models (VLMs) in dermatology that are trained on skin images and could potentially be used for tasks such as identifying lesions?

Daneshjou: The VLMs we’ve tested perform worse than the LLMs. They’re even more in the research realm.

Are current AI systems actually good at categorizing skin lesions?

Daneshjou: There are many papers claiming they’re good, but there’s not much prospective trial data validating that performance. We need more trial data proving that a particular model will continue to perform well in a clinical setting.

So AI isn’t ready for prime time in diagnosis and treatment?

Daneshjou: That’s correct. It’s more useful in a supportive role — helping with writing or editing text.

You worked on a “red teaming” event in which attendees (engineers, computer scientists, and health professionals such as dermatologists) assigned medical tasks to AI and asked questions. The results were published in Nature in March 2025. What did you discover?

Daneshjou: We found that across all models tested, there was an error rate of around 20%. As our chief data scientist at Stanford likes to joke, “You can use large language models for any task where a 20% error rate is acceptable.”

Where do you think AI and dermatology are headed next?

Daneshjou: Image-based models will eventually get good enough to earn US Food and Drug Administration clearance. But my concern is this will happen without the creators having to prove the models work across diverse skin tones — an incredibly important part of validation.

Our research has shown that most image-based AI models exclude diverse skin tones in their training and testing. We’re also going to see more multimodal models — ones that incorporate diverse information like images, text, and molecular data — to provide outputs or risk assessments. That’s where AI is heading generally: not focusing on text or images alone but taking in information from multiple modalities, the way humans do.

How often do you use AI in your clinical practice?

Daneshjou: Not very much. I run a research lab, so I use it extensively in research. I’ve used it to help with grant writing and to analyze recommendation letters I’ve written, asking it to identify weaknesses so I can improve them. Clinically, I’ve shown my nurse how to use our secure AI to draft prior authorization letters or rebuttals to insurance [rejections]. But otherwise, I don’t really use it in clinic.

You’ve discussed how AI handles clinical vignettes vs real patients. What should dermatologists understand about this?

Daneshjou: Headlines often misrepresent reality. They’ll say, “AI models can diagnose patients.” But in reality, these models were given very nicely packaged vignettes and were able to provide a diagnosis.

Patients don’t come as nicely packaged vignettes. In real clinical practice, I have to ask, “What’s going on?” I have to do the skin check, identify lesions, gather history, and ask about duration, symptoms, occupation, and sun exposure. I have to collect all this information and make a judgment.

Sometimes, the history doesn’t match what you see, so you have to use clinical reasoning. This kind of clinical reasoning isn’t what they’re testing in research papers that claim AI can diagnose patients.

Would you recommend using AI at all for generating differential diagnoses?

Daneshjou: I’m not using AI just to use it. I need a specific reason why I think it will help me. For example, if I’m writing a grant and want a summary of one of my own research papers, I might ask it to write a first draft that I can edit because I know my own research well enough to verify what’s correct. But I’m not using it to develop differentials for my patients.

What would you advise dermatologists who want to adapt to AI but don’t know where to start?

Daneshjou: The American Academy of Dermatology (AAD) has AI boot camp videos. At the annual AAD meetings, the AAD offers educational sessions on AI.

If you look in the Journal of the American Academy of Dermatology, there are Continuing Medical Education reviews that the AAD’s Augmented Intelligence Committee has written to educate dermatologists about AI technologies and what to watch for.

A few years ago, this content was sparse. But there’s been a concerted effort to create educational materials for dermatologists.

What would you tell dermatologists who are agonizing about AI?

Daneshjou: I see people posting on LinkedIn what I would call outrageous claims based on research papers. They’ll say, “This research paper shows we have autonomous AI agents that can treat patients,” but when you read the actual paper, it doesn’t show that at all. Often, the hype doesn’t match the reality on the ground.

And what about those who think AI is overblown and not worth worrying about?

Daneshjou: Claims about AI replacing physicians or dermatologists are indeed overblown. But this is definitely something dermatologists will have to adapt to. It’s eventually going to become part of practice in some ways.

Daneshjou has financial relationships with Revea, MDalgorithms, Pfizer, Frazier Healthcare Partners, and L'Oréal and has a patent pending for a system that aims to improve clinical images taken by patients. She is a member of the AAD’s Augmented Intelligence Committee.
