AI Dermatology Mobile Apps Have Critical Efficacy, Safety Gaps, Review Says

This scoping review discovered potential risks and critical gaps in the efficacy, safety, and transparency of current artificial intelligence (AI) dermatology mobile apps, emphasizing the need for regulatory intervention.

In their current state, artificial intelligence (AI) dermatology mobile applications (apps) may pose harm to patients due to potential risks, lack of consistent validation, and misleading user communication, according to a study published in JAMA Dermatology.

The researchers explained that the performance of mobile apps claiming to assist with skin conditions is inconsistent, and none have gained FDA approval. However, the pace at which direct-to-consumer health apps emerge has posed challenges to regulatory bodies worldwide.

Despite their benefits, the current evidence supporting the clinical use of AI dermatology mobile apps remains unclear. Consequently, the researchers conducted a study to identify and characterize the current AI dermatology mobile apps available for download in app stores.

Male patient using a phone | Image Credit: Rostislav Sedlacek - stock.adobe.com
Using the terms “dermatology,” “derm,” and “skin,” they searched for publicly available dermatology-related mobile apps with AI features in both the Apple and Android app stores from November to December 2023. The search was performed independently by 2 investigators, and a third investigator independently reviewed the identified apps and resolved any discrepancies to determine the final list of included apps.

The researchers extracted various data about eligible apps from the app store, including the purpose, average rating out of 5 stars, number of downloads, and dermatologist input or lack thereof; app developers were contacted to supply additional information to complete a more comprehensive evaluation.

Initially, the researchers identified 909 eligible apps. After eliminating 518 duplicates, 391 apps remained. After subsequent reviews, they excluded 350 apps based on nonmedical content, unavailability, non-English language, or the absence of AI features. Consequently, the researchers included 41 apps in their in-depth analysis.

The target audience of 32 of these apps (78.0%) was patients, while 4 apps (9.8%) focused on clinicians, and 5 apps (12.2%) were designed for both. Also, 13 apps (31.7%) were exclusive to Apple, 12 (29.3%) were exclusive to Android, and 16 (39.0%) were available on both platforms.

In terms of the most popular purposes, 14 apps (34.1%) were for skin cancer detection, 13 (31.7%) for diagnosis and/or identification of skin and/or hair conditions, and 7 (17.1%) for mole tracking. Other uses included tracking skin conditions, atopic dermatitis management, sun protection, and acne diagnosis, treatment, and/or monitoring.

Of these apps, 14 (34.1%) were US-based, with only 2 providing a disclaimer stating that they lacked FDA approval. Similarly, 14 apps (34.1%) were based in Europe, and only 2 possessed regulatory health and safety product approval in the European Union.

The researchers noted that 10 apps (24.4%) claimed diagnostic capability. However, none of these had scientific publications as supporting evidence, and 2 apps lacked warnings cautioning patients about the potential inaccuracy of results and the absence of a formal medical diagnosis. Overall, only 5 apps had supporting peer-reviewed journal publications, and 1 had a preprint article available.

Conversely, 24 apps (58.5%) lacked any information on training and testing datasets. For those with dataset information, the majority offered only vague, general descriptions. Additionally, 21 apps (51.2%) lacked information on algorithm details, and most apps did not specify any clinician input. More specifically, only 16 apps (39.0%) identified that they included dermatologist input.

Lastly, only 12 apps (29.3%) indicated that they do not store user-submitted images. On the other hand, 16 (39.0%) specified that they store user images, 12 of which reported using secure cloud servers. In terms of how submitted images would be used, 20 apps (48.8%) used the data for analysis to provide user results, and 12 apps (29.3%) used them for research and further app development; however, 19 apps (46.3%) did not provide any details.

“The findings of this scoping review highlight that dermatology AI apps lack transparency about the effectiveness of the AI models, data used for development, and how user images are used,” the authors wrote. “This raises concerns about biases, inappropriate recommendations, and user privacy. The absence of regulatory approval and limited clinician involvement, particularly from dermatologists, further compounds these concerns.”

The researchers acknowledged their study’s limitations, one being that their search was constrained by the regional availability of apps in the Apple and Android stores. Consequently, the search was limited to relevant mobile apps available for download in US app stores, so some eligible international apps may have been omitted from the study. Despite these limitations, the researchers provided future research guidelines based on their findings.

“Addressing challenges in efficacy, safety, and transparency through effective regulation, validation, and standardized evaluation criteria is essential to harness the benefits of these apps while minimizing risks,” the authors wrote.

Reference

Wongvibulsin S, Yan MJ, Pahalyants V, Murphy W, Daneshjou R, Rotemberg V. Current state of dermatology mobile applications with artificial intelligence features: a scoping review. JAMA Dermatol. doi:10.1001/jamadermatol.2024.0468
