How accurate are smartphone applications ('apps') for detecting melanoma in adults?

What is the aim of the review?

We wanted to find out how well smartphone applications can help the general public understand whether their skin lesions might be melanoma.

Why is improving the diagnosis of malignant melanoma skin cancer important?

Melanoma is one of the most dangerous forms of skin cancer. Not recognising a melanoma (a false negative test result) could delay seeking appropriate advice and surgery to remove it. This increases the risk of the cancer spreading to other organs in the body and possibly causing death. Diagnosing a skin lesion as a melanoma when it is not one (a false positive result) may cause anxiety and lead to unnecessary surgery and further investigations.

What was studied in the review?

Specialised applications ('apps') that provide advice on skin lesions or moles that might cause people concern are widely available for smartphones. These apps allow people to photograph a skin lesion they are worried about and then receive guidance on whether to seek medical advice. Some apps automatically classify the lesion as high or low risk, while others act as store-and-forward services, sending the image to an experienced professional, such as a dermatologist, who then makes a risk assessment based on the photograph. Cochrane researchers found two studies evaluating five apps for the assessment of suspicious skin lesions: four apps used automated analysis of images and one used a store-and-forward approach.

What are the main results of the review?

The review included two studies with 332 lesions, including 86 melanomas, each analysed by at least one smartphone application. Both studies used photographs of moles or skin lesions that were about to be removed because doctors had already decided they could be melanomas, and the photographs were taken by doctors rather than by people using their own smartphones. For these reasons, we cannot make a reliable estimate of how well the apps would actually work in practice.

Four apps that produce an immediate (automated) assessment of a skin lesion or mole that has been photographed by the smartphone missed between 7 and 55 melanomas.

One app that sends the photograph of a mole or skin lesion to a dermatologist for assessment missed only one melanoma. Another six melanomas examined by the dermatologist via the application were not classified as high risk; instead, the dermatologist was unable to classify them as either 'atypical' (possibly a melanoma) or 'typical' (definitely not a melanoma).

How reliable are the results of the studies of this review?

The small number and poor quality of the included studies reduce the reliability of the findings. The people included were not typical of those who would use the applications in real life. The final diagnosis of melanoma was made by histology, which is likely to have been a reliable method for deciding whether patients really had melanoma*. However, the studies excluded between 2% and 18% of images because the applications failed to produce a recommendation.

Who do the results of this review apply to?

Studies took place in the USA and Germany. They did not report key patient information such as age and gender. The percentage of people with a final diagnosis of melanoma was 18% in one study and 35% in the other, much higher than would be observed in community settings. The definition of eligible patients was narrow in comparison to likely users of the applications, and the photographs were taken by doctors rather than by smartphone users, which seriously limits the applicability of the results.

What are the implications of this review?

Current smartphone applications using automated analysis have a high chance of missing melanomas (false negative results). Store-and-forward image applications could play a role in the timely identification of people with potentially malignant lesions by facilitating early engagement of those with suspicious skin lesions, but they have resource and workload implications.

The development of applications to help identify people who might have melanoma is a fast-moving field. The emergence of new applications, higher quality and better reported studies could change the conclusions of this review substantially.

How up-to-date is this review?

The review authors searched for and used studies published up to August 2016.

*In these studies biopsy was the reference standard (means of establishing final diagnoses).

Authors' conclusions: 

Smartphone applications using artificial intelligence-based analysis have not yet demonstrated sufficient promise in terms of accuracy, and they are associated with a high likelihood of missing melanomas. Applications based on store-and-forward images could have a potential role in the timely presentation of people with potentially malignant lesions by facilitating active self-management health practices and early engagement of those with suspicious skin lesions; however, they may incur a significant increase in resources and workload. Given the paucity of evidence and low methodological quality of existing studies, it is not possible to draw any implications for practice. Nevertheless, this is a rapidly advancing field, and new and better applications with robust reporting of studies could change these conclusions substantially.


Background: 

Melanoma accounts for a small proportion of all skin cancer cases but is responsible for most skin cancer-related deaths. Early detection and treatment can improve survival. Smartphone applications are readily accessible and potentially offer an instant risk assessment of the likelihood of malignancy so that the right people seek further medical attention from a clinician for more detailed assessment of the lesion. There is, however, a risk that melanomas will be missed and treatment delayed if the application reassures the user that their lesion is low risk.

Objectives: 

To assess the diagnostic accuracy of smartphone applications to rule out cutaneous invasive melanoma and atypical intraepidermal melanocytic variants in adults with concerns about suspicious skin lesions.

Search strategy: 

We undertook a comprehensive search of the following databases from inception to August 2016: Cochrane Central Register of Controlled Trials; MEDLINE; Embase; CINAHL; CPCI; Zetoc; Science Citation Index; US National Institutes of Health Ongoing Trials Register; NIHR Clinical Research Network Portfolio Database; and the World Health Organization International Clinical Trials Registry Platform. We studied reference lists and published systematic review articles.

Selection criteria: 

Studies of any design evaluating smartphone applications intended for use by individuals in a community setting who have lesions that might be suspicious for melanoma or atypical intraepidermal melanocytic variants versus a reference standard of histological confirmation or clinical follow-up and expert opinion.

Data collection and analysis: 

Two review authors independently extracted all data using a standardised data extraction and quality assessment form (based on QUADAS-2). Due to scarcity of data and poor quality of studies, we did not perform a meta-analysis for this review. For illustrative purposes, we plotted estimates of sensitivity and specificity on coupled forest plots for each application under consideration.

Main results: 

This review reports on two cohorts of lesions published in two studies. Both studies were at high risk of bias from selective participant recruitment and high rates of non-evaluable images. Concerns about applicability of findings were high due to inclusion only of lesions already selected for excision in a dermatology clinic setting, and image acquisition by clinicians rather than by smartphone app users.

We report data for five mobile phone applications and 332 suspicious skin lesions with 86 melanomas across the two studies. Across the four artificial intelligence-based applications that classified lesion images (photographs) as melanomas (one application) or as high risk or 'problematic' lesions (three applications) using a pre-programmed algorithm, sensitivities ranged from 7% (95% CI 2% to 16%) to 73% (95% CI 52% to 88%) and specificities from 37% (95% CI 29% to 46%) to 94% (95% CI 87% to 97%). The single application using store-and-forward review of lesion images by a dermatologist had a sensitivity of 98% (95% CI 90% to 100%) and specificity of 30% (95% CI 22% to 40%).
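The sensitivity and specificity estimates above are derived from simple two-by-two counts (lesions flagged or cleared by the app versus the final histological diagnosis). As a minimal sketch, using hypothetical counts rather than the review's data, and assuming a Wilson score interval for the 95% CI (the review's exact interval method may differ):

```python
# Illustrative sketch only: the counts below are hypothetical and are
# NOT taken from the review; the CI method is an assumption.
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple:
    """95% Wilson score confidence interval for a proportion."""
    if n == 0:
        return (0.0, 0.0)
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return (centre - half, centre + half)

# Hypothetical app results: 86 melanomas and 246 benign lesions
tp, fn = 63, 23    # melanomas flagged as high risk vs. missed
tn, fp = 92, 154   # benign lesions correctly cleared vs. over-called

sensitivity = tp / (tp + fn)   # proportion of melanomas detected
specificity = tn / (tn + fp)   # proportion of benign lesions cleared

print(f"sensitivity {sensitivity:.0%}, 95% CI {wilson_ci(tp, tp + fn)}")
print(f"specificity {specificity:.0%}, 95% CI {wilson_ci(tn, tn + fp)}")
```

A low sensitivity with a wide interval, as in the worst-performing automated apps above, reflects exactly this arithmetic applied to small numbers of melanomas.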

The number of test failures (lesion images analysed by the applications but classed as 'unevaluable' and excluded by the study authors) ranged from 3 to 31 (or 2% to 18% of lesions analysed). The store-and-forward application had one of the highest rates of test failure (15%). At least one melanoma was classed as unevaluable in three of the four application evaluations.