Over the last decades, the incidence of skin cancer, both melanoma and non-melanoma, has increased at a continuous rate. In particular for melanoma, the deadliest type of skin cancer, early detection is important to improve patient prognosis. Recently, deep neural networks (DNNs) have become a viable option for skin cancer detection. In this work, we present a smartphone-based application to assist in skin cancer detection. This application is based on a Convolutional Neural Network (CNN) trained on clinical images and patient demographics, both collected with smartphones. Also, as skin cancer datasets are imbalanced, we present an approach, based on the mutation operator of the Differential Evolution (DE) algorithm, to balance the data. Besides providing a flexible tool to assist doctors in the skin cancer screening phase, the method obtains promising results, with a balanced accuracy of 85% and a recall of 96%.
Index Terms—skin cancer detection, smartphone application, deep learning, convolutional neural network
I. INTRODUCTION
The occurrence of skin cancer, both melanoma and non-melanoma, has increased at a constant rate over the last decades. The World Health Organization (WHO) estimates that 2-3 million non-melanoma cancers and 132,000 melanomas occur every year worldwide. The presence of skin cancer is strongly related to the incidence of ultraviolet radiation caused by sunlight exposure. Due to the lack of pigmentation, Caucasian people are at the highest risk. Early detection is important to improve patient prognosis [1-8].
Several computer-aided diagnosis (CAD) systems have been proposed to tackle automated skin cancer detection. Nowadays, most approaches are based on Convolutional Neural Networks (CNNs) trained on dermoscopy images. However, in emerging countries such as Brazil, particularly in the countryside, there is a lack of dermatologists and dermatoscopes (medical devices that magnify the lesion for better visualization), which constrains the use of a CAD system based on dermoscopy images. In this context, smartphones may be useful devices to handle this problem. According to the Ericsson report, in 2019 the total number of mobile subscriptions around the world was around 8 billion, and in Brazil around 78% of the population have their own smartphone. Therefore, a smartphone-based application to assist clinicians in diagnosing skin cancer during the screening process seems to be feasible [9-12].
Previous work proposed a deep learning model based on a convolutional neural network (CNN) for Android platforms; the model was tested on the grand challenge PHDB melanoma dataset and outperformed the known baseline model in terms of accuracy and computational efficiency. Another work presented an iOS mobile application for skin cancer, also using a CNN, trained on the HAM10000 dataset, which contains 10,000 dermoscopy images clustered into different types of skin lesions [13-16].
II. PROBLEM STATEMENT
There are many types of human cancer, and skin cancer is among the most common. It is particularly severe among the fair-skinned populations of Europe, North America, and Australia.
There are two major types of skin cancer, namely malignant melanoma and non-melanoma (basal cell, squamous cell, and Merkel cell carcinomas, among others). Melanoma is more dangerous and can be fatal if not treated. If melanoma is detected in its early stages, it is highly curable, yet advanced melanoma is lethal.
However, the current diagnostic process has some disadvantages that can be analyzed and improved upon.
The main diagnostic procedure in this situation is a biopsy, but therein lie problems that can be addressed.
A biopsy normally scrapes off a portion of the skin tissue for examination, which can lead to possible complications such as:
Excessive bleeding (hemorrhage)
Infection
Puncture damage to nearby tissue or organs
Skin numbness around the biopsy site.
III. PROPOSED SYSTEM
A convolutional neural network (CNN) is a class of deep learning neural network. In short, think of a CNN as a machine learning algorithm that can take an input image, assign importance (learnable weights and biases) to various aspects or objects in the image, and differentiate one from the other.
A CNN works by extracting features from the images. Any CNN consists of the following:
The input layer, which receives the input image (grayscale or RGB)
The output layer, which produces binary or multi-class labels
Hidden layers, consisting of convolution layers, ReLU (rectified linear unit) layers, pooling layers, and a fully connected neural network
It is very important to understand that an ANN, or Artificial Neural Network, made up of multiple neurons, is not capable of extracting features from an image. This is where the combination of convolution and pooling layers comes into the picture. Similarly, the convolution and pooling layers cannot perform classification on their own, hence the need for a fully connected neural network.
Before going further into the concepts, let us try to understand these individual segments separately; a minimal sketch combining them is shown below.
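To make these components concrete, the following is a minimal sketch of such a CNN, assuming Python with Keras; the layer counts, filter sizes, input resolution, and training settings are illustrative assumptions and do not reflect the network used in this work.

```python
# Minimal CNN sketch: convolution + ReLU layers extract features,
# pooling layers downsample, and a fully connected head classifies.
# All hyperparameters below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(128, 128, 3), num_classes=2):
    model = models.Sequential([
        layers.Input(shape=input_shape),                  # input layer: the lesion image
        layers.Conv2D(32, (3, 3), activation="relu"),     # convolution + ReLU extract features
        layers.MaxPooling2D((2, 2)),                      # pooling reduces spatial size
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),                                 # flatten feature maps for the classifier
        layers.Dense(64, activation="relu"),              # fully connected layer
        layers.Dense(num_classes, activation="softmax"),  # output layer: class probabilities
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn()
model.summary()
```

In this sketch, the convolution and pooling stages play the feature-extraction role described above, while the final dense layers perform the actual classification.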
IV. CONCLUSION
In this paper, we presented a smartphone-based application to support the diagnosis of skin cancer using convolutional neural networks. The results obtained with clinical information present an average balanced accuracy of 85% and a recall of 96%. The study of the impact of clinical information has shown that this information is relevant to skin cancer detection, since it improved, on average, the balanced accuracy, precision, and recall by about 1.4%, 1.1%, and 2.4%, respectively. Regarding the data balancing approach, the weighted loss function presented the best results, but the approach based on the mutation operator of Differential Evolution is competitive. It is worth mentioning that these results are promising but still preliminary, since our collected dataset is small. The next phase consists of applying this approach to a real-world scenario to assist doctors in the screening process. We also continue to collect more data to improve our results.
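The data balancing approach mentioned above relies on the mutation operator of Differential Evolution, which combines three randomly chosen individuals as v = x_r1 + F · (x_r2 − x_r3). The following is a minimal sketch, assuming NumPy, of how this operator could be used to synthesize minority-class samples from tabular features; the function name, the scale factor F, and the choice to apply the operator directly to feature vectors are assumptions for illustration, not the exact procedure used in this work.

```python
# Sketch of minority-class oversampling with the DE/rand/1 mutation operator:
# v = x_r1 + F * (x_r2 - x_r3). All names and parameters are illustrative assumptions.
import numpy as np

def de_mutation_oversample(minority, n_new, f=0.5, seed=0):
    """Generate n_new synthetic rows from a minority-class feature matrix."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        # pick three distinct minority samples, as in DE/rand/1 mutation
        r1, r2, r3 = rng.choice(len(minority), size=3, replace=False)
        synthetic.append(minority[r1] + f * (minority[r2] - minority[r3]))
    return np.vstack(synthetic)

# Toy example: grow a 10-sample minority class to 30 samples
minority = np.random.default_rng(1).normal(size=(10, 5))
balanced_minority = np.vstack([minority, de_mutation_oversample(minority, n_new=20)])
print(balanced_minority.shape)  # (30, 5)
```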
REFERENCES
[1] R. L. Siegel, K. D. Miller, and A. Jemal, “Cancer statistics, 2019,” CA: A Cancer Journal for Clinicians, vol. 69, no. 1, pp. 7–34, 2019.
[2] WHO - World Health Organization. (2019) How common is skin cancer? [Online]. Available: https://www.who.int/uv/faq/skincancer/en/index1.html
[3] WHO - World Health Organization. (2019) Health effects of UV radiation. [Online]. Available: https://www.who.int/uv/health/uv_health2/en/index1.html
[4] WHO - World Health Organization. (2019) Who is most at risk of getting skin cancer? [Online]. Available: https://www.who.int/uv/faq/skincancer/en/index2.html
[5] D. Schadendorf, A. C. van Akkooi, C. Berking, K. G. Griewank, R. Gutzmer, A. Hauschild, A. Stang, A. Roesch, and S. Ugurel, “Melanoma,” The Lancet, vol. 392, no. 10151, pp. 971–984, 2018.
[6] N. Zhang, Y.-X. Cai, Y.-Y. Wang, Y.-T. Tian, X.-L. Wang, and B. Badami, “Skin cancer diagnosis based on optimized convolutional neural network,” Artificial Intelligence in Medicine, vol. 102, p. 101756, 2020.
[7] A. Hekler, J. S. Utikal, A. H. Enk, A. Hauschild, M. Weichenthal, R. C. Maron, C. Berking, S. Haferkamp, J. Klode, D. Schadendorf et al., “Superior skin cancer classification by the combination of human and artificial intelligence,” European Journal of Cancer, vol. 120, pp. 114–121, 2019.
[8] T. J. Brinker, A. Hekler, A. H. Enk, C. Berking, S. Haferkamp, A. Hauschild, M. Weichenthal, J. Klode, D. Schadendorf, T. Holland-Letz et al., “Deep neural networks are superior to dermatologists in melanoma image classification,” European Journal of Cancer, vol. 119, pp. 11–17, 2019.
[9] T. J. Brinker, A. Hekler, A. Hauschild, C. Berking, B. Schilling, A. H. Enk, S. Haferkamp, A. Karoglan, C. von Kalle, M. Weichenthal et al., “Comparing artificial intelligence algorithms to German dermatologists: the melanoma classification benchmark,” European Journal of Cancer, vol. 111, pp. 30–37, 2019.
[10] T. J. Brinker, A. Hekler, J. S. Utikal, N. Grabe, D. Schadendorf, J. Klode, C. Berking, T. Steeb, A. H. Enk, and C. von Kalle, “Skin cancer classification using convolutional neural networks: systematic review,” Journal of Medical Internet Research, vol. 20, no. 10, p. e11936, 2018.
[11] R. C. Maron, M. Weichenthal, J. S. Utikal, A. Hekler, C. Berking, A. Hauschild, A. H. Enk, S. Haferkamp, J. Klode, D. Schadendorf et al., “Systematic outperformance of 112 dermatologists in multiclass skin cancer image classification by convolutional neural networks,” European Journal of Cancer, vol. 119, pp. 57–65, 2019.
[12] A. Esteva, B. Kuprel, R. A. Novoa, J. Ko, S. M. Swetter, H. M. Blau, and S. Thrun, “Dermatologist-level classification of skin cancer with deep neural networks,” Nature, vol. 542, no. 7639, p. 115, 2017.
[13] M. Cristian, “SpotMole,” available at: Google Play Store (accessed Jul. 01, 2019).
[14] C. P. Koirala, “Deep Learning for Melanoma,” CJ63, available at: Google Play Store (accessed Jul. 01, 2019).
[15] GeniaLabs, “DermIA,” available at: Google Play Store (accessed Jul. 01, 2019).
[16] T. Maier et al., “Accuracy of a smartphone application using fractal image analysis of pigmented moles compared to clinical diagnosis and histological result,” J. Eur. Acad. Dermatology Venereol., vol. 29, no. 4, pp. 663–667, 2015, doi: 10.1111/jdv.12648.