Source: https://pubmed.ncbi.nlm.nih.gov/32347802/

Identification of the Facial Features of Patients With Cancer: A Deep Learning-Based Pilot Study

J Med Internet Res. 2020 Apr 29;22(4):e17234. doi: 10.2196/17234.

Bin Liang et al. J Med Internet Res. 2020.

Abstract

Background: Cancer has become the second leading cause of death globally. Most cancers arise from genetic mutations, which can alter metabolism and manifest as facial changes.

Objective: In this study, we aimed to identify the facial features of patients with cancer using deep learning.

Methods: Images of the faces of patients with cancer were collected to build the cancer face image data set. A face image data set of people without cancer was built by randomly selecting images from the publicly available MegaAge data set according to the sex and age distribution of the cancer face image data set. Each face image was preprocessed to obtain an upright, centered face chip, after which the background was filtered out to exclude the effects of irrelevant factors. A residual neural network (ResNet) was constructed to classify cancer and noncancer cases. Transfer learning, minibatches, few epochs, L2 regularization, and random dropout were used as training strategies to prevent overfitting. Moreover, guided gradient-weighted class activation mapping (Grad-CAM) was used to reveal the relevant facial features.
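The distribution-matched control set described above amounts to stratified random sampling: for each (sex, age band) stratum in the cancer set, draw the same number of noncancer images from MegaAge. A minimal sketch, assuming the pool is already grouped by stratum; the function and variable names are illustrative, not the authors' code.

```python
import random

def match_distribution(pool, target_counts, seed=0):
    """Draw a control set whose (sex, age-band) counts match the case set.
    pool: dict mapping stratum -> list of candidate images.
    target_counts: dict mapping stratum -> number of images required."""
    rng = random.Random(seed)  # fixed seed for a reproducible sample
    matched = []
    for stratum, n in target_counts.items():
        # sample without replacement within each stratum
        matched.extend(rng.sample(pool[stratum], n))
    return matched
```

Sampling within strata rather than from the whole pool keeps the two data sets comparable, so the classifier cannot separate them on sex or age alone.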

Results: A total of 8124 face images of patients with cancer (men: n=3851, 47.4%; women: n=4273, 52.6%) were collected from January 2018 to January 2019. The ages of the patients ranged from 1 to 70 years (median 52 years). The average faces of both male and female patients with cancer displayed more obvious facial adiposity than the average faces of people without cancer, which was supported by a landmark comparison. Training was terminated after 5 epochs. On the test data set, the area under the receiver operating characteristic curve (AUC) was 0.94, and the accuracy was 0.82. The main relevant feature of cancer cases was facial skin, while the relevant features of noncancer cases were extracted from the complementary face region.
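For reference, the reported AUC equals the probability that a randomly chosen cancer image receives a higher classifier score than a randomly chosen noncancer image. A minimal rank-based sketch (not the authors' evaluation code):

```python
def auc(pos_scores, neg_scores):
    """Probability that a random positive (cancer) score exceeds a
    random negative (noncancer) score; ties count as half a win."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(pos_scores) * len(neg_scores))
```

An AUC of 0.94 therefore means the model ranks a cancer face above a noncancer face about 94% of the time, independent of any particular decision threshold.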

Conclusions: In this study, we built a face data set of patients with cancer and constructed a deep learning model to classify the faces of people with and those without cancer. We found that facial skin and adiposity were closely related to the presence of cancer.

Keywords: cancer; cancer patient; convolutional neural network; deep learning; facial features.

©Bin Liang, Na Yang, Guosheng He, Peng Huang, Yong Yang. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.04.2020.


Conflict of interest statement

Conflicts of Interest: None declared.

Figures

Figure 1

Pre-processing workflow to obtain chip images to use as inputs in the convolutional neural network, showing the intermediate results. Image courtesy Barack Obama Presidential Library [22].
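The core of the chip-extraction step is cropping the detected face region and resizing it to a fixed input size. A minimal sketch, assuming the face bounding box has already been found by an upstream detector; nearest-neighbor resizing and the `face_chip` name are illustrative simplifications, not the authors' pipeline.

```python
import numpy as np

def face_chip(img, box, size=64):
    """Crop a face bounding box and resize it to a square chip with
    nearest-neighbor sampling.
    img: (H, W, 3) image array; box: (top, left, height, width)."""
    t, l, h, w = box
    crop = img[t:t + h, l:l + w]
    # map each output pixel to its nearest source pixel
    rows = np.arange(size) * crop.shape[0] // size
    cols = np.arange(size) * crop.shape[1] // size
    return crop[rows][:, cols]
```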

Figure 2

Network architecture. A) The basic cell of ResNet: the residual block. B) The sampled residual block. The difference is that the convolution of the first layer is performed with a 2-pixel stride. Correspondingly, 2-pixel stride sampling was performed before identity mapping. C) The ResNet architecture. The blocks shown in A) and B) are annotated with solid and dashed round-ended boxes, respectively.
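The residual block in Figure 2 can be sketched in NumPy to make the shortcut arithmetic concrete: in the sampled variant, the first convolution uses a 2-pixel stride and the identity path is subsampled by the same stride so the two tensors can be added. This is a didactic sketch (naive loops, equal input/output channels assumed), not the trained network.

```python
import numpy as np

def conv2d(x, w, stride=1):
    """Naive 'same'-padded 2D convolution.
    x: (H, W, Cin); w: (k, k, Cin, Cout)."""
    k = w.shape[0]
    p = k // 2
    xp = np.pad(x, ((p, p), (p, p), (0, 0)))
    H, W = x.shape[:2]
    oh, ow = -(-H // stride), -(-W // stride)  # ceil division
    out = np.zeros((oh, ow, w.shape[3]))
    for i in range(oh):
        for j in range(ow):
            patch = xp[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2]))
    return out

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2, stride=1):
    """Residual block: two 3x3 convolutions plus an identity shortcut.
    With stride=2 (the sampled block), the first convolution is strided
    and the shortcut is subsampled by 2 pixels before the addition."""
    y = relu(conv2d(x, w1, stride))
    y = conv2d(y, w2, 1)
    shortcut = x[::stride, ::stride]  # 2-pixel stride sampling before identity mapping
    return relu(y + shortcut)
```

The point of the shortcut is that the block learns a residual correction to its input, which eases optimization of deep stacks of such blocks.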

Figure 3

Average faces and landmark comparison. Top row: male, bottom row: female. First column: cancer, second column: noncancer, and third column: landmark comparison (from left to right). In the third column, the landmark of the average cancer face is depicted in red and that of the average noncancer face is depicted in blue.

Figure 4

Training and testing results: (A) Binary cross-entropy (BCE) and accuracy (Acc) during the training process. The loss value is normalized for clarity. (B) Receiver operating characteristic (ROC) curve of the testing dataset (AUC: area under the curve).

Figure 5

Grad-CAM analysis results for face images of people with and without cancer.
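The Grad-CAM maps in Figure 5 weight each channel of a convolutional layer by its average gradient with respect to the class score, then keep only the positive evidence. A minimal sketch of that weighting step (the guided-backpropagation half of guided Grad-CAM is omitted, and the layer activations/gradients are assumed given):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM heat map from one convolutional layer.
    activations, gradients: (H, W, C) arrays for the target class."""
    weights = gradients.mean(axis=(0, 1))               # per-channel importance
    cam = np.tensordot(activations, weights, axes=([2], [0]))
    cam = np.maximum(cam, 0.0)                          # keep features supporting the class
    if cam.max() > 0:
        cam = cam / cam.max()                           # scale to [0, 1] for overlay
    return cam
```

Upsampling the resulting map to the input resolution and overlaying it on the face image yields the heat maps shown, highlighting which facial regions drove the cancer/noncancer decision.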

References
    1. Henderson AJ, Holzleitner IJ, Talamas SN, Perrett DI. Perception of health from facial cues. Phil Trans R Soc B. 2016 May 05;371(1693):20150380. doi: 10.1098/rstb.2015.0380.
    2. Coetzee V, Re D, Perrett DI, Tiddeman BP, Xiao D. Judging the health and attractiveness of female faces: is the most attractive level of facial adiposity also considered the healthiest? Body Image. 2011 Mar;8(2):190–3. doi: 10.1016/j.bodyim.2010.12.003.
    3. de Jager S, Coetzee N, Coetzee V. Facial Adiposity, Attractiveness, and Health: A Review. Front Psychol. 2018;9:2562. doi: 10.3389/fpsyg.2018.02562.
    4. Jones B, Little A, Penton-Voak I, Tiddeman B, Burt D, Perrett D. Facial symmetry and judgements of apparent health. Evol Hum Behav. 2001 Nov;22(6):417–429. doi: 10.1016/s1090-5138(01)00083-6.
    5. Tan KW, Tiddeman B, Stephen ID. Skin texture and colour predict perceived health in Asian faces. Evol Hum Behav. 2018 May;39(3):320–335. doi: 10.1016/j.evolhumbehav.2018.02.003.
