Data augmentation with occluded facial features for age and gender estimation

Bibliographic Details
Main Authors: Lu En Lin, Chang Hong Lin
Format: Article
Language: English
Published: Wiley, 2021-11-01
Series: IET Biometrics
Online Access: https://doi.org/10.1049/bme2.12030
Description
Summary: Feature occlusion, a data augmentation method that simulates real-life challenges on the main features of the human face for age and gender recognition, is proposed here. Previous methods achieved promising results on constrained data sets with strict environmental settings, but results on unconstrained data sets are still far from perfect. The proposed method adopts three simple occlusion techniques, blackout, random brightness, and blur, each of which simulates a different kind of challenge that would be encountered in real-world applications. A modified cross-entropy loss that gives less penalty to age predictions landing on the classes adjacent to the ground-truth class is also proposed. The effectiveness of the proposed method is verified by implementing the augmentation method and the modified cross-entropy loss on two different convolutional neural networks, a slightly modified AdienceNet and a slightly modified VGG16, to perform age and gender classification. The proposed augmentation system improves the age and gender classification accuracy of the slightly modified AdienceNet network by 6.62% and 6.53%, respectively, on the Adience data set, and improves the age and gender classification accuracy of the slightly modified VGG16 network by 6.20% and 6.31%, respectively, on the same data set.
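The abstract names two concrete techniques: three occlusion-style augmentations (blackout, random brightness, blur) applied to facial feature regions, and a cross-entropy variant that penalizes predictions landing on age classes adjacent to the ground truth less severely. The sketch below illustrates both ideas in plain NumPy; the region box, application probability, brightness range, blur kernel size, and adjacency weight are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch (assumed parameters, not the authors' exact implementation).
import numpy as np


def occlude_feature(image, box, rng, p=0.5):
    """Apply one randomly chosen occlusion (blackout / brightness / blur)
    to the rectangular region box = (x1, y1, x2, y2) with probability p."""
    if rng.random() > p:
        return image
    x1, y1, x2, y2 = box
    out = image.copy().astype(np.float32)
    patch = out[y1:y2, x1:x2]
    choice = rng.integers(3)
    if choice == 0:                      # blackout: zero the region
        patch[:] = 0.0
    elif choice == 1:                    # random brightness shift (assumed range)
        patch += rng.uniform(-64, 64)
    else:                                # blur: simple box filter (assumed kernel size)
        k = 5
        pad = np.pad(patch, ((k // 2, k // 2), (k // 2, k // 2), (0, 0)), mode="edge")
        for dy in range(patch.shape[0]):
            for dx in range(patch.shape[1]):
                patch[dy, dx] = pad[dy:dy + k, dx:dx + k].mean(axis=(0, 1))
    out[y1:y2, x1:x2] = np.clip(patch, 0, 255)
    return out.astype(image.dtype)


def soft_cross_entropy(logits, target, adjacent_weight=0.2):
    """Cross-entropy against a softened target that gives some mass to the
    age classes adjacent to the ground truth (weight is an assumption)."""
    n_classes = logits.shape[-1]
    soft = np.zeros(n_classes)
    soft[target] = 1.0
    if target > 0:
        soft[target - 1] = adjacent_weight
    if target < n_classes - 1:
        soft[target + 1] = adjacent_weight
    soft /= soft.sum()
    # numerically stable log-softmax
    log_probs = logits - logits.max() - np.log(np.exp(logits - logits.max()).sum())
    return -(soft * log_probs).sum()


# Usage example with a random face crop and an assumed eye-region box.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128, 3)).astype(np.uint8)
aug = occlude_feature(img, box=(30, 40, 98, 64), rng=rng)
loss = soft_cross_entropy(rng.normal(size=8), target=3)
```

Because a misprediction into a neighbouring age group is far less severe than one several groups away, the softened target reduces the gradient pressure on near-misses while still preferring the exact class.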
ISSN: 2047-4938, 2047-4946