Ordinal Hyperplanes Ranker with Cost Sensitivities for Age Estimation
Kuang-Yu Chang 1,3, Chu-Song Chen 1,2,4, and Yi-Ping Hung1,3,4
1 Institute of Information Science, Academia Sinica, Taipei, Taiwan.
2 Research Center for Information Technology Innovation, Academia Sinica, Taipei, Taiwan.
3 Dept. of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan.
4 Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei, Taiwan
In this paper, we propose an ordinal hyperplane ranking algorithm called OHRank, which estimates human ages from facial images. The algorithm is designed around the relative order information among the age labels in a database. Each ordinal hyperplane separates all the facial images into two groups according to this relative order, and a cost-sensitive property is exploited to find better hyperplanes based on the classification costs. Human ages are inferred by aggregating a set of preferences from the ordinal hyperplanes together with their cost sensitivities. Our experimental results demonstrate that the proposed approach outperforms conventional multiclass-based and regression-based approaches as well as recently developed ranking-based age estimation approaches.
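The aggregation idea can be illustrated with a toy sketch: each ordinal hyperplane acts as a binary classifier answering "is the age greater than k?", and the predicted age counts the positive answers. This is a minimal, hypothetical illustration of the ranking-by-aggregation principle, not the paper's exact cost-sensitive formulation; the decision functions below are made-up one-dimensional stand-ins.

```python
def ohrank_predict(x, classifiers, min_age=0):
    """Toy sketch of ordinal-hyperplane aggregation.

    Each element of `classifiers` is a binary decision function answering
    "is the age greater than threshold k?"; the predicted age is the number
    of positive answers added to the minimum age in the database.
    """
    votes = sum(1 for f in classifiers if f(x) > 0)
    return min_age + votes

# Hypothetical example: thresholds at ages 0..4, decided by a scalar feature.
clfs = [lambda x, k=k: x - (k + 0.5) for k in range(5)]
print(ohrank_predict(3.2, clfs))  # feature 3.2 clears thresholds 0, 1, 2 -> 3
```

In the actual method, each binary decision would come from a cost-sensitive hyperplane trained on facial features, and the per-hyperplane costs weight the aggregation.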
Publication
Kuang-Yu Chang, Chu-Song Chen, and Yi-Ping Hung, "Ordinal Hyperplanes Ranker with Cost Sensitivities for Age Estimation," IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), June 2011.
[ Paper (970KB) ] [ Bibtex ]
We performed age estimation experiments on two benchmark age databases, (1) FG-NET and (2) MORPH Album 2, and used the Active Appearance Model (AAM) for feature extraction.
FG-NET contains 1,002 color or grayscale facial images of 82 individuals with large variations in pose, expression, and lighting. Each subject has more than ten images, with ages ranging from 0 to 69.
The MORPH database comes in two scales; we use the larger one, MORPH Album 2, in our experiments. It contains 55,608 facial images, with about three images per person, covering ages 16 to 77. To reduce variation between ethnic groups and avoid cross-race influence, we selected 5,492 images of people of Caucasian descent.
Our previous work (ICPR, 2010) uses the same datasets.
Table: error rates (%) by age range on FG-NET and MORPH (table data not recovered).
Supplementary Files - Landmark of Morph Album 2
- MORPH Album 2 point files used in the experiment: [ Download (5.2MB) ]
Information on the other databases used in our experiments:
- FG-NET aging database and point files: [ refer to Link (currently not available) ]
- MORPH Album 2 database: [ Link ] (Provided by Ricanek Jr. et al. 2006)
MORPH Album 2 is provided by Ricanek Jr. et al. (2006). The database contains 55,608 face images of several races but does not include point files. We provide point files for the 5,492 selected images of Caucasian descent in MORPH Album 2; they can be used for AAM feature extraction or other face-alignment approaches.