Ordinal Hyperplanes Ranker with Cost Sensitivities for Age Estimation

Kuang-Yu Chang 1,3, Chu-Song Chen 1,2,4, and Yi-Ping Hung 1,3,4

1 Institute of Information Science, Academia Sinica, Taipei, Taiwan.
2 Research Center for Information Technology Innovation, Academia Sinica, Taipei, Taiwan.
3 Dept. of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan.
4 Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei, Taiwan.
{kuangyu, song}@iis.sinica.edu.tw


Abstract

In this paper, we propose an ordinal hyperplane ranking algorithm called OHRank, which estimates human age from facial images. The design of the algorithm is based on the relative order information among the age labels in a database. Each ordinal hyperplane separates all the facial images into two groups according to the relative order, and a cost-sensitive property is exploited to find better hyperplanes based on the classification costs. Human ages are inferred by aggregating a set of preferences from the ordinal hyperplanes with their cost sensitivities. Our experimental results demonstrate that the proposed approach outperforms conventional multiclass-based and regression-based approaches as well as recently developed ranking-based age estimation approaches.
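For intuition, here is a minimal sketch of the ordinal-hyperplane idea: one binary classifier per age threshold k answers "is this face older than k?", and the predicted age counts how many thresholds a test face passes. The sketch assumes linear SVMs (scikit-learn's LinearSVC) as the binary learners and a simple |age - k| sample weighting; both are illustrative stand-ins, not the paper's exact cost-sensitive formulation or its released code.

    import numpy as np
    from sklearn.svm import LinearSVC

    def train_ohrank(X, ages):
        """Train one binary classifier per ordinal age threshold k.

        Assumes every threshold splits the data into two non-empty groups,
        and that misclassification cost grows with the distance |age - k|
        (a hypothetical choice standing in for the paper's cost design).
        """
        classifiers = {}
        for k in range(int(ages.min()), int(ages.max())):
            y = (ages > k).astype(int)             # "older than k?"
            w = np.maximum(np.abs(ages - k), 1.0)  # cost-sensitive weights
            clf = LinearSVC(C=1.0)
            clf.fit(X, y, sample_weight=w)
            classifiers[k] = clf
        return classifiers

    def predict_age(classifiers, x):
        """Aggregate the binary preferences: count thresholds passed."""
        x = np.atleast_2d(x)
        votes = sum(int(clf.decision_function(x)[0] > 0)
                    for clf in classifiers.values())
        return min(classifiers) + votes

Because the per-threshold decisions are learned independently, they may disagree on a test face; counting positive votes, as predict_age does above, is one simple way to aggregate such preferences into a single age.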


Publication

Kuang-Yu Chang, Chu-Song Chen, and Yi-Ping Hung, "Ordinal Hyperplanes Ranker with Cost Sensitivities for Age Estimation," IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), June 2011.
[ Paper (970 KB) ] [ BibTeX ]


Datasets


We performed age estimation experiments on two benchmark age databases, (1) FG-NET and (2) MORPH Album 2, using the Active Appearance Model (AAM) for feature extraction.

FG-NET contains 1,002 color or grayscale facial images of 82 individuals, with large variations in pose, expression, and lighting. Each subject has more than ten images, and the ages range from 0 to 69.

The MORPH database comes in two albums; we use the larger one, Album 2, in our experiments. MORPH Album 2 contains 55,608 facial images, with about three images per person, of subjects aged 16 to 77. To reduce the variation between ethnic groups, we selected 5,492 images of Caucasian subjects, so that cross-race influence can be avoided. (A sketch of this kind of subset selection follows below.)
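This kind of subset selection is straightforward given per-image metadata. The snippet below is purely illustrative: the file name and the column/label conventions ("race" with a "W" code) are assumptions, not the actual MORPH distribution format.

    import pandas as pd

    # Hypothetical metadata table: one row per image with demographic labels.
    meta = pd.read_csv("morph_album2_metadata.csv")

    # Keep a single ethnic group so cross-race variation does not confound
    # the age estimation experiments.
    caucasian = meta[meta["race"] == "W"]
    print(f"selected {len(caucasian)} images")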

Our previous work (ICPR 2010) uses the same datasets.

Table 1. Age-range distribution of face images in the FG-NET and MORPH Album 2 databases.

Age Range    FG-NET (%)    MORPH (%)
0-9               37.03         0.00
10-19             33.83         8.94
20-29             14.37        26.04
30-39              7.88        32.16
40-49              4.59        24.58
50-59              1.50         7.37
60-69              0.80         0.82
70-77              0.00         0.09


Supplementary Files - Landmarks of MORPH Album 2

Copyright © 2011 Kuang-Yu Chang at Institute of Information Science, Academia Sinica.