Journal of Information Science and Engineering, Vol. 22 No. 5, pp. 1205-1227 (September 2006)

Visual-Based Emotional Descriptor and Feedback Mechanism for Image Retrieval

Hun-Woo Yoo
Center for Cognitive Science
Yonsei University
Seodaemun-Ku, Seoul 120-749, Korea
E-mail: paulyhw@yonsei.ac.kr

A new emotion-based image retrieval method is proposed in this paper. Query emotional descriptors called the query color code and query gray code are designed on the basis of human evaluations of 13 emotion pairs ("like-dislike", "beautiful-ugly", "natural-unnatural", "dynamic-static", "warm-cold", "gay-sober", "cheerful-dismal", "unstable-stable", "light-dark", "strong-weak", "gaudy-plain", "hard-soft", "heavy-light") collected by presenting 30 random patterns with different colors, intensities, and dot sizes. For emotional image retrieval, once a query emotion is selected, the associated query color code and query gray code are obtained, and a DB color code and DB gray code that capture color, intensity, and dot size are extracted from each database image. Next, a matching process between the two color codes and between the two gray codes is performed to retrieve images that evoke the query emotion. The new relevance feedback method proposed here incorporates human information needs into the retrieval process by dynamically updating the relative weights between the query and DB color codes, as well as the relative weights within the query color code itself. To show the validity of the proposed method, experiments are performed on over 450 images.
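The abstract does not give the exact matching formula or weight-update rule, but the general scheme it describes (weighted code matching plus relevance-feedback re-weighting) can be sketched as follows. All function names, the L1 distance, and the variance-based update are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def weighted_distance(query_code, db_code, weights):
    """Weighted L1 distance between a query code and a DB code.

    Hypothetical matching measure: codes are assumed to be
    fixed-length numeric vectors with comparable components.
    """
    return float(np.sum(weights * np.abs(query_code - db_code)))

def update_weights(weights, relevant_codes, eps=1e-6):
    """Relevance-feedback sketch: components that vary little across
    images the user marked relevant get higher weight (low spread
    suggests the component matters for this emotion)."""
    spread = np.std(relevant_codes, axis=0) + eps
    new_w = weights / spread
    return new_w / new_w.sum()  # keep weights normalized

# Toy usage with random stand-ins for real color codes.
rng = np.random.default_rng(0)
query = rng.random(8)            # query color code (8 components assumed)
db = rng.random((5, 8))          # 5 database color codes

w = np.full(8, 1.0 / 8)          # start with uniform weights
ranked = sorted(range(len(db)),
                key=lambda i: weighted_distance(query, db[i], w))
# Suppose the user marks the top 2 results as relevant:
w = update_weights(w, db[ranked[:2]])
```

In a full system the same loop would run over both the color codes and the gray codes, and the updated weights would re-rank the database on the next retrieval round.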

Keywords: emotion-based image retrieval, color code, gray code, relevance feedback, weight update


Received May 27, 2004; revised December 3, 2004 & February 25, 2005; accepted March 21, 2005.
Communicated by Kuo-Chin Fan.
*This work was supported by Korea Research Foundation grant No. KRF-2004-005-H0005.