

Journal of Information Science and Engineering, Vol. 27 No. 3, pp. 1045-1057 (May 2011)

Perceptual LOD under Low Resolution Conditions*

Department of Computer Science and Information Engineering
National Chi Nan University
Nantou, 545 Taiwan

In this paper, we propose a Perceptual LOD (Level of Detail) system based on a skeleton structure, integrating concepts from the human perceptual system and 3D skeletons into the weighting mechanism of the error metrics used for mesh simplification. The human perceptual system refers to the way a human being identifies a graphic object; it comprises "template-matching theories", "prototype-matching theories", and "feature-discrimination theories". From a psychological point of view, the 3D skeleton of an object can be regarded as an extremely simplified description of its original shape: it provides important visual cues that keep a model recognizable even when it is drastically simplified. The 3D skeleton is first extracted from the model by a DCG (Domain Connected Graph) algorithm. The vertices of a given model are then hierarchically clustered according to their alignment with the skeleton structure, and fuzzy sets are adopted to identify the most likely prototype from a prototype database. During the model-simplification stage, the weighting value of each vertex is adjusted depending not only on geometric and topological information but also on these perception-oriented considerations. Preliminary experimental results show that our method is effective at preserving shape under low-resolution LOD.
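The abstract does not spell out how the perceptual weights enter the error metric, but the QEM (Quadric Error Metric) framework it builds on is standard. The sketch below, a minimal assumption-laden illustration rather than the authors' implementation, computes the classical quadric cost of an edge collapse and scales it by a per-vertex weight, which is where a perception-derived factor (skeleton alignment, prototype salience) would plug in.

```python
import numpy as np

def face_quadric(p0, p1, p2):
    # Fundamental error quadric K = plane * plane^T for the face's
    # supporting plane [a, b, c, d], with unit normal (a, b, c).
    n = np.cross(p1 - p0, p2 - p0)
    norm = np.linalg.norm(n)
    if norm == 0.0:
        return np.zeros((4, 4))      # degenerate face contributes nothing
    n = n / norm
    plane = np.append(n, -np.dot(n, p0))
    return np.outer(plane, plane)

def vertex_quadrics(verts, faces):
    # Each vertex accumulates the quadrics of its incident faces.
    Q = [np.zeros((4, 4)) for _ in verts]
    for i, j, k in faces:
        K = face_quadric(verts[i], verts[j], verts[k])
        for idx in (i, j, k):
            Q[idx] += K
    return Q

def collapse_cost(Q, verts, i, j, weight=1.0):
    # Cost of contracting edge (i, j) to its midpoint: v^T (Qi + Qj) v,
    # scaled by a perceptual weight (hypothetical hook; the paper derives
    # its weights from skeleton and prototype information).
    v = np.append((verts[i] + verts[j]) / 2.0, 1.0)
    return weight * float(v @ (Q[i] + Q[j]) @ v)
```

Collapsing an edge inside a flat region yields cost zero, so the simplifier removes it first; raising the weight on perceptually important vertices makes their edges correspondingly more expensive to collapse.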

Keywords: computer graphics, level of detail, 3D skeleton, human perceptual system, QEM

Full Text: Retrieve PDF document (201105_15.pdf)

Received August 31, 2009; revised January 6, 2010; accepted February 9, 2010.
Communicated by Tyng-Luh Liu.
* This project was partially supported by the National Science Council of Taiwan, R.O.C. under grants No. NSC 95-2221-E-260-035 and NSC 98-2221-E-260-023.