Data fusion for skin detection

Jamal Ahmad Dargham, Ali Chekima, Sigeru Omatu, and Chelsia Amy Doukim (2009) Data fusion for skin detection. Artificial Life and Robotics, 13 (2). pp. 438-441. ISSN 1433-5298


Official URL: http://dx.doi.org/10.1007/s10015-008-0616-3

Abstract

Two methods of data fusion to improve the performance of skin detection were tested. The first method fuses two chrominance components from the same color space, while the second method fuses the outputs of two skin detection methods each based on a different color space. The color spaces used are the normalized red, green, blue (RGB) color space, referred to here as pixel intensity normalization, and a new method of obtaining the R, G, and B components of the normalized RGB color space called maximum intensity normalization. The multilayer perceptron (MLP) neural network and histogram thresholding were used for skin detection. It was found that fusion of two chrominance components gives a lower skin detection error than a single chrominance component regardless of the database or the color space for both skin detection methods. In addition, the fusion of the outputs of two skin detection methods further reduces the skin detection error. © International Symposium on Artificial Life and Robotics (ISAROB). 2009.
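As a rough illustration of the approach described in the abstract, the sketch below computes normalized-RGB chrominance (pixel intensity normalization), a per-pixel-maximum variant, and a simple decision-level fusion of two thresholding detectors. The max_intensity_normalize() definition, the threshold bounds, and the AND fusion rule are assumptions for illustration only; the paper's actual formulas, MLP detector, and histogram thresholds are not reproduced here.

# Minimal sketch in Python/NumPy; all numeric bounds are placeholders.
import numpy as np

def pixel_intensity_normalize(rgb):
    """Standard normalized RGB: each channel divided by R+G+B."""
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0                      # avoid division by zero on black pixels
    return rgb / s

def max_intensity_normalize(rgb):
    """Assumed variant: each channel divided by the per-pixel maximum channel."""
    rgb = rgb.astype(np.float64)
    m = rgb.max(axis=-1, keepdims=True)
    m[m == 0] = 1.0
    return rgb / m

def threshold_detector(chroma, low, high):
    """Toy stand-in for histogram thresholding: flag pixels whose chrominance
    components all fall inside a fixed range (bounds are illustrative)."""
    return np.all((chroma >= low) & (chroma <= high), axis=-1)

def fuse(mask_a, mask_b):
    """Decision-level fusion of two detector outputs; logical AND is one possible rule."""
    return mask_a & mask_b

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)
    skin_a = threshold_detector(pixel_intensity_normalize(img)[..., :2], 0.30, 0.45)
    skin_b = threshold_detector(max_intensity_normalize(img)[..., :2], 0.55, 1.00)
    print(fuse(skin_a, skin_b))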

Item Type: Article
Uncontrolled Keywords: Data fusion, Histogram thresholding, Intensity normalization, MLP neural networks, Skin detection
Subjects: TR624-835
Divisions: SCHOOL > School of Engineering and Information Technology
ID Code: 1546
Deposited By: IR Admin
Deposited On: 17 Mar 2011 10:27
Last Modified: 23 Feb 2015 15:02
