
Cloud basis function neural network: A modified RBF network architecture for holistic facial expression recognition. (English) Zbl 1131.68080
Summary: The paper presents novel modifications to radial basis functions (RBFs) and a neural-network-based classifier for holistic recognition of the six universal facial expressions from static images. The new basis functions, called Cloud Basis Functions (CBFs), use a different feature weighting, derived to emphasize features relevant to class discrimination. Furthermore, these basis functions are designed to have multiple boundary segments rather than the single boundary of RBFs. These enhancements to the basis functions, along with a suitable training algorithm, allow the neural network to better learn the specific properties of the problem domain. The proposed classifiers have demonstrated superior performance compared to conventional RBF neural networks as well as several other holistic techniques used in conjunction with RBF neural networks. The CBF neural-network-based classifier yielded an accuracy of 96.1%, compared with 86.6%, the best accuracy obtained among all other conventional RBF neural-network-based classification schemes tested on the same database.
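To illustrate the general architecture being modified, the following is a minimal sketch of an RBF-style classifier with per-basis feature weighting, in the spirit of emphasizing discriminative features. It is not the paper's Cloud Basis Function construction: the Gaussian form, the per-feature weight vectors, the least-squares output training, and all function names below are generic assumptions introduced only for illustration.

```python
# Minimal sketch (assumed, illustrative only): an RBF-style classifier whose
# basis functions carry per-feature weights. The paper's CBFs additionally use
# multiple boundary segments and a dedicated training algorithm, neither of
# which is reproduced here.
import numpy as np

def weighted_gaussian(x, center, feature_weights, width):
    """Gaussian basis with a per-feature weight vector.

    A conventional RBF corresponds to feature_weights = 1 in every dimension;
    emphasizing class-discriminative features corresponds to larger weights there.
    """
    d2 = np.sum(feature_weights * (x - center) ** 2)
    return np.exp(-d2 / (2.0 * width ** 2))

def design_matrix(X, centers, feature_weights, widths):
    """Evaluate every basis function on every input sample."""
    return np.array([[weighted_gaussian(x, c, w, s)
                      for c, w, s in zip(centers, feature_weights, widths)]
                     for x in X])

def fit_output_weights(Phi, Y):
    """Least-squares fit of the linear output layer (one column per class)."""
    return np.linalg.lstsq(Phi, Y, rcond=None)[0]

def predict(X, centers, feature_weights, widths, W):
    """Return the class index with the largest output activation."""
    Phi = design_matrix(X, centers, feature_weights, widths)
    return np.argmax(Phi @ W, axis=1)
```

Setting every feature weight to one recovers a conventional RBF network; the contribution summarized above lies in how the feature weighting and the multiple boundary segments are derived, which this sketch does not attempt to reproduce.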

MSC:
68T05 Learning and adaptive systems in artificial intelligence
68T10 Pattern recognition, speech recognition
Software:
Cohn-Kanade