Gesture recognition using combination of acceleration sensor and images for casual communication between robots and humans

Y. Yamazaki*, H. A. Vu, P. Q. Le, Z. Liu, C. Fatichah, M. Dai, H. Oikawa, D. Masano, O. Thet, Y. Tang, N. Nagashima, M. L. Tangel, F. Dong, K. Hirota

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Citations (Scopus)

Abstract

A fuzzy-logic-based multi-modal gesture recognition method is proposed, in which both camera images and hand motion data (captured by a 3D acceleration sensor worn on the human wrist) are used to convey human emotions to robots in real time. To demonstrate its validity, the method is applied to part of a home party scenario involving five eye robots and four human participants, where 8 types of human emotional gestures are successfully recognized by the robots. The proposed method aims to realize casual communication between robots and humans in the mascot robot system developed by the authors' group.
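The record describes the method only at a high level, so the sketch below is purely illustrative: the input features (accel_energy, image_motion), membership functions, rule base, and gesture labels are assumptions for demonstration, not the authors' actual design. It shows the general pattern of fusing a wrist-acceleration cue and an image-motion cue with Mamdani-style max-min fuzzy rules to score candidate gestures.

```python
# Hypothetical sketch of fuzzy-logic fusion of accelerometer and image cues.
# None of the features, membership shapes, or rules below come from the paper.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuse_scores(accel_energy, image_motion):
    """Fuse a wrist-acceleration energy cue and an image-motion cue
    (both normalized to [0, 1]) into per-gesture confidences."""
    # Fuzzify each input into linguistic terms.
    accel = {
        "low":  tri(accel_energy, -0.5, 0.0, 0.5),
        "high": tri(accel_energy,  0.5, 1.0, 1.5),
    }
    image = {
        "small": tri(image_motion, -0.5, 0.0, 0.5),
        "large": tri(image_motion,  0.5, 1.0, 1.5),
    }
    # Example rule base: antecedents combined with min (Mamdani inference);
    # the gesture with the strongest firing rule is taken as the output.
    return {
        "wave":  min(accel["high"], image["large"]),
        "nod":   min(accel["low"],  image["large"]),
        "point": min(accel["high"], image["small"]),
        "idle":  min(accel["low"],  image["small"]),
    }

if __name__ == "__main__":
    scores = fuse_scores(accel_energy=0.8, image_motion=0.3)
    print(max(scores, key=scores.get), scores)
```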

Original language: English
Title of host publication: 2010 IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 IEEE Congress on Evolutionary Computation, CEC 2010
DOIs
Publication status: Published - 2010
Externally published: Yes
Event: 2010 6th IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 IEEE Congress on Evolutionary Computation, CEC 2010 - Barcelona, Spain
Duration: 18 Jul 2010 - 23 Jul 2010

Publication series

Name: 2010 IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 IEEE Congress on Evolutionary Computation, CEC 2010

Conference

Conference: 2010 6th IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 IEEE Congress on Evolutionary Computation, CEC 2010
Country/Territory: Spain
City: Barcelona
Period: 18/07/10 - 23/07/10
