The Gesture Emotion Detection research project, conducted in collaboration with Kasikorn Business-Technology Group (KBTG), aims to give machines the ability to detect, identify, and track people automatically and then interpret human emotional behavior in real time. Most existing systems that analyze nonverbal human behavior focus only on the human face. However, face-based methods have limitations in real-world service scenarios. First, the resolution of image sequences obtained from cameras installed in bank branches is not always sufficient for facial emotion recognition. In addition, current camera systems capture faces inconsistently because of their ceiling-to-floor shooting angle. Face detection also fails for unfavorable customer body positions, such as when people turn their backs to the camera or obstruct its view.
To address these problems, we make use of human gesture information, which is free from the face detection restrictions stated above. Unlike existing gesture emotion detection techniques, our method does not require special gesture-capturing hardware such as a Microsoft Kinect or depth camera. Because no additional hardware is needed, promising detection results can be achieved at minimal cost to organizations such as banks or their subsidiary retailers.
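To make the idea concrete, the sketch below shows how coarse posture-based emotion cues could be derived from 2D body keypoints produced by any off-the-shelf RGB pose estimator, with no depth sensor involved. This is a minimal illustration, not the project's actual model: the joint names, features (head drop, arm openness), thresholds, and labels are all hypothetical.

```python
# Hypothetical sketch: coarse emotion cues from 2D body keypoints.
# Assumes keypoints (x, y in image pixels, y growing downward) come from
# any off-the-shelf RGB pose estimator; no Kinect or depth camera needed.

def classify_posture(kp):
    """Map 2D keypoints to a coarse posture-based emotion cue.

    kp: dict of joint name -> (x, y). Joint names, thresholds, and
    labels here are illustrative assumptions, not the project's model.
    """
    shoulder_y = (kp["l_shoulder"][1] + kp["r_shoulder"][1]) / 2
    hip_y = (kp["l_hip"][1] + kp["r_hip"][1]) / 2
    torso = hip_y - shoulder_y  # torso height in pixels, used as a scale

    # Head drop relative to torso height: head near shoulder level
    # (small negative value) suggests a slumped posture.
    head_drop = (kp["head"][1] - shoulder_y) / torso

    # Arm openness: wrist spread relative to shoulder width.
    shoulder_w = abs(kp["r_shoulder"][0] - kp["l_shoulder"][0])
    wrist_spread = abs(kp["r_wrist"][0] - kp["l_wrist"][0])
    openness = wrist_spread / shoulder_w

    if head_drop > -0.2:   # head barely above shoulders: slumped
        return "low-energy/negative"
    if openness > 1.5:     # arms spread wide: expansive posture
        return "high-energy/positive"
    return "neutral"
```

A full pipeline would feed such per-frame features (or raw keypoint sequences) into a learned classifier rather than hand-set thresholds, but the example shows why plain RGB footage suffices: all the gesture information used here is two-dimensional.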