Abstract:
Emotions are a fundamental feature of non-verbal communication between humans and machines. The improvement of human-interactive media depends largely on correctly recognizing what a person feels at a given moment. Emotion recognition is a prerequisite for communication between computers and humans, for detecting a patient's present feelings, for understanding customer interest in shopping, for recognizing the intentions of physically handicapped people, and for detecting deception by a suspect. However, several complexities make the proper recognition of human emotion from the Electroencephalogram (EEG) highly challenging.
Conventional feature-based emotion recognition requires considerable effort to design the right feature or feature set for emotion classification. To reduce this manual feature-extraction effort, we designed a model based on a Convolutional Neural Network (CNN). Because EEG signals are one-dimensional, applying a CNN requires converting the 1D EEG data into meaningful 2D image data, which is itself a challenging task. To address this, we first computed Pearson's correlation coefficients from different EEG sub-bands to construct a virtual image; this virtual image was then fed into a CNN architecture to classify emotion. We defined two distinct protocols: protocol-1 classifies positive and negative emotions, and protocol-2 classifies three distinct emotions. Using the publicly available DEAP dataset, we obtained overall maximum accuracies of 76.52% on valence and 76.82% on arousal. We observed that the CNN-based method achieved state-of-the-art performance for emotion classification.
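The pipeline described above (band-pass sub-band extraction, Pearson correlation "virtual image", CNN classifier) can be illustrated with a minimal sketch. The sub-band ranges, sampling rate, and CNN layer sizes below are assumptions for illustration only and are not taken from the thesis; the actual architecture and band definitions may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Assumed sub-band ranges in Hz; the thesis may use a different set of bands.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
FS = 128  # DEAP EEG is commonly distributed downsampled to 128 Hz

def bandpass(signal, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter for one EEG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def virtual_image(eeg):
    """Build a correlation 'image' from one EEG trial.

    eeg: array of shape (channels, samples), e.g. 32 x 8064 for a DEAP trial.
    Returns a square 2D array of Pearson correlation coefficients between
    every pair of (channel, sub-band) time series.
    """
    rows = []
    for channel in eeg:                       # each EEG channel
        for low, high in BANDS.values():      # each sub-band of that channel
            rows.append(bandpass(channel, low, high))
    rows = np.vstack(rows)                    # (channels * bands, samples)
    return np.corrcoef(rows)                  # Pearson correlation matrix

def build_cnn(input_size, n_classes):
    """Hypothetical small CNN (Keras) over the virtual image."""
    from tensorflow.keras import layers, models
    model = models.Sequential([
        layers.Input(shape=(input_size, input_size, 1)),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),  # 2 for protocol-1, 3 for protocol-2
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

In this sketch, protocol-1 would correspond to `n_classes=2` (positive vs. negative) and protocol-2 to `n_classes=3`, with one virtual image generated per trial as the CNN input.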
Description:
This thesis is submitted to the Department of Electrical and Electronic Engineering, Khulna University of Engineering & Technology in partial fulfillment of the requirements for the degree of Master of Science in Electrical and Electronic Engineering, July 2019.
Cataloged from PDF Version of Thesis.
Includes bibliographical references (pages 85-91).