Human Emotion Classification based on EEG Signals Using Recurrent Neural Network And KNN


Shashank Joshi
Falak Joshi

Abstract

Emotion plays a crucial role in human interaction. Feelings can be conveyed through attributes such as words, voice intonation, facial expressions, and kinesics. However, brain-computer interface (BCI) devices have not yet reached the level required for emotion interpretation. With the rapid development of machine learning algorithms, dry-electrode techniques, and real-world BCI applications for healthy individuals, emotion classification from EEG data has recently received considerable attention. Electroencephalogram (EEG) signals are a critical resource for these systems. The primary benefit of employing EEG signals is that they reflect genuine emotion and can be readily processed by computer systems. In this work, EEG signals associated with positive, neutral, and negative emotions were identified using channel-selection preprocessing; until now, however, researchers have had only a limited grasp of the relationship between EEG activity and these emotional states. To classify the EEG signals, we used the discrete wavelet transform together with machine learning techniques, namely the recurrent neural network (RNN) and the k-nearest neighbor (kNN) algorithm. The classifier methods were first applied for channel selection, and final feature vectors were then created by combining the features of the EEG segments from the selected channels. These feature vectors, labeled with positive, neutral, and negative emotions, were classified independently with the RNN and kNN algorithms, and the classification performance of the two techniques was computed and compared. The average overall accuracies were 94.844% with RNN and 93.438% with kNN.
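The pipeline described in the abstract, discrete wavelet transform features fed to a kNN classifier, can be sketched in a minimal form. This is an illustrative reconstruction, not the authors' implementation: the Haar wavelet, the three decomposition levels, the mean-absolute-value feature, and the `knn_predict` helper are all assumptions made for the example.

```python
import numpy as np

def haar_dwt_features(signal, levels=3):
    """Multi-level Haar DWT: return the mean absolute value of each
    detail band plus the final approximation band, a common compact
    feature set for EEG segments (levels=3 is an assumed setting)."""
    features = []
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        if len(a) % 2:                     # pad to an even length
            a = np.append(a, a[-1])
        approx = (a[0::2] + a[1::2]) / np.sqrt(2.0)
        detail = (a[0::2] - a[1::2]) / np.sqrt(2.0)
        features.append(np.mean(np.abs(detail)))
        a = approx                         # decompose the approximation further
    features.append(np.mean(np.abs(a)))
    return np.array(features)

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training vectors (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

In use, each selected EEG channel segment would be passed through `haar_dwt_features`, the per-channel vectors concatenated into one final feature vector, and `knn_predict` applied against the labeled training set of positive, neutral, and negative examples. The RNN branch of the paper would consume the same feature vectors through a recurrent model instead.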


How to Cite
Shashank Joshi, & Falak Joshi. (2023). Human Emotion Classification based on EEG Signals Using Recurrent Neural Network And KNN. International Journal of Next-Generation Computing, 14(2). https://doi.org/10.47164/ijngc.v14i2.691
