Emotion-based Media Recommendation System


Shailendra Aote
Aayush Muley
Adesh Kotgirwar
Yash Daware
Gaurav Shukla
Jayesh Kapse

Abstract

In today’s world, digital media is a significant part of human life. People rely on it to stay motivated and to explore content that matches their mood, yet finding music that suits a particular emotional state from the vast number of options available takes considerable effort, and today’s media players do not prioritize the user’s emotional state when making recommendations. Human emotion plays a vital role in media selection: emotion expresses an individual’s behavior and state of mind, and digital media has the power to shift one’s mental state from negative to positive. The objective of this paper is to extract features from the human face, detect emotion, age, and gender, and suggest media according to the detected attributes. Emotional state, age, and gender are interpreted from facial expressions captured through a webcam. We use a CNN classifier to build a neural network model, which is trained to detect mood, age, and gender from facial expressions using OpenCV. A system that generates a media playlist based on the detected emotion, age, and gender yields better recommendations.
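The pipeline described above has two stages: detection (a CNN over webcam frames, via OpenCV) and recommendation (mapping the detected attributes to a playlist). The paper does not publish its mapping, so the sketch below illustrates only the second stage with a hypothetical lookup table; the emotion labels, age cut-off, and genre lists are assumptions for illustration, not the authors' actual system.

```python
# Hypothetical recommendation stage: map detected (emotion, age) attributes
# to a genre playlist. The detection stage (CNN + OpenCV over webcam frames)
# is assumed to have already produced these labels.

PLAYLISTS = {
    ("happy", "young"): ["upbeat pop", "dance"],
    ("sad", "young"): ["uplifting pop", "motivational"],
    ("happy", "adult"): ["classic rock", "jazz"],
    ("sad", "adult"): ["soft instrumental", "acoustic"],
}


def age_group(age: int) -> str:
    """Bucket a predicted age into a coarse group (illustrative cut-off)."""
    return "young" if age < 30 else "adult"


def recommend(emotion: str, age: int) -> list:
    """Return genre suggestions for the detected emotion and age group,
    falling back to a generic mix for unrecognized combinations."""
    return PLAYLISTS.get((emotion, age_group(age)), ["general mix"])


print(recommend("sad", 22))   # → ['uplifting pop', 'motivational']
print(recommend("angry", 45)) # → ['general mix']
```

In a full system this table would be replaced by a learned or curated mapping, and the gender attribute would be a further key, but the control flow — classify the face, then look up media for the resulting attributes — is the same.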


How to Cite
Aote, S., Muley, A., Kotgirwar, A., Daware, Y., Shukla, G., & Kapse, J. (2021). Emotion-based Media Recommendation System. International Journal of Next-Generation Computing, 12(5). https://doi.org/10.47164/ijngc.v12i5.415
