CNN-BiLSTM based Facial Emotion Recognition

Authors

  • Alina Lamichhane, Software Engineer, Auxfin Development Nepal, Kupondole, Lalitpur, Nepal
  • Gopal Karn, Kantipur Engineering College, Dhapakhel, Lalitpur, Nepal

DOI:

https://doi.org/10.3126/injet.v2i1.72579

Keywords:

Hybrid Neural Network, CNN-BiLSTM, Image Processing, Facial Emotion Recognition, FER2013 Dataset

Abstract

Human emotions play an important role in communication, as they let people express themselves without words. Emotion recognition draws on many sources of information, including facial expression, body language, and the tone and pitch of the voice. Among these, facial expression also plays an important role in interaction between humans and machines, and a significant amount of research has been done on facial emotion recognition. Traditionally, features were extracted from images manually, but over the past years various Machine Learning (ML) algorithms and Neural Networks (NN) have been used for facial emotion recognition. In this paper, a hybrid CNN-BiLSTM neural network is used to extract features from facial images and detect emotion. The publicly available FER2013 dataset is used for this purpose. The model is trained on grayscale images to classify seven emotions: happy, sad, disgust, angry, fearful, surprise, and neutral. The CNN component extracts spatial features, while the BiLSTM layer processes these features to capture temporal dependencies. The model achieves an accuracy of 79.4% when classifying all seven emotions. However, when limited to detecting three emotions (happy, sad, neutral), the accuracy improves to 89.0%, demonstrating the model's potential for focused emotion recognition tasks.
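To illustrate the kind of CNN-BiLSTM pipeline the abstract describes, the sketch below builds a small Keras model for 48×48 grayscale FER2013 inputs with seven output classes. It is a minimal, hypothetical example, not the authors' exact architecture: the layer sizes, the row-wise reshaping of the CNN feature map into a sequence for the BiLSTM, and the training settings are all assumptions made for illustration.

```python
# Minimal, illustrative CNN-BiLSTM sketch (not the authors' exact architecture).
# Assumes 48x48 grayscale FER2013 images and 7 emotion classes; layer sizes are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_bilstm(input_shape=(48, 48, 1), num_classes=7):
    inputs = layers.Input(shape=input_shape)

    # CNN block: extracts spatial features from the face image.
    x = layers.Conv2D(32, (3, 3), padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D((2, 2))(x)                     # -> 24x24x32
    x = layers.Conv2D(64, (3, 3), padding="same", activation="relu")(x)
    x = layers.MaxPooling2D((2, 2))(x)                     # -> 12x12x64

    # Treat each row of the feature map as one timestep so the BiLSTM
    # can scan the spatial features as a sequence.
    x = layers.Reshape((12, 12 * 64))(x)

    # BiLSTM block: captures dependencies across the feature sequence.
    x = layers.Bidirectional(layers.LSTM(128))(x)

    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_bilstm()
model.summary()
```

In this sketch, the CNN downsamples the input to a 12×12×64 feature map, which is flattened row by row into a 12-step sequence before the bidirectional LSTM; reducing `num_classes` to 3 would correspond to the happy/sad/neutral setting reported in the abstract.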

Published

2024-12-16

How to Cite

Lamichhane, A., & Karn, G. (2024). CNN-BiLSTM based Facial Emotion Recognition. International Journal on Engineering Technology, 2(1), 227–236. https://doi.org/10.3126/injet.v2i1.72579

Issue

Section

Articles