Research Article Open Access

Advanced Facial Emotion Recognition Using DCNN-ELM: A Comprehensive Approach to Preprocessing, Feature Extraction and Performance Evaluation

Boopalan K.1, Satyajee Srivastava2, K. Kavitha3, D. Usha Rani4, K. Jayaram Kumar5, M. V. Jagannatha6 and V. Bhoopathy7
  • 1 School of Computing, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Chennai, India
  • 2 Department of CSE, M.M. Engineering College, Maharishi Markandeshwar (Deemed to Be University), Mullana, Ambala, Haryana, India
  • 3 Department of CSE-AI&ML, GMR Institute of Technology, Rajam, Andhra Pradesh, India
  • 4 Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Vaddeswaram, India
  • 5 Department of Electronics and Communication Engineering, Aditya University, Surampalem, India
  • 6 Department of AIML, M S Engineering College, Navarathna Agrahara, Sadahalli, Bengaluru, India
  • 7 Department of Computer Science and Engineering, Sree Rama Engineering College, Tirupathi, India

Abstract

As a subfield of affective computing, Facial Emotion Recognition (FER) enables computers to infer a person's emotional state from facial expressions. Because facial expressions are estimated to convey about 55% of the emotional content of face-to-face communication, FER is crucial for connecting humans and computers, and advances in this area also improve how computer and robotic systems interact with or assist people. Deep learning underpins much of the most advanced research in this field. Recent FER work commonly adopts Ekman's set of basic emotions, typically extended with a neutral class to give seven categories: Anger, Disgust, Fear, Happiness, Sadness, Surprise, and Neutral; related models such as Robert Plutchik's wheel of emotions arrange emotions so that each primary emotion has a polar opposite. The suggested method has four steps: preprocessing, feature extraction, model performance evaluation, and finalization. The preprocessing step applies a kernel filter, and Stepwise Linear Discriminant Analysis (SWLDA) is used for feature extraction. This study presents a hybrid approach combining a Deep Convolutional Neural Network (DCNN) with an Extreme Learning Machine (ELM) to improve emotion recognition accuracy, particularly for human-computer interaction in educational settings. The proposed model outperforms traditional DCNN and standalone ELM approaches and supports real-time emotion detection in online learning environments. Its effectiveness is validated on publicly available datasets, establishing a new benchmark for FER, particularly in educational settings.
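To make the hybrid architecture concrete, the sketch below shows a minimal Extreme Learning Machine classifier operating on feature vectors assumed to have been extracted by a DCNN backbone. The class name, hidden-layer size, regularization constant, and synthetic inputs are illustrative assumptions for exposition, not the configuration reported in this paper.

```python
# Minimal ELM classifier sketch (NumPy only). X is assumed to hold DCNN
# feature vectors already extracted from face images; the feature dimension,
# hidden size, and class count are illustrative placeholders.
import numpy as np


class ELMClassifier:
    def __init__(self, n_hidden=256, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random projection followed by a sigmoid activation
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y, n_classes):
        n_features = X.shape[1]
        # Hidden-layer weights are drawn once at random and never trained
        self.W = self.rng.standard_normal((n_features, self.n_hidden)) * 0.1
        self.b = self.rng.standard_normal(self.n_hidden) * 0.1
        H = self._hidden(X)
        # One-hot targets for the emotion classes (e.g., anger ... neutral)
        T = np.eye(n_classes)[y]
        # Output weights via regularized least squares (pseudo-inverse style solve)
        reg = 1e-3 * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(H.T @ H + reg, H.T @ T)
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


if __name__ == "__main__":
    # Illustrative usage with synthetic stand-ins for DCNN feature vectors
    rng = np.random.default_rng(1)
    X_train = rng.standard_normal((200, 128))
    y_train = rng.integers(0, 7, size=200)  # 7 emotion labels
    clf = ELMClassifier().fit(X_train, y_train, n_classes=7)
    print(clf.predict(X_train[:5]))
```

Because only the output weights are solved in closed form while the hidden layer stays random, the ELM stage trains far faster than back-propagating through a full network, which is the motivation for pairing it with a DCNN feature extractor.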

Journal of Computer Science
Volume 21 No. 1, 2025, 13-24

DOI: https://doi.org/10.3844/jcssp.2025.13.24

Submitted On: 31 May 2024 | Published On: 30 November 2024

How to Cite: K., B., Srivastava, S., Kavitha, K., Rani, D. U., Kumar, K. J., Jagannatha, M. V. & Bhoopathy, V. (2025). Advanced Facial Emotion Recognition Using DCNN-ELM: A Comprehensive Approach to Preprocessing, Feature Extraction and Performance Evaluation. Journal of Computer Science, 21(1), 13-24. https://doi.org/10.3844/jcssp.2025.13.24


Keywords

  • Facial Emotion Recognition (FER)
  • Linear Discriminant Analysis (LDA)
  • Extreme Learning Machine (ELM)
  • Deep Convolutional Neural Network (DCNN)