Implementation and Analysis of Sentimental Analysis on Facial Expression Using HAAR Cascade Methods

Facial Expression Recognition System Using Haar Cascades Classifier

International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2022

Facial expressions convey non-verbal cues, which play a crucial role in social relations. Facial expression recognition is a significant yet challenging task, as it can be used to identify the emotions and mental state of an individual. In this system, image processing and machine learning are used to compare a captured image with a trained dataset and then display the emotional state of the image. To design a robust facial feature recognition system, Local Binary Patterns (LBP) are used. The performance of the proposed system is then assessed on a trained database with the help of neural networks. The results show competitive classification accuracy across various emotions.
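
As a rough, hedged sketch of the LBP-plus-neural-network pipeline this abstract describes (not the authors' actual implementation), a grayscale face crop can be summarized as a uniform-LBP histogram and passed to a small classifier; the library choices, neighborhood size and hidden-layer width below are illustrative assumptions.

# Minimal sketch: uniform LBP histograms as features for a small neural network.
# Assumes equally sized grayscale face crops and integer emotion labels.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.neural_network import MLPClassifier

P, R = 8, 1  # LBP sampling points and radius (illustrative values)

def lbp_histogram(gray_face):
    """Summarize a face crop as a normalized histogram of uniform LBP codes."""
    codes = local_binary_pattern(gray_face, P, R, method="uniform")
    hist, _ = np.histogram(codes.ravel(), bins=np.arange(P + 3), density=True)
    return hist  # P + 2 bins for the 'uniform' code mapping

def train_emotion_classifier(face_crops, labels):
    """face_crops: iterable of 2-D grayscale arrays; labels: emotion indices."""
    features = np.array([lbp_histogram(f) for f in face_crops])
    model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
    return model.fit(features, labels)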

Real Time Facial Emotion Recognition based on Image Processing and Machine Learning

Sushmit Sengupta, Arnab Pal, Sudipta Ghosh, Debashish Kundu

Behaviors, actions, poses, facial expressions and speech are considered channels that convey human emotions. Extensive research has been carried out to explore the relationships between these channels and emotions. This paper proposes a prototype system which automatically recognizes the emotion represented on a face. A neural-network-based solution combined with image processing is used to classify the universal emotions: happiness, sadness, anger, disgust, surprise and fear. Colored frontal face images are given as input to the prototype system. After the face is detected, an image-processing-based feature point extraction method is used to extract a set of selected feature points. Finally, a set of values obtained after processing those extracted feature points is given as input to the neural network to recognize the emotion contained.
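
A hedged sketch of the geometric stage of such a pipeline, assuming the feature points (e.g. eye corners, brow points, mouth corners) have already been extracted by the image-processing step, converts them into scale-invariant pairwise distances and classifies them with a small neural network; the feature design and network size here are illustrative, not the paper's.

# Illustrative only: feature points -> normalized pairwise distances -> classifier.
import itertools
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["happiness", "sadness", "anger", "disgust", "surprise", "fear"]

def distance_features(points):
    """points: (N, 2) array of feature-point coordinates for one face."""
    points = np.asarray(points, dtype=float)
    face_scale = np.linalg.norm(points.max(axis=0) - points.min(axis=0))
    dists = [np.linalg.norm(points[i] - points[j])
             for i, j in itertools.combinations(range(len(points)), 2)]
    return np.array(dists) / face_scale  # scale-invariant feature vector

# Training on labelled faces (labels are indices into EMOTIONS):
# X = np.array([distance_features(p) for p in point_sets_per_face])
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000).fit(X, labels)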

Facial Emotion Detection for Educational Purpose Using Image Processing Technique

UNIVERSITI TEKNOLOGI MARA, 2020

Human emotion usually changes based on the surroundings and the actions taken. Facial expression is a form of non-verbal communication made by students in the classroom that can help the lecturer receive feedback on how students react during a lecture session. Different emotions show on students' faces depending on whether or not they understand the given lecture. At the end of the semester, each student needs to give feedback about their understanding of certain subjects, but the feedback given may not reflect their actual feelings towards the subject. Thus, detecting the facial emotions of students helps to recognize their actual emotions. One of the most important components of the modern tertiary education system in academic institutions is students' emotional states towards their lecturer. The task of detecting faces is not easy because the orientation and scale of a face may differ from image to image, owing to movement while capturing images with the camera; noise and occlusion are also problems. The Face Emotion Detection Application goes through image processing steps to identify the emotion type from a static frontal face image. The steps are image acquisition; grayscale conversion and contrast stretching for image pre-processing; the Haar cascade, also known as the Viola-Jones technique, for face detection; a face model technique for eye and mouth localization; skin-color segmentation for image segmentation; and the Grey-Level Co-Occurrence Matrix (GLCM) for feature extraction. Emotion type classification uses SVM regression. The accuracy percentage of emotion classification is calculated, and the results show that SVM regression achieves a high accuracy of 99.16%. For future work, a real-time application will be developed to identify facial emotion from live input instead of static images.
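
A compact sketch of this pipeline, assuming OpenCV plus scikit-image/scikit-learn and omitting the skin-color segmentation and eye/mouth localization stages for brevity, might look like the following; scikit-learn's SVC stands in here for the SVM-regression classifier the paper reports, and the crop size and GLCM settings are assumptions.

# Sketch: grayscale + contrast stretching, Haar cascade (Viola-Jones) face
# detection, GLCM texture features, then an SVM classifier.
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def glcm_features(face_gray):
    """Texture descriptors from a grey-level co-occurrence matrix."""
    glcm = graycomatrix(face_gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def extract_face_features(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)  # contrast stretching
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    face = cv2.resize(gray[y:y + h, x:x + w], (128, 128))
    return glcm_features(face)

# clf = SVC(kernel="rbf").fit(train_features, train_labels)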

Analysis on techniques used to recognize and identifying the Human emotions

International Journal of Electrical and Computer Engineering (IJECE), 2020

Facial expression is a major channel of non-verbal communication in day-to-day life. Statistical analysis shows that only 7 percent of a message in communication is conveyed through verbal communication, while 55 percent is transmitted by facial expression. Emotional expression has been a research subject of physiology since Darwin's work on emotional expression in the 19th century. According to psychological theory, human emotion is classified mainly into six emotions: happiness, fear, anger, surprise, disgust, and sadness. Facial expressions, together with the nature of speech, play a foremost role in expressing these emotions. Thereafter, researchers developed a system based on the anatomy of the face, named the Facial Action Coding System (FACS), in the 1970s. Ever since the development of FACS there has been rapid progress in the domain of emotion recognition. This work is intended to give a thorough comparative analysis of the various techniques and methods that have been applied to recognize and identify human emotions. The results of this analysis will help to identify proper and suitable techniques, algorithms and methodologies for future research directions. In this paper, an extensive analysis of the various recognition techniques used to address the complexity of recognizing facial expressions is presented.

A Study of Method in Facial Emotional Recognition

Facial expressions play an important role in social communication and are widely used in the behavioral interpretation of emotions. Automatic facial expression recognition is one of the most provocative and stimulating problems in computer vision due to its potential applications such as Human Computer Interaction (HCI), behavioral science, video games, etc. Two popular methods used most often in the literature for automatic FER systems are based on geometry and appearance. Even though there is a lot of research using static images, research is still ongoing into new methods that would be easier to compute and would use less memory than previous methods. This paper presents a quick comparison of facial expression recognition approaches and a comparative study of the feature extraction techniques used by different methods.

A Survey on: Emotion Recognition with respect to Database and Various Recognition Techniques

International Journal of Computer Applications, 2012

Recognizing and extracting various emotions, and then validating those emotions from facial expressions, has become important for improving overall human-computer interaction. This paper reviews the literature on different aspects such as different theories of emotion, methods for studying the different images in the databases, and different action units such as the outer brow raiser, where the frontalis and pars medialis facial muscles are studied. The paper reviews comparative techniques for automatically recognizing facial actions in sequences of images. The goal of this research is to compare a selected facial expression recognition technique with other AU recognition systems and with different approaches, and to study and evaluate their performance using the JAFFE and Cohn-Kanade databases. The five basic principal emotions to be recognized are anger, happiness, sadness, disgust and surprise, along with neutral. Their recognition rates are obtained on all the facial expressions and compared.

Emotion based Facial Recognition

International Journal for Research in Applied Science and Engineering Technology (IJRASET), 2020

This paper presents an artificially intelligent system capable of emotion recognition through the facial expressions of unknown people. The network consists of three convolutional layers, each followed by max pooling and ReLU. The network is trained on the FER2013 dataset and tested on the RaFD dataset, giving the network a wide range of training images so that it can overcome the basic problem of recognizing unknown faces. The pertinence of the final model is demonstrated in a live video application that instantaneously returns a user's emotion based on their facial expression. The accuracy obtained by this method was 68%, which is better than previous state-of-the-art methods. The results provide an important insight into the significance of using different datasets for training and validation. This paper mainly focuses on neural-network-based artificially intelligent systems capable of deriving the emotion of a person from pictures of his or her face. The primary research question is: "How can an artificial neural network be used for interpreting the facial expression of a human?"

Keywords: Face Expression Recognition (FER), RaFD, ReLU, artificial neural network, FER2013.

I. INTRODUCTION

Emotions are an important property of humans and are essential for effective interaction in society. Human communication can be either verbal or nonverbal, and it has been shown that most of it is nonverbal. In nonverbal communication, emotion plays an effective role because it conveys how humans feel about a subject, and psychology research has shown that facial expressions are more effective than spoken words in conversation. Facial expressions are one of the natural means of communicating emotions, and these emotions can be used in entertainment and Human Machine Interface (HMI) fields. Information from facial expressions is distributed across different areas of the face, and each area carries different information; the mouth and eyes, for example, carry more information than the cheeks and forehead. Several psychological studies have shown that culture and environment can influence the impact of emotion and the way humans express their feelings. Many of these studies show that gender, cultural background and age bias the way emotions are expressed, while there is no clear evidence on the importance of environment in shaping emotional tendencies. Emotion recognition methods can be divided into two main groups: the first works on static images and the second works on dynamic image sequences. In the static approaches, temporal information is not considered and only the current image is used, while the dynamic approaches use temporal information across images in order to recognize the expressed emotion in frame sequences. Automatic emotion expression recognition includes three steps: face image acquisition, feature extraction, and facial emotion expression recognition. For optimal extracted features, within-class variation of an expression should be minimal while between-class variation should be maximal. If the extracted features are not suited to the task at hand and do not carry enough information, even the best classifier may fail to achieve the best performance. Feature extraction for emotion recognition can be divided into two approaches: geometric-feature-based methods and appearance-based methods.

In the first approach, the location and shape of parts of the face such as the eyes, mouth, eyebrows and nose are considered, while in the second, particular regions or the whole face are considered. Because separating expressions in feature space is a difficult problem, expression recognition is still a challenging task for computers. Some problems arise because features extracted from two faces with the same expression may differ, while features extracted from one face with two different expressions may be the same, and some expressions, such as "fear" and "sad", are very similar. In today's world, with advancements in technology, various music players are deployed with features like reversing the media, fast forwarding it, and streaming playback with multicast streams. Although these features satisfy the basic requirements of the user, one still has to manually search for a song from a large set of songs according to the current circumstance and mood. This is a time-consuming task that needs some effort and patience. The main objective of this work is to develop an intelligent system that can easily recognize emotion through facial expression and accordingly play a music track based on that particular recognized expression/emotion.
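
The three-convolutional-layer network described in this abstract (each convolution followed by max pooling and ReLU, trained on 48x48 grayscale FER2013 images with seven classes) could be sketched roughly as below with tf.keras; the filter counts, kernel sizes and dense-layer width are illustrative guesses rather than the paper's exact configuration.

# Sketch of a three-conv-layer emotion classifier for FER2013-style input.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_fer_model(num_classes=7):
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),                 # 48x48 grayscale faces
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model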

Emotion Extraction from Facial Expressions by Using Artificial Intelligence Techniques

Nowadays, there is hardly an application area in which Artificial Intelligence oriented solutions are not employed. Its use can be seen even in daily life, and the scope of this scientific field of the future is growing day by day. Because of its great advantages in analyzing the physical world and solving real-world problems, Artificial Intelligence techniques are often employed in research problems that cannot be solved with traditional computational approaches. In this context, using intelligent systems to understand human features has been one of the most popular research interests of recent years. Accordingly, the objective of this study is to introduce research in which a system able to extract emotions from individuals' facial expressions was designed and developed. In detail, the system considered here uses a cascade feed-forward artificial neural network model trained by a recent optimization algorithm called the Vortex Optimization Algorithm. The developed system has been applied to different sets of photos from the literature, and positive results were obtained for each set considered.
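
As a minimal structural sketch of a cascade feed-forward network (each hidden layer also receives the original input), assuming PyTorch, illustrative layer sizes, and a conventional gradient-based optimizer standing in for the Vortex Optimization Algorithm the study actually uses:

# Cascade feed-forward structure: the input is forwarded to every layer.
import torch
import torch.nn as nn

class CascadeForwardNet(nn.Module):
    def __init__(self, n_inputs, n_hidden1, n_hidden2, n_classes):
        super().__init__()
        self.h1 = nn.Linear(n_inputs, n_hidden1)
        self.h2 = nn.Linear(n_inputs + n_hidden1, n_hidden2)           # input skips ahead
        self.out = nn.Linear(n_inputs + n_hidden1 + n_hidden2, n_classes)

    def forward(self, x):
        a1 = torch.tanh(self.h1(x))
        a2 = torch.tanh(self.h2(torch.cat([x, a1], dim=1)))
        return self.out(torch.cat([x, a1, a2], dim=1))

# net = CascadeForwardNet(n_inputs=64, n_hidden1=32, n_hidden2=16, n_classes=6)
# optimizer = torch.optim.Adam(net.parameters())  # placeholder for VOA training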