Abstract— Facial expressions and emotions play an important role in communication and social interaction with other human beings, delivering rich information about a person's mood. The "BLUE EYES TECHNOLOGY" aims at creating computational machines that have sensory and perceptual abilities like those of human beings, enabling the computer to gather information about humans and interact with them. This paper implements the detection of emotions (happy, sad, fear, surprised, anger, disgust) by considering human eye expressions and by using the emotion mouse. The emotion mouse obtains physiological data and the emotional state of a person through a single touch of a mouse fitted with different sensors. Emotions are also determined from human eye expressions, in which the eye region from a video sequence is analyzed. From the different frames of the video stream, the human eyes are extracted using an edge operator and then classified using a Support Vector Machine (SVM) classifier. After classification, a standard learning tool, the Hidden Markov Model (HMM), is used to recognize the emotions from the human eye expressions. After successful detection of an emotion, a suitable audio track is played.

 

Keywords— Blue eyes, emotion mouse, emotion recognition, eye expressions, Support Vector Machine (SVM), Hidden Markov Model (HMM).

                                                                                                                                                   
I. INTRODUCTION

The "BLUE EYES" technology aims at creating computational machines by adding extraordinary perceptual abilities to computers that help them verify a person's identity, feel their presence, and interact with them. Human recognition depends primarily on the ability to perceive, interpret, and integrate audio, visual, and sensory information. Blue eyes technology makes a computer sense and understand human feelings and behavior, and enables the computer to respond according to the sensed emotional level. The main aim of blue eyes technology is to give human abilities to a computer, so that the machine can interact with human beings as naturally as humans interact with each other.

The methodologies proposed in this paper to detect human emotions are the emotion mouse and emotion recognition from human eye expressions. The emotion mouse is an input device designed to track the emotions of a user through a simple touch. It is used to evaluate and identify the user's emotions, such as happy, sad, anger, fear, disgust, and surprised, while the user is interacting with the computer.

 

Human emotion recognition is an important component of efficient man-machine interaction. It plays a critical role in communication by allowing people to express themselves beyond the verbal domain. Analysis of emotions from human eye expressions involves the detection and categorization of various human emotions or states of mind. For example, in security and surveillance, an offender's or criminal's behavior can be predicted by analyzing images of their face from the frames of a video sequence. The analysis of human emotions can be applied in a variety of application domains, such as video surveillance and human-computer interaction systems. In some cases, the results of such analysis can be used to identify and categorize the various human emotions automatically from the videos.


                                                                                                                                         
II. METHODOLOGY USED

A. Emotion Recognition From Human Eyes

Facial expressions play an essential role in communication and social interaction with other human beings, delivering information about their emotions. The most crucial feature of human interaction that grants naturalism to the process is our ability to infer the emotional states of others. Our goal is to categorize the different human emotions from their eye expressions. The proposed system presents a human emotion recognition system that analyzes the human eye region from video sequences. From the frames of the video stream, the human eyes are extracted using the well-known Canny edge operator and classified using a non-linear Support Vector Machine (SVM) classifier. Finally, a standard learning tool, the Hidden Markov Model (HMM), is used to recognize the emotions from the human eye expressions.

Human emotion recognition is an important component of efficient human-computer interaction, with applications ranging from video surveillance to systems that identify and categorize emotions automatically from videos. The six primary types of emotions are shown in Fig. 1: surprised, sad, happy, anger, fear, disgust. Our method uses a feature extraction technique to extract the eyes, a Support Vector Machine (SVM) classifier, and an HMM to build a human emotion recognition system.
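As an illustration of the HMM step, the sketch below scores a sequence of quantised eye-feature symbols against one emotion's model using the standard forward algorithm. The two-state model and all probabilities here are hypothetical placeholders, not the paper's trained parameters.

```python
# Minimal sketch of HMM-based emotion scoring: the forward algorithm computes
# the likelihood of an observed symbol sequence under one emotion's HMM. In a
# full system, the sequence would be scored against each emotion's model and
# the highest-likelihood emotion selected.

def forward_likelihood(obs, start_p, trans_p, emit_p):
    """Forward algorithm: P(obs | model) for a discrete-observation HMM."""
    n_states = len(start_p)
    # Initialise with the first observation.
    alpha = [start_p[s] * emit_p[s][obs[0]] for s in range(n_states)]
    # Recurse over the remaining observations.
    for o in obs[1:]:
        alpha = [
            sum(alpha[prev] * trans_p[prev][s] for prev in range(n_states))
            * emit_p[s][o]
            for s in range(n_states)
        ]
    return sum(alpha)

# Hypothetical 2-state model for one emotion; observations are quantised
# eye-feature symbols (0 = "eye open wide", 1 = "eye narrowed").
start = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit  = [[0.9, 0.1], [0.2, 0.8]]

score = forward_likelihood([0, 0, 1], start, trans, emit)
print(round(score, 4))  # prints 0.1362
```

In practice one such model would be trained per emotion, and classification picks the model with the highest sequence likelihood.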

The methodology of emotion recognition from human eye expressions is shown in Fig. 2. In this methodology, an image of the user sitting in front of the camera is captured. The frames of the captured video sequence are then preprocessed to obtain a noise-free image. Edges are detected in the noise-free image using the Canny edge operator. Using the feature extraction process, the eye regions are extracted from the resulting edge-detected image. The extracted eye regions are classified using the SVM classifier, and finally the corresponding emotions are recognized.
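The edge-detection step can be sketched as follows, assuming a grayscale frame given as a 2-D list of intensities. A full Canny detector adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding; only the gradient stage (here via Sobel kernels) is shown.

```python
# Illustrative gradient stage of edge detection: convolve the frame with the
# horizontal and vertical Sobel kernels and combine their magnitudes.

def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| at each interior pixel."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = abs(gx) + abs(gy)
    return out

# A 5x5 frame with a vertical intensity step: strong responses appear
# along the step edge.
frame = [[0, 0, 10, 10, 10]] * 5
edges = sobel_magnitude(frame)
print(edges[2])  # prints [0, 40, 40, 0, 0]
```

The strong responses in the middle columns mark the intensity step, which is how the eye contours stand out in the edge-detected frame before feature extraction.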

 

B. Emotion Mouse

One proposed, non-invasive method for gaining user information through touch is via a computer input device, the mouse. This allows the system to relate the user's cardiac rhythm, body temperature, and other physiological attributes to their mood.

The block diagram of the emotion mouse is shown in Fig. 3. This device measures heart rate and temperature and matches them with six emotional states: happiness, surprise, anger, fear, sadness, and disgust. The mouse includes a set of sensors, including infrared detectors and temperature-sensitive chips. These components can also be built into other commonly used items such as the office chair, the steering wheel, the keyboard, and the phone handle. Integrating the system into the steering wheel, for instance, could allow an alert to be sounded when a driver becomes drowsy.

Heart rate is measured by an infrared (IR) sensor on the thumb, and temperature is measured using a thermistor chip. These values are fed into a series of discriminant function analyses and correlated to an emotional state. Specifically, for the mouse, discriminant function analysis is used in accordance with basic principles to determine a baseline relationship, that is, the relationship between each set of calibration physiological signals and the associated emotion.
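A minimal sketch of this baseline matching, assuming each emotion's baseline is the mean (heart rate, temperature) pair from calibration trials and a new reading is assigned to the nearest baseline. All numeric values below are illustrative, not measurements from the paper.

```python
# Hypothetical per-emotion calibration baselines: (heart rate in bpm,
# skin temperature in deg C). A real system would estimate these from the
# user's calibration trials.
EMOTION_BASELINES = {
    "happiness": (75.0, 34.5),
    "anger":     (95.0, 35.5),
    "sadness":   (65.0, 33.5),
    "fear":      (100.0, 33.0),
    "surprise":  (85.0, 34.0),
    "disgust":   (80.0, 35.0),
}

def classify_reading(heart_rate, temperature):
    """Return the emotion whose baseline is closest (squared Euclidean)."""
    def dist(baseline):
        bh, bt = baseline
        return (heart_rate - bh) ** 2 + (temperature - bt) ** 2
    return min(EMOTION_BASELINES, key=lambda e: dist(EMOTION_BASELINES[e]))

print(classify_reading(96.0, 35.4))  # prints anger
```

A proper discriminant function analysis would also weight the features by their within-class covariance; the nearest-baseline rule above is the simplest special case.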

 

                                                                                                                                                   
III. SYSTEM MODEL

In this system, two methodologies, namely the emotion mouse and emotion recognition from eye expressions, are used. The emotion mouse considers physiological parameters such as cardiac rhythm and body temperature, whereas emotion recognition from human eye expressions considers facial expression for the detection of human emotion and mood.

Fig. 4 shows the block diagram of the system. In this system, the data from the heartbeat sensor and the temperature sensor of the emotion mouse are given to the microcontroller. The output of the microcontroller is then fed to the computer, where the heartbeat and temperature values are compared with the standard range of each emotion and the suitable emotion is selected. In parallel, a webcam connected to the computer captures images of the person from a video sequence and recognizes the emotion by detecting the eye region. The captured eye section is compared with the images stored in the database to detect the mood of the person. After the mood is detected, music or an audio command is played according to the detected mood.

 

                                                                                                                                                               
IV. RESULT

In the proposed system, there are two results from the mentioned methodologies. First, different eye expressions of different people are considered via edge detection of the eyes. Each eye expression is then categorized into a given set of emotions (happy, sad, fear, surprised, disgust, anger) so that a single standard expression represents each emotion. The emotion of a person can thus be detected by comparing the person's eye expression with the standard eye expression of each emotion. Second, the values of the heartbeat sensor and the temperature sensor are compared with the standard value range of each emotion, and the emotion whose value range matches the user's data values is taken as the emotional state of the user. According to the detected emotion, music or an audio command is played.

 

                                                                                                                                                      
V. CONCLUSION

Recent research indicates that the understanding and recognition of emotional expressions plays a very important role in the maintenance and development of social relationships. This paper gives an approach to creating computational machines that have perceptual and sensory abilities like those of human beings, enabling the computer to gather information about the user through techniques such as facial expression recognition and by considering biological factors such as cardiac rhythm and body temperature. This makes it possible for computers and machines to detect human emotion and respond to it.