Emotion Recognition from Facial Expressions
1. Introduction
Emotion recognition from facial expressions is a computer vision task that identifies human emotions based on facial features. This project leverages Convolutional Neural Networks (CNNs) to classify emotions such as happiness, sadness, anger, and surprise using publicly available datasets.
2. Prerequisites
• Python: Install Python 3.x from the official Python website.
• Required Libraries:
  - numpy: Install using pip install numpy
  - pandas: Install using pip install pandas
  - tensorflow: Install using pip install tensorflow
  - keras: Included with TensorFlow for deep learning.
  - matplotlib: Install using pip install matplotlib
  - scikit-learn: Install using pip install scikit-learn (used here to split the data into training and validation sets)
• A basic understanding of deep learning and neural networks.
3. Project Setup
1. Create a Project Directory:
- Name your project folder, e.g., `EmotionRecognition`.
- Inside this folder, create the Python script file (`emotion_recognition.py`).
2. Install Required Libraries:
Ensure numpy, pandas, tensorflow, matplotlib, and scikit-learn are installed using pip, e.g. pip install numpy pandas tensorflow matplotlib scikit-learn (keras ships with TensorFlow).
3. Download a Dataset:
Use a publicly available dataset like FER-2013 or CK+ for training and testing, and ensure it is split into training and validation sets. The snippet below shows a quick sanity check on the FER-2013 CSV.
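A minimal sanity check, assuming the Kaggle CSV release of FER-2013 (one image per row, with emotion, pixels, and Usage columns):

import pandas as pd

# FER-2013 from Kaggle stores one image per row: an integer label,
# a space-separated pixel string, and a Training/Test usage tag
df = pd.read_csv('fer2013.csv')
print(df['Usage'].value_counts())  # Training / PublicTest / PrivateTest counts
print(df['emotion'].nunique())     # should report 7 emotion classes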
4. Writing the Code
Below is the Python code for the Emotion Recognition system:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from tensorflow.keras.optimizers import Adam
# Load the dataset (example assumes FER-2013 CSV format)
data = pd.read_csv('fer2013.csv')

# Preprocess the data: parse pixel strings into 48x48 grayscale images,
# scale to [0, 1], and one-hot encode the emotion labels
def preprocess_data(data):
    pixels = data['pixels'].tolist()
    images = [np.array(p.split(), dtype='float32') for p in pixels]
    images = np.array(images).reshape(-1, 48, 48, 1) / 255.0
    labels = pd.get_dummies(data['emotion']).values
    return images, labels

images, labels = preprocess_data(data)
# Split the data into training and validation sets
x_train, x_val, y_train, y_val = train_test_split(images, labels,
                                                  test_size=0.2,
                                                  random_state=42)
# Build the CNN model
model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(48, 48, 1)),
    MaxPooling2D(pool_size=(2, 2)),
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D(pool_size=(2, 2)),
    Flatten(),
    Dense(128, activation='relu'),
    Dropout(0.5),
    Dense(7, activation='softmax')  # 7 output units, one per emotion class
])
# Compile the model
model.compile(optimizer=Adam(learning_rate=0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
history = model.fit(x_train, y_train, batch_size=64, epochs=20,
                    validation_data=(x_val, y_val))
# Plot training history
plt.plot(history.history['accuracy'], label='Train Accuracy')
plt.plot(history.history['val_accuracy'], label='Validation Accuracy')
plt.legend()
plt.show()
# Save the model (recent Keras versions prefer the native .keras format
# over legacy HDF5)
model.save('emotion_recognition_model.h5')
5. Key Components
• Dataset: FER-2013 is a commonly used dataset for emotion recognition tasks.
• CNN Architecture: A convolutional neural network for feature extraction and classification.
• Training: Uses categorical crossentropy loss and the Adam optimizer.
6. Testing
1. Use the trained model to predict emotions from new images, as in the sketch below.
2. Visualize predictions with the corresponding emotion labels.
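Below is a minimal prediction sketch, assuming the model saved in Section 4 and a hypothetical image file test_face.jpg; the label order follows FER-2013's integer encoding:

import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.image import load_img, img_to_array

# FER-2013 label order (0-6); pd.get_dummies preserves this sorted order
EMOTIONS = ['angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral']

model = load_model('emotion_recognition_model.h5')

# 'test_face.jpg' is a placeholder; use any cropped face image
img = load_img('test_face.jpg', color_mode='grayscale', target_size=(48, 48))
x = img_to_array(img).reshape(1, 48, 48, 1) / 255.0  # match the training preprocessing

probs = model.predict(x)[0]
print(f"Predicted emotion: {EMOTIONS[np.argmax(probs)]} ({probs.max():.1%})")

For step 2, plt.imshow(img, cmap='gray') alongside the printed label gives a quick visual check.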
7. Enhancements
• Data Augmentation: Improve model performance by training on randomly transformed images (see the first sketch below).
• Transfer Learning: Use pre-trained models like VGG or ResNet for better accuracy.
• Real-Time Recognition: Implement real-time emotion detection using a webcam (see the second sketch below).
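Here is one way to wire in augmentation, reusing model, x_train, and y_train from Section 4; the transform ranges are illustrative rather than tuned values:

from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Apply random rotations, shifts, and horizontal flips on the fly
datagen = ImageDataGenerator(rotation_range=10,
                             width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)

# Drop-in replacement for the plain model.fit call in Section 4
history = model.fit(datagen.flow(x_train, y_train, batch_size=64),
                    epochs=20,
                    validation_data=(x_val, y_val))

And a rough real-time loop, assuming opencv-python is installed (pip install opencv-python) and using OpenCV's bundled Haar cascade for face detection:

import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ['angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral']
model = load_model('emotion_recognition_model.h5')
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Crop the face, resize to 48x48, and normalize like the training data
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        face = face.reshape(1, 48, 48, 1) / 255.0
        label = EMOTIONS[np.argmax(model.predict(face, verbose=0))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow('Emotion Recognition', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()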
8. Troubleshooting
• Overfitting: Use dropout layers and data augmentation to reduce overfitting; early stopping (sketched below) also helps.
• Low Accuracy: Experiment with hyperparameter tuning or a deeper model architecture.
• Insufficient Data: Use data augmentation or additional datasets.
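A minimal early-stopping setup against overfitting, reusing the model and data from Section 4 (patience=3 is an illustrative choice):

from tensorflow.keras.callbacks import EarlyStopping

# Stop once validation accuracy stops improving and keep the best weights
early_stop = EarlyStopping(monitor='val_accuracy', patience=3,
                           restore_best_weights=True)

history = model.fit(x_train, y_train, batch_size=64, epochs=50,
                    validation_data=(x_val, y_val),
                    callbacks=[early_stop])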
9. Conclusion
This project demonstrates how to implement emotion recognition using CNNs. With further improvements, it can be applied in real-world scenarios like human-computer interaction and sentiment analysis.