EmotionDetectionLSTM: Text-Based Emotion Recognition with LSTM
EmotionDetectionLSTM is an NLP model designed to classify emotions from text using a Long Short-Term Memory (LSTM) deep learning architecture.
It analyzes the linguistic and semantic patterns of input text to predict one of six emotions:
joy, sadness, anger, fear, love, or surprise.
⚠️ Disclaimer:
This model is created for educational, research, and demonstration purposes only.
Please ensure responsible and ethical use of AI systems.
🧩 Model Details
Key Features:
- Includes preprocessing: tokenization, stopword removal, and lemmatization
- Uses LSTM layers for sequential text understanding
- Outputs emotion predictions with probabilities and expressive emojis
- Deployed on Hugging Face Spaces with a Streamlit-based interface

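The preprocessing steps listed above can be sketched as a small pipeline. This is an illustration only: the actual model uses NLTK's tokenizer, stopword corpus, and WordNet lemmatizer, so the tiny stopword set and suffix-stripping rule below are simplified stand-ins.

```python
import re

# Simplified stand-in for NLTK's English stopword corpus
STOPWORDS = {"i", "a", "an", "the", "is", "am", "are", "to", "of", "and"}

def simple_lemmatize(word: str) -> str:
    """Crude suffix stripping standing in for WordNet lemmatization."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def preprocess(text: str) -> list[str]:
    tokens = re.findall(r"[a-z']+", text.lower())        # tokenize
    tokens = [t for t in tokens if t not in STOPWORDS]   # remove stopwords
    return [simple_lemmatize(t) for t in tokens]         # lemmatize

print(preprocess("I am feeling the joy of walking today"))
# → ['feel', 'joy', 'walk', 'today']
```

The cleaned token list is what gets mapped to integer indices by the tokenizer before being fed to the LSTM.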
Skills & Technologies Used:
- TensorFlow / Keras for deep learning
- NLTK for text preprocessing
- Streamlit for deployment
- Matplotlib and Seaborn for data visualization

- Developed by: Rawan Alwadeya
- Model Type: Text Classification (LSTM)
- Language: English
- License: MIT

💡 Uses
This model can be applied in:
- Emotion and sentiment analysis in text data
- Chatbots and virtual assistants for emotion-aware responses
- Human-computer interaction and affective computing research
- Customer feedback or social media emotion analysis

📊 Performance
The LSTM model achieved strong accuracy on the test set:
- Accuracy: 92%

This demonstrates reliable emotion recognition performance across multiple emotion categories.
🚀 Deployment
- Hugging Face Repo: EmotionDetectionLSTM
- Demo Video: Watch on LinkedIn

Users can input text and instantly receive a predicted emotion label accompanied by an expressive emoji, making the interaction both informative and user-friendly.
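The label-plus-emoji output described above comes down to a simple mapping from each predicted class to a symbol. The mapping below is hypothetical; the exact emojis used by the deployed Streamlit app may differ.

```python
# Hypothetical label-to-emoji mapping for the six emotion classes;
# the deployed app's actual choices may differ.
EMOTION_EMOJIS = {
    "joy": "😄",
    "sadness": "😢",
    "anger": "😠",
    "fear": "😨",
    "love": "❤️",
    "surprise": "😲",
}

def format_prediction(emotion: str, probability: float) -> str:
    """Pair a predicted label with its emoji and confidence."""
    emoji = EMOTION_EMOJIS.get(emotion, "❓")
    return f"{emoji} {emotion.capitalize()} ({probability:.0%})"

print(format_prediction("joy", 0.92))
# → 😄 Joy (92%)
```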
👩‍💻 Author
Rawan Alwadeya
AI Engineer | Generative AI & Deep Learning | Data Scientist  
- 📧 Email: [email protected]
- 🔗 LinkedIn Profile

🧪 Example Usage
```python
from huggingface_hub import hf_hub_download
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.sequence import pad_sequences
import pickle
import numpy as np

# Download model and tokenizer files from Hugging Face Hub
model_path = hf_hub_download(repo_id="RawanAlwadeya/EmotionDetectionLSTM", filename="lstm.h5")
tokenizer_path = hf_hub_download(repo_id="RawanAlwadeya/EmotionDetectionLSTM", filename="tokenizer.pkl")
label_path = hf_hub_download(repo_id="RawanAlwadeya/EmotionDetectionLSTM", filename="label_encoder.pkl")

# Load model and preprocessing objects
model = load_model(model_path)
with open(tokenizer_path, "rb") as f:
    tokenizer = pickle.load(f)
with open(label_path, "rb") as f:
    label_encoder = pickle.load(f)

# Example text
text = "Being a mom has made me feel more vulnerable than I have ever felt before."

# Preprocess: convert to an integer sequence and pad to the training length
seq = tokenizer.texts_to_sequences([text.lower()])
padded = pad_sequences(seq, maxlen=35, padding='post')

# Predict and map the most probable class index back to its label
pred = model.predict(padded)
emotion = label_encoder.inverse_transform([np.argmax(pred)])[0]
print("Predicted emotion:", emotion)
```