A real-time Multimodal Emotion Recognition web app for text, sound and video inputs
A collection of datasets for emotion recognition/detection in speech.
Models for predicting emotions from English tweets.
PyTorch implementation of Emotic CNN methodology to recognize emotions in images using context information.
Multispeaker & Emotional TTS based on Tacotron 2 and Waveglow
This program recognizes your mood by detecting your face and plays a song that matches your mood.
A personal assistant to help you securely save your memories and emotions together.
⚛ The stable base upon which we build our React projects at Mirego.
Documentation for the MixedEmotions Toolbox
Experiments with Hugging Face 🔬 🤗
Emotionally responsive Virtual Metahuman CV with Real-Time User Facial Emotion Detection (Unreal Engine 5).
🏄 State-of-the-art Tracker for emotions, habits and thoughts. | Gamified. | Anonymous and open source. | Healthiest version of you
What if we could see the emotions and moods of people through the breadcrumbs they leave on Twitter?
A Python package for classifying emotions
Emoji embeddings trained on the emotional content of their online dictionary definitions.
Code and models for 3 different tools to measure appeals to 8 discrete emotions in German political text
A multilingual DeBERTa model fine-tuned on political communication to classify discrete emotions
Arabic-English emotion lexicon
Generating Photorealistic Facial Images with specified emotions using CGAN.
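Several of the projects listed above (for example, the Hugging Face experiments and the Python emotion-classification package) revolve around classifying emotions in text. As a minimal sketch of that kind of workflow, the snippet below uses the Hugging Face transformers text-classification pipeline; the model checkpoint name is a placeholder assumption, not a model referenced by any of these repositories, so substitute any emotion-labelled text-classification model from the Hugging Face Hub.

```python
# Minimal text emotion classification sketch using the Hugging Face
# transformers pipeline. The checkpoint below is a placeholder; replace it
# with a real emotion-classification model from the Hub.
from transformers import pipeline

MODEL_NAME = "your-org/your-emotion-model"  # hypothetical checkpoint name

# Build a text-classification pipeline around the chosen checkpoint.
classifier = pipeline("text-classification", model=MODEL_NAME)

examples = [
    "I can't believe we finally won the cup!",
    "The flight was cancelled again and nobody told us.",
]

for text in examples:
    # Each call returns a list of dicts; the first entry holds the top
    # predicted label and its confidence score.
    prediction = classifier(text)[0]
    print(f"{text!r} -> {prediction['label']} ({prediction['score']:.2f})")
```

The same pipeline pattern extends to the speech and image projects in the list by swapping the task (e.g. audio or image classification) and the checkpoint.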