Retico modules for real-time tracking of the user's emotional state, using the VAD (Valence, Arousal, Dominance) model or an Emotion Intensity model. Powered by the NRC Valence, Arousal, and Dominance (NRC-VAD) Lexicon.
- **Real-time emotion detection**: Analyzes live input to estimate the user's emotional state using VAD or emotion intensity scores.
- **Dual tracking modes**: Supports both the VAD and Emotion Intensity scoring models, switchable as needed (see the sketch after this list for the kind of values each mode produces).
- **Live visualization**: In a separate terminal, view a real-time line plot of emotion values updating as you speak or type.
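Purely as an illustration of what the two modes produce, the snippet below sketches assumed payload shapes. The field names, emotion categories, and numbers are illustrative placeholders, not the modules' actual output schema.

```python
# Illustrative only: assumed shapes for the two tracking modes,
# not the modules' actual output schema.

# VAD mode: one score per affective dimension; NRC-VAD values lie in [0, 1].
vad_estimate = {"valence": 0.71, "arousal": 0.38, "dominance": 0.55}

# Emotion Intensity mode: an intensity score per emotion category.
intensity_estimate = {"joy": 0.62, "anger": 0.08, "fear": 0.11, "sadness": 0.05}
```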
Clone the repository:

```bash
git clone https://github.com/zihaurpang/retico-emotion-tracking.git
```

Start live analysis from the microphone:

```bash
python simple_emotion.py
```

In a separate terminal, run:

```bash
python realtime_vad_plot.py
```

This will display a continuously updating line chart of emotion values.
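The plotting script ships with the repository; purely as a sketch of the idea, a continuously updating VAD line chart can be drawn with matplotlib roughly as below. Random values stand in for the module's real value stream, and none of this code is taken from `realtime_vad_plot.py`.

```python
# Minimal sketch of a live-updating VAD line chart (not the repo's
# realtime_vad_plot.py). Random values stand in for the module's stream.
import random
from collections import deque

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

WINDOW = 100  # number of recent estimates kept on screen
history = {dim: deque(maxlen=WINDOW) for dim in ("valence", "arousal", "dominance")}

fig, ax = plt.subplots()
lines = {dim: ax.plot([], [], label=dim)[0] for dim in history}
ax.set_xlim(0, WINDOW)
ax.set_ylim(0.0, 1.0)
ax.set_xlabel("time step")
ax.set_ylabel("score")
ax.legend()

def update(_frame):
    # In the real module, the next values would come from the emotion tracker.
    for dim, series in history.items():
        series.append(random.random())
        lines[dim].set_data(range(len(series)), list(series))
    return list(lines.values())

ani = FuncAnimation(fig, update, interval=200)
plt.show()
```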
Try a text input instead of live input:

```bash
python manual_emotion_module.py
```

Use this to quickly test or demo how the module responds to text.
- Input: audio from the microphone or user-entered text
- Processing: parse the input into tokens and score each token with the NRC-VAD lexicon (a rough sketch of this step follows the list)
- Output:
  - In the real-time module (`simple_emotion.py`): prints emotion values continuously
  - In the plot module (`realtime_vad_plot.py`): streams values to a live line chart
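To make the processing step concrete, here is a minimal sketch of token-level NRC-VAD scoring. The lexicon filename, its tab-separated word/valence/arousal/dominance layout, and the plain averaging over matched tokens are assumptions for illustration; the repository's modules may load and aggregate the lexicon differently.

```python
# Minimal sketch of token-level VAD scoring with the NRC-VAD lexicon.
# Assumptions: a tab-separated lexicon file with word, valence, arousal,
# and dominance columns, and plain averaging over the tokens that are found.
# The repo's modules may load and combine scores differently.
import csv

def load_nrc_vad(path="NRC-VAD-Lexicon.txt"):
    """Read the lexicon into {word: (valence, arousal, dominance)}."""
    lexicon = {}
    with open(path, encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            if len(row) != 4:
                continue
            word, v, a, d = row
            try:
                lexicon[word.lower()] = (float(v), float(a), float(d))
            except ValueError:
                continue  # skip a header line or malformed row
    return lexicon

def score_text(text, lexicon):
    """Average the VAD scores of all tokens present in the lexicon."""
    hits = [lexicon[tok] for tok in text.lower().split() if tok in lexicon]
    if not hits:
        return None  # none of the tokens carry a lexicon entry
    return tuple(sum(dim) / len(hits) for dim in zip(*hits))

if __name__ == "__main__":
    lex = load_nrc_vad()
    print(score_text("I feel calm and happy today", lex))
```

In the live modules, the same per-token lookup would presumably run over recognized speech rather than typed text, with the resulting values streamed to the console printer or the live plot.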
- Python ≥ 3.7
- A working microphone for real-time mode