Retico Emotion Tracking 🎭

Retico modules for real-time tracking of a user's emotional state using either the VAD (Valence, Arousal, Dominance) model or an Emotion Intensity model, powered by the NRC Valence, Arousal, and Dominance (NRC-VAD) Lexicon.


🚀 Features

  • Real-time emotion detection
    Analyzes live input to estimate the user's emotional state using VAD or emotion-intensity scores.

  • Dual tracking modes
    Supports both VAD and Emotion Intensity scoring models, switchable as needed.

  • Live visualization
    In a separate terminal, view a real-time line plot of emotion values that updates as you speak or type.


🧰 Installation

git clone https://github.com/zihaurpang/retico-emotion-tracking.git
cd retico-emotion-tracking

The scripts run directly from the cloned repository; you will also need the Retico framework (retico-core) and any other packages the scripts import.

πŸ› οΈ Usage

1. Run real-time emotion tracking

Start live analysis from the microphone:

python simple_emotion.py

2. View real-time emotion visualization

In a separate terminal, run:

python realtime_vad_plot.py

This will display a continuously updating line chart of emotion values.
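
realtime_vad_plot.py provides the repository's own visualization; the snippet below is only a minimal, self-contained sketch of how a continuously updating VAD line chart can be drawn with matplotlib's FuncAnimation. Both matplotlib and the random-walk value generator are assumptions made for the sketch, not necessarily what this script uses.

```python
# Minimal sketch of a live VAD line chart (not realtime_vad_plot.py itself).
# A random walk stands in for the real-time emotion values.
import random
from collections import deque

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

WINDOW = 100  # number of recent samples kept on screen
dims = ("valence", "arousal", "dominance")
history = {dim: deque(maxlen=WINDOW) for dim in dims}
state = {dim: 0.5 for dim in dims}  # stand-in state, range [0, 1]

fig, ax = plt.subplots()
lines = {dim: ax.plot([], [], label=dim)[0] for dim in dims}
ax.set_xlim(0, WINDOW)
ax.set_ylim(0.0, 1.0)
ax.set_xlabel("update step")
ax.set_ylabel("score")
ax.legend()

def update(_frame):
    # In the real setup, new values would come from the emotion tracker
    # through whatever channel the repository uses (queue, socket, file, ...).
    for dim in dims:
        state[dim] = min(1.0, max(0.0, state[dim] + random.uniform(-0.05, 0.05)))
        history[dim].append(state[dim])
        lines[dim].set_data(range(len(history[dim])), list(history[dim]))
    return list(lines.values())

# Keep a reference to the animation so it is not garbage-collected.
ani = FuncAnimation(fig, update, interval=200, cache_frame_data=False)
plt.show()
```

The real script replaces the random walk with the values emitted by the emotion-tracking module running in the other terminal.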

3. Manual text-based testing

Provide text input instead of live microphone input:

python manual_emotion_module.py

Use this to quickly test or demo how the module responds to text.


💡 How It Works

  1. Input: audio from the microphone or user-entered text

  2. Processing: the input is parsed into tokens and each token is scored with the NRC-VAD lexicon (see the toy scoring sketch after this list)

  3. Output:

    • The real-time module (simple_emotion.py) prints emotion values continuously
    • The plot module (realtime_vad_plot.py) streams values to a live line chart
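
To make step 2 concrete, here is a toy, self-contained sketch of lexicon-based scoring. The handful of lexicon entries and the plain averaging are illustrative assumptions made for the example; the repository's modules load the full NRC-VAD lexicon file and may aggregate token scores differently.

```python
# Illustrative sketch of lexicon-based VAD scoring (not this repo's code).
# The few entries below are made-up example values so the snippet is
# self-contained; the real modules use the full NRC-VAD lexicon.
import re

NRC_VAD_SAMPLE = {
    # word: (valence, arousal, dominance), each roughly in [0, 1]
    "happy":  (0.96, 0.73, 0.74),
    "angry":  (0.12, 0.83, 0.60),
    "calm":   (0.83, 0.07, 0.65),
    "afraid": (0.10, 0.82, 0.17),
}

def score_text(text, lexicon=NRC_VAD_SAMPLE):
    """Average the VAD scores of all tokens found in the lexicon."""
    tokens = re.findall(r"[a-z']+", text.lower())
    hits = [lexicon[t] for t in tokens if t in lexicon]
    if not hits:
        return None  # none of the tokens carried a lexicon entry
    return tuple(sum(dim) / len(hits) for dim in zip(*hits))

print(score_text("I am happy but a little afraid"))
# -> (0.53, 0.775, 0.455) with this toy lexicon
```

The Emotion Intensity mode presumably works the same way, with per-emotion intensity scores in place of the three VAD dimensions.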

✅ Requirements

  • Python ≥ 3.7
  • A working microphone for real-time mode
