
DREEM Relates Every Entity's Motion

Global Tracking Transformers for biological multi-object tracking.

Features

  • Command-Line & API Access: Use DREEM via a simple CLI or integrate into your own Python scripts.
  • Pretrained Models: Get started quickly with models trained specifically for microscopy and animal domains.
  • Configurable Workflows: Easily customize training and inference using YAML configuration files.
  • Visualization: Visualize tracking results in your browser without any data leaving your machine, or use the SLEAP GUI for a more detailed view.
  • Examples: Step-by-step notebooks and guides for common workflows.

Installation

DREEM works best with Python 3.12. We recommend using uv for package management.

In a new directory:

   uv venv && source .venv/bin/activate
   uv pip install dreem-track

Alternatively, install it as a system-wide tool that does not require a virtual environment:

   uv tool install dreem-track

Now dreem commands will be available without activating a virtual environment.

For more installation options and details, see the Installation Guide.

Quickstart

1. Download Sample Data and Model

# Install huggingface-hub if needed
uv pip install huggingface_hub

# Download sample data
hf download talmolab/sample-flies --repo-type dataset --local-dir ./data

# Download pretrained model
hf download talmolab/animals-pretrained \
    --repo-type model \
    --local-dir ./models \
    --include "animals-pretrained.ckpt"

2. Run Tracking

# Track the sample flies with the pretrained model
# (crop-size is the size, in pixels, of the image crop around each instance)
dreem track ./data/inference \
    --checkpoint ./models/animals-pretrained.ckpt \
    --output ./results \
    --crop-size 70

3. Visualize Results

Results are saved as .slp files that can be opened directly in SLEAP for visualization.

For a more detailed walkthrough, check out the Quickstart Guide or try the Colab notebook.

Usage

Training a Model

Train your own model on custom data:

# Train on ./data/train, with ./data/val held out for validation
dreem train ./data/train \
    --val-dir ./data/val \
    --crop-size 70 \
    --epochs 10

Running Inference

Run tracking on new data with a pretrained model:

dreem track ./data/inference \
    --checkpoint ./models/my_model.ckpt \
    --output ./results \
    --crop-size 70

Evaluating Results

Evaluate tracking accuracy against ground truth:

dreem eval ./data/test \
    --checkpoint ./models/my_model.ckpt \
    --output ./results \
    --crop-size 70

For detailed usage instructions, see the Usage Guide.

Documentation

Examples

We provide several example notebooks to help you get started; all of them are available on Google Colab.

Contributing

We welcome contributions! Please see our Contributing Guide for details on:

  • Code style and conventions
  • Submitting pull requests
  • Reporting issues

Citation

If you use DREEM in your research, please cite our paper:

@article{dreem2024,
  title={DREEM: Global Tracking Transformers for Biological Multi-Object Tracking},
  author={...},
  journal={...},
  year={2024}
}

License

This project is licensed under the BSD-3-Clause License - see the LICENSE file for details.


Questions? Open an issue on GitHub or visit our documentation.
