emiledgl/ebt-wm
ebt-wm

Implementing a JEPA-style World Model using an Energy-Based Transformer (EBT), an Attentive State Pooler, and the LeJEPA loss.

State Encoder

The State Encoder is composed of an image backbone followed by an attentive state pooler, which aggregates spatial and temporal features, as well as proprioception, into a compact latent state representation of K tokens.
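The pooler described above can be sketched as cross-attention from K learned query tokens to the concatenated backbone and proprioception tokens. This is an illustrative sketch, not the repository's implementation: the class name, dimensions, and the single-layer attention design are assumptions.

```python
import torch
import torch.nn as nn

class AttentiveStatePooler(nn.Module):
    """Hypothetical sketch: K learned queries cross-attend over the
    backbone's flattened spatial/temporal feature tokens (plus projected
    proprioception tokens), producing a compact K-token latent state."""

    def __init__(self, dim: int, num_latents: int = 8, num_heads: int = 4):
        super().__init__()
        # K learned query tokens that will become the latent state.
        self.queries = nn.Parameter(torch.randn(num_latents, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, dim) — spatial x temporal patch tokens
        # concatenated with proprioception tokens along N.
        q = self.queries.unsqueeze(0).expand(feats.size(0), -1, -1)
        pooled, _ = self.attn(q, feats, feats)  # queries attend to feats
        return self.norm(pooled)                # (B, K, dim)

# Example: 2 frames of 16 patch tokens each, plus 1 proprioception token.
B, dim = 4, 64
feats = torch.randn(B, 2 * 16 + 1, dim)
state = AttentiveStatePooler(dim, num_latents=8)(feats)
```

Because the queries are learned rather than input-derived, the output has a fixed size of K tokens regardless of how many frames or sensor readings are fed in.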

The State Encoder is meant to be trained jointly with the Predictor, using the reconstruction loss together with SIGReg regularization.
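A rough sketch of that joint objective follows. Note the regularizer here is a loose stand-in for LeJEPA's SIGReg, which tests random 1-D projections of the embeddings against an isotropic Gaussian (the real SIGReg uses a sketched goodness-of-fit test such as Epps-Pulley; here only the first two moments are matched). The function names and the `lambda_reg` weight are assumptions, not this repository's code.

```python
import torch
import torch.nn.functional as F

def sigreg_simplified(z: torch.Tensor, num_proj: int = 16) -> torch.Tensor:
    """Simplified SIGReg stand-in: project embeddings z (B, D) onto
    random unit directions and penalize the deviation of each
    projection's mean and variance from a standard Gaussian's."""
    _, D = z.shape
    dirs = torch.randn(D, num_proj, device=z.device)
    dirs = dirs / dirs.norm(dim=0, keepdim=True)
    p = z @ dirs                                  # (B, num_proj)
    mean_pen = p.mean(dim=0).pow(2).mean()        # mean should be 0
    var_pen = (p.var(dim=0) - 1.0).pow(2).mean()  # variance should be 1
    return mean_pen + var_pen

def total_loss(pred, target, latent, lambda_reg=0.1):
    """Joint objective: predictor reconstruction loss plus the Gaussian
    regularizer applied to the pooled latent state tokens."""
    recon = F.mse_loss(pred, target)
    return recon + lambda_reg * sigreg_simplified(latent.flatten(0, 1))

# Example with dummy tensors shaped like (batch, K tokens, dim).
pred = torch.randn(4, 8, 64)
target = torch.randn(4, 8, 64)
latent = torch.randn(4, 8, 64)
loss = total_loss(pred, target, latent)
```

Regularizing the latent state toward an isotropic Gaussian is what lets the encoder and predictor be trained jointly without the representation collapsing to a trivial constant.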
