Stars
An open source environment for digital agents.
📖 A repository for organizing papers, code, and other resources related to Latent Reasoning.
Real-time object tracker on Jetson NX using YOLOv3 and SORT.
A Python project for drone area-coverage path planning. It implements the boustrophedon decomposition algorithm to generate efficient coverage paths.
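For context, the boustrophedon ("lawnmower") pattern at the core of such planners reduces to sweeping back and forth across the area at a fixed line spacing. A minimal sketch, with illustrative names and parameters that are not taken from the repository:

```python
# Minimal boustrophedon ("lawnmower") sweep over a rectangular area.
# Function name and parameters are illustrative, not from the starred repo.

def boustrophedon_waypoints(width, height, spacing):
    """Generate back-and-forth sweep waypoints covering a width x height area."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints.append((0.0, y))
            waypoints.append((width, y))
        else:
            waypoints.append((width, y))
            waypoints.append((0.0, y))
        left_to_right = not left_to_right
        y += spacing
    return waypoints

if __name__ == "__main__":
    # Cover a 100 m x 40 m field with 10 m between sweep lines.
    for wp in boustrophedon_waypoints(100.0, 40.0, 10.0):
        print(wp)
```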
[NeurIPS 2025] "Ditch the Denoiser: Emergence of Noise Robustness in Self-Supervised Learning from Data Curriculum": Improve Noise-robustness of Joint-embedding SSL Models, e.g., DINOv2
DINOv2 for Classification, PCA Visualization, Instance Retrieval: https://arxiv.org/abs/2304.07193
[NeurIPS '25] FastDINOv2: Frequency Based Curriculum Learning Improves Robustness and Training Speed
[COLM 2025: 1st Workshop on the Application of LLM Explainability to Reasoning and Planning] Latent Chain-of-Thought? Decoding the Depth-Recurrent Transformer
Elucidating the Design Space of Diffusion-Based Generative Models (EDM)
Implementation of functions related to the Ambient diffusion family of papers.
PyTorch code and models for VJEPA2 self-supervised learning from video.
Fully open reproduction of DeepSeek-R1
verl: Volcano Engine Reinforcement Learning for LLMs
Witness the aha moment of VLM with less than $3.
Training Large Language Model to Reason in a Continuous Latent Space
iBOT 🤖: Image BERT Pre-Training with Online Tokenizer (ICLR 2022)
A PyTorch implementation of the paper 'Exploring Simple Siamese Representation Learning'
PyTorch implementation of SimCLR: supports multi-GPU training and closely reproduces the paper's results
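SimCLR's contrastive objective is the NT-Xent loss computed over two augmented views of each image. A minimal PyTorch sketch of that loss, for illustration only; the starred implementation may differ in masking and batching details:

```python
# Minimal NT-Xent (normalized temperature-scaled cross-entropy) loss,
# the contrastive objective used by SimCLR. Sketch for illustration only.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """z1, z2: [N, D] projections of two augmented views of the same N images."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # [2N, D], unit-norm rows
    sim = z @ z.t() / temperature                         # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # mask self-similarity
    # The positive for sample i is its other view at index (i + n) mod 2n.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Example: random projections for a batch of 8 images.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())
```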
Real-world Noisy Image Denoising: A New Benchmark
A curated collection of LLM reasoning and planning resources, including key papers, limitations, benchmarks, and additional learning materials.
Downstream-Dino-V2: A GitHub repository featuring an easy-to-use implementation of the DINOv2 model by Facebook for downstream tasks such as Classification, Semantic Segmentation, and Monocular Depth Estimation
PyTorch code and models for the DINOv2 self-supervised learning method.
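The DINOv2 repository exposes pretrained backbones through torch.hub. A minimal sketch of extracting a global image embedding with the ViT-S/14 variant; the preprocessing values below are a common ImageNet-style assumption rather than something taken from this list:

```python
# Minimal sketch: global image features from DINOv2 ViT-S/14 via torch.hub.
# Preprocessing constants are a standard ImageNet-style assumption.
import torch
from torchvision import transforms
from PIL import Image

model = torch.hub.load("facebookresearch/dinov2", "dinov2_vits14")
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),          # 224 is divisible by the 14-pixel patch size
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),
])

img = preprocess(Image.open("example.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    features = model(img)                # [1, 384] CLS embedding for ViT-S/14
print(features.shape)
```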
Reliable, minimal and scalable library for pretraining foundation and world models
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners